are operated by Military Sealift Command. They can carry more than 177,000 barrels of oil, 2,150 tons of ammunition, 500 tons of dry stores and 250 tons of refrigerated stores. The ship receives petroleum products, ammunition and stores from various shuttle ships and redistributes these items when needed to ships in the carrier battle group, greatly reducing the number of service ships needed to travel with carrier battle groups. The four ships of the Sacramento class displaced 53,000 tons at full load, were 796 feet in overall length, and carried two Boeing Vertol CH-46 Sea Knight helicopters. The Sacramento class was retired in 2005. The ships displace 48,800 tons full load and carry two Boeing Vertol CH-46 Sea Knight helicopters or two Sikorsky MH-60S Knighthawk helicopters. Air defense includes the Sea Sparrow radar and infrared surface-to-air missile in eight-cell launchers, providing point defense at ranges of 15 km to 25 km. There are also two Phalanx Mk 15 20mm Gatling-gun close-in weapon systems (CIWS) and two 25mm Raytheon Mk 88 guns. China has developed the Type 901 fast combat support ship, which serves a similar mission in its navy.

List of Fast Combat Support Ships
USS Sacramento (AOE-1)
USS Camden (AOE-2)
USS Seattle (AOE-3)
USS Detroit (AOE-4)
New class:
USS Supply (AOE-6)
USS Rainier (AOE-7)
(AOE-8)
the name and logo under license from the parent company. FASA Games Inc. works alongside Ral Partha Europe, also a subsidiary of FASA Corporation, to bring out new editions of existing properties such as Earthdawn and Demonworld, and to develop new properties within the FASA cosmology. FASA first appeared as a Traveller licensee, producing supplements for that Game Designers' Workshop role-playing game, especially the work of the Keith Brothers. The company went on to establish itself as a major gaming company with the publication of the first licensed Star Trek RPG, then several successful original games. Noteworthy lines included BattleTech and Shadowrun. Their Star Trek role-playing supplements and tactical ship game enjoyed popularity outside the wargaming community since, at the time, official descriptions of the Star Trek universe were not common, and the gaming supplements offered details fans craved. The highly successful BattleTech line led to a series of video games; some of the first virtual reality gaming suites, called Virtual World (created by a subdivision of the company known at the time of development as ESP, an acronym for "Extremely Secret Project"); and a Saturday-morning animated TV series. Originally the name FASA was an acronym for "Freedonian Aeronautics and Space Administration", a joking allusion to the Marx Brothers film Duck Soup. This tongue-in-cheek attitude carried over into humorous self-references in its games: for example, in Shadowrun, a tactical nuclear device was detonated near FASA's offices at 1026 W. Van Buren St. in Chicago, Illinois.

History
FASA Corporation was founded by Jordan Weisman and L. Ross Babcock III in 1980 with a starting capital of $350 ($1,200 adjusted for inflation). The two were fellow gamers at the United States Merchant Marine Academy. Mort Weisman, Jordan's father, joined the company in 1985 to lead its operational management, having sold his book publishing business, Swallow Press.
Under the new commercial direction and with Mort's capital injection, the company diversified into books and miniature figures. After consulting their UK distributor, Chart Hobby Distributors, FASA licensed the manufacture of its BattleTech figurines to Miniature Figurines (also known as Minifigs). FASA would later acquire the U.S. figures manufacturer Ral Partha, which was the
the intellectual property of FASA to be of high value but did not wish to continue working as he had been for the last decade or more. Unwilling to wrestle with the complexities of dividing up the going concern, the owners issued a press release on January 25, 2001 announcing the immediate closure of the business. The BattleTech and Shadowrun properties were sold to WizKids, who in turn licensed their publication to FanPro LLC and then to Catalyst Game Labs. The Earthdawn license was sold to WizKids, and then back to FASA. Living Room Games published Earthdawn (Second Edition), RedBrick published Earthdawn (Classic and Third Editions), but the license has now returned to FASA Corporation, and FASA Games, Inc. is the current license holder for new material. Crimson Skies was originally developed by Zipper Interactive under the FASA Interactive brand in late 2000 and used under license by FASA; FASA Interactive had been purchased by Microsoft, so rights to Crimson Skies stayed with Microsoft. Rights to the miniatures game VOR: The Maelstrom reverted to the designer Mike "Skuzzy" Nielsen, but it has not been republished in any form due partly to legal difficulties. Microsoft officially closed the FASA team in the company's gaming division on September 12, 2007. On December 6, 2007, FASA founder Jordan Weisman announced that his new venture, Smith & Tinker, had licensed the electronic gaming rights to MechWarrior, Shadowrun, and Crimson Skies from Microsoft. On April 28, 2008 Mike "Skuzzy" Nielsen announced plans to create Vor 2.0. At Gen Con 2012, FASA Games, Inc. was revealed, which includes FASA Corporation co-founder Ross Babcock on the Board of Directors. While FASA Corporation still owns and manages the FASA IP and brands, FASA Games, Inc would release new games and content. 
As of 2020, FASA Games has released content for two games: a fourth edition of Earthdawn, and the new game 1879, which presents an alternate-future "6th Age" as a successor setting to Shadowrun.

Notable games
Role-playing games
Star Trek: The Role Playing Game (1982)
Star Trek: Starship Tactical Combat Simulator
Doctor Who (1985)
MechWarrior (1986)
Shadowrun (1989)
Legionnaire (1990)
Earthdawn (1993)
Board games
BattleTech (released in 1984 as BattleDroids, titled BattleTech as of 1985)
Renegade Legion (1989)
Crimson Skies (1998)
Miniature games
VOR: The Maelstrom (1999)
Demonworld (second edition: 2011, miniatures by Ral Partha
and QF-4S target drones operated by the Naval Air Warfare Center at NAS Point Mugu, California. These airframes were subsequently retired in 2004.

United States Marine Corps
The Marine Corps received its first F-4Bs in June 1962, with the "Black Knights" of VMFA-314 at Marine Corps Air Station El Toro, California becoming the first operational squadron. Marine Phantoms from VMFA-531 "Grey Ghosts" were assigned to Da Nang airbase on South Vietnam's northeast coast on 10 May 1965 and were initially assigned to provide air defense for the USMC. They soon began close air support (CAS) missions, and VMFA-314 "Black Knights", VMFA-232 "Red Devils", VMFA-323 "Death Rattlers", and VMFA-542 "Bengals" soon arrived at the primitive airfield. Marine F-4 pilots claimed three enemy MiGs (two while on exchange duty with the USAF) at the cost of 75 aircraft lost in combat, mostly to ground fire, and four in accidents. The VMCJ-1 "Golden Hawks" (later VMAQ-1 and VMAQ-4, which had the old RM tailcode) flew the first photo reconnaissance mission with an RF-4B variant on 3 November 1966 from Da Nang AB, South Vietnam, and remained there until 1970 with no RF-4B losses and only one aircraft damaged by anti-aircraft artillery (AAA) fire. VMCJ-2 and VMCJ-3 (now VMAQ-3) provided aircraft for VMCJ-1 in Da Nang. VMFP-3 was formed in 1975 at MCAS El Toro, CA, consolidating all USMC RF-4Bs in one unit that became known as "The Eyes of the Corps." VMFP-3 was disestablished in August 1990, after the Advanced Tactical Airborne Reconnaissance System was introduced for the F/A-18D Hornet. The F-4 continued to equip fighter-attack squadrons in both active and reserve Marine Corps units throughout the 1960s, 1970s and 1980s and into the early 1990s. In the early 1980s, these squadrons began to transition to the F/A-18 Hornet, starting with the same squadron that had introduced the F-4 to the Marine Corps, VMFA-314 at MCAS El Toro, California.
On 18 January 1992, the last Marine Corps Phantom, an F-4S in the Marine Corps Reserve, was retired by the "Cowboys" of VMFA-112 at NAS Dallas, Texas, after which the squadron was re-equipped with F/A-18 Hornets.

Aerial combat in the Vietnam War
The USAF and the US Navy had high expectations of the F-4 Phantom, assuming that its massive firepower, the best available on-board radar, and the highest speed and acceleration properties, coupled with new tactics, would give Phantoms an advantage over the MiGs. However, in confrontations with the lighter MiG-21, F-4s did not always succeed and began to suffer losses. Over the course of the air war in Vietnam, between 3 April 1965 and 8 January 1973, each side would ultimately claim favorable kill ratios. During the war, U.S. Navy F-4 Phantoms claimed 40 air-to-air victories at a loss of seven Phantoms to enemy aircraft. USMC F-4 pilots claimed three enemy MiGs at the cost of one aircraft in air combat. USAF F-4 Phantom crews scored MiG kills (including MiG-17s, eight MiG-19s and 66 MiG-21s) at a cost of 33 Phantoms in air combat. F-4 pilots were credited with a total of MiG kills at a cost of 42 Phantoms in air combat. According to the VPAF, 103 F-4 Phantoms were shot down by MiG-21s at a cost of 54 MiG-21s downed by F-4s. During the war, the VPAF lost 131 MiGs in air combat (63 MiG-17s, eight MiG-19s and 60 MiG-21s), of which about one half were downed by F-4s. From 1966 to November 1968, in 46 air battles conducted over North Vietnam between F-4s and MiG-21s, the VPAF claimed 27 F-4s shot down by MiG-21s at a cost of 20 MiG-21s. In 1970, one F-4 Phantom was shot down by a MiG-21. The struggle culminated on 10 May 1972, with VPAF aircraft completing 64 sorties, resulting in 15 air battles. The VPAF claimed seven F-4s shot down, while the U.S. confirmed the loss of five F-4s. The Phantoms, in turn, managed to destroy two MiG-21s, three MiG-17s, and one MiG-19.
On 11 May, two MiG-21s, playing the role of "bait", lured four F-4s toward two more MiG-21s circling at low altitude. The MiGs quickly engaged and shot down two F-4s. On 18 May, Vietnamese aircraft flew 26 sorties in eight air engagements, which cost the Americans four F-4 Phantoms; the Vietnamese fighters suffered no losses that day.

Non-U.S. users
The Phantom has served with the air forces of many countries, including Australia, Egypt, Germany, the United Kingdom, Greece, Iran, Israel, Japan, Spain, South Korea and Turkey.

Australia
The Royal Australian Air Force (RAAF) leased 24 USAF F-4Es from 1970 to 1973 while waiting for its order of General Dynamics F-111Cs to be delivered. They were so well liked that the RAAF considered retaining the aircraft after the F-111Cs were delivered. They were operated from RAAF Amberley by No. 1 Squadron and No. 6 Squadron.

Egypt
In 1979, the Egyptian Air Force purchased 35 former USAF F-4Es along with a number of Sparrow, Sidewinder, and Maverick missiles from the U.S. for $594 million as part of the "Peace Pharaoh" program. An additional seven surplus USAF aircraft were purchased in 1988. Three attrition replacements had been received by the end of the 1990s. Egyptian F-4Es were retired in 2020, with their former base at Cairo West Airport being reconfigured for the operation of F-16C/D Fighting Falcons.

Germany
The German Air Force (Luftwaffe) initially ordered the reconnaissance RF-4E in 1969, receiving a total of 88 aircraft from January 1971. In 1982, the initially unarmed RF-4Es were given a secondary ground attack capability; these aircraft were retired in 1994. In 1973, under the "Peace Rhine" program, the Luftwaffe purchased the F-4F (a lightened and simplified version of the F-4E), which was upgraded in the mid-1980s. 24 German F-4F Phantom IIs were operated by the 49th Tactical Fighter Wing of the USAF at Holloman AFB to train Luftwaffe crews until December 2004. In 1975, Germany also received 10 F-4Es for training in the U.S.
In the late 1990s, these were withdrawn from service after being replaced by F-4Fs. Germany also initiated the Improved Combat Efficiency (ICE) program in 1983. The 110 ICE-upgraded F-4Fs entered service in 1992, and were expected to remain in service until 2012. All the remaining Luftwaffe Phantoms were based at Wittmund with Jagdgeschwader 71 (Fighter Wing 71) in northern Germany and with WTD61 at Manching. Phantoms were deployed for NATO's Baltic Air Policing mission in 2005, 2008, 2009, 2011 and 2012. The German Air Force retired its last F-4Fs on 29 June 2013. German F-4Fs flew 279,000 hours from entering service on 31 August 1973 until retirement.

Greece
In 1971, the Hellenic Air Force ordered brand-new F-4E Phantoms, with deliveries starting in 1974. In the early 1990s, the Hellenic AF acquired surplus RF-4Es and F-4Es from the Luftwaffe and U.S. ANG. Following the success of the German ICE program, on 11 August 1997 a contract was signed between DASA of Germany and Hellenic Aerospace Industry for the upgrade of 39 aircraft to the very similar "Peace Icarus 2000" standard. The Hellenic AF operated 34 upgraded F-4E-PI2000 (338 and 339 Squadrons) and 12 RF-4E aircraft (348 Squadron) as of September 2013. On 5 May 2017, the Hellenic Air Force officially retired the RF-4E Phantom II during a public ceremony.

Iran
In the 1960s and 1970s, when the U.S. and Iran were on friendly terms, the U.S. sold 225 F-4D, F-4E, and RF-4E Phantoms to Iran. The Imperial Iranian Air Force saw at least one engagement, resulting in a loss, after an RF-4C was rammed by a Soviet MiG-21 during Project Dark Gene, an ELINT operation during the Cold War. The Islamic Republic of Iran Air Force Phantoms saw heavy action in the Iran–Iraq War in the 1980s and are kept operational by overhaul and servicing from Iran's aerospace industry.
Notable operations of Iranian F-4s during the war included Operation Scorch Sword, an attack by two F-4s against the Iraqi Osirak nuclear reactor site near Baghdad on 30 September 1980, and the attack on H3, a 4 April 1981 strike by eight Iranian F-4s against the H-3 complex of air bases in the far west of Iraq, which resulted in many Iraqi aircraft being destroyed or damaged for no Iranian losses. On 5 June 1984, two Saudi Arabian fighter pilots shot down two Iranian F-4 fighters. The Royal Saudi Air Force pilots were flying American-built F-15s and fired air-to-air missiles to bring down the Iranian planes. The Saudi fighter pilots had KC-135 aerial tanker planes and Boeing E-3 Sentry AWACS surveillance planes assist in the encounter. The aerial fight occurred in Saudi airspace over the Persian Gulf near the Saudi island Al Arabiyah, about 60 miles northeast of Jubail. Iranian F-4s were in use as of late 2014; the aircraft reportedly conducted air strikes on ISIS targets in the eastern Iraqi province of Diyala. Israel The Israeli Air Force was the largest foreign operator of the Phantom, flying both newly built and ex-USAF aircraft, as well as several one-off special reconnaissance variants. The first F-4Es, nicknamed "Kurnass" (Sledgehammer), and RF-4Es, nicknamed "Orev" (Raven), were delivered in 1969 under the "Peace Echo I" program. Additional Phantoms arrived during the 1970s under "Peace Echo II" through "Peace Echo V" and "Nickel Grass" programs. Israeli Phantoms saw extensive combat during Arab–Israeli conflicts, first seeing action during the War of Attrition. In the 1980s, Israel began the "Kurnass 2000" modernization program which significantly updated avionics. The last Israeli F-4s were retired in 2004. Japan From 1968, the Japan Air Self-Defense Force (JASDF) purchased a total of 140 F-4EJ Phantoms without aerial refueling, AGM-12 Bullpup missile system, nuclear control system or ground attack capabilities. 
Mitsubishi built 138 under license in Japan and 14 unarmed reconnaissance RF-4Es were imported. One of the aircraft (17-8440) was the last of the 5,195 F-4 Phantoms to be produced; it was manufactured by Mitsubishi Heavy Industries on 21 May 1981. "The Final Phantom" served with the 306th Tactical Fighter Squadron and later transferred to the 301st Tactical Fighter Squadron. Of these, 96 F-4EJs were modified to the F-4EJ Kai standard. 15 F-4EJ and F-4EJ Kai airframes were converted to reconnaissance aircraft designated RF-4EJ. Japan had a fleet of 90 F-4s in service in 2007. After studying several replacement fighters, the F-35A Lightning II was chosen in 2011. The 302nd Tactical Fighter Squadron became the first JASDF F-35 squadron at Misawa Air Base when it converted from the F-4EJ Kai on 29 March 2019. The JASDF's sole aerial reconnaissance unit, the 501st Tactical Reconnaissance Squadron, retired its RF-4Es and RF-4EJs on 9 March 2020, and the unit itself dissolved on 26 March. The 301st Tactical Fighter Squadron then became the sole user of the F-4EJ in the Air Defense Command, with retirement originally scheduled for 2021 along with the unit's transition to the F-35A. However, on 20 November 2020, the 301st Tactical Fighter Squadron announced the earlier retirement of its remaining F-4EJs, concluding the Phantom's long-running career in the JASDF Air Defense Command. Although retirement had been announced, the 301st TFS continued operations up until 10 December 2020, with the squadron's Phantoms being decommissioned on 14 December. Two F-4EJs and an F-4EJ Kai continued to be operated by the Air Development and Test Wing in Gifu Prefecture until their retirement on 17 March 2021, marking the end of Phantom operations in Japan.

South Korea
The Republic of Korea Air Force purchased its first batch of secondhand USAF F-4D Phantoms in 1968 under the "Peace Spectator" program. F-4Ds continued to be delivered until 1988.
The "Peace Pheasant II" program also provided new-built and former USAF F-4Es.

Spain
The Spanish Air Force acquired its first batch of ex-USAF F-4C Phantoms in 1971 under the "Peace Alfa" program. Designated C.12, the aircraft were retired in 1989. At the same time, the air arm received a number of ex-USAF RF-4Cs, designated CR.12. In 1995–1996, these aircraft received extensive avionics upgrades. Spain retired its RF-4s in 2002.

Turkey
The Turkish Air Force (TAF) received 40 F-4Es in 1974, with a further 32 F-4Es and 8 RF-4Es in 1977–78 under the "Peace Diamond III" program, followed by 40 ex-USAF aircraft in "Peace Diamond IV" in 1987, and a further 40 ex-U.S. Air National Guard aircraft in 1991. A further 32 RF-4Es were transferred to Turkey after being retired by the Luftwaffe between 1992 and 1994. In 1995, Israel Aerospace Industries (IAI) implemented an upgrade similar to Kurnass 2000 on 54 Turkish F-4Es, which were dubbed the F-4E 2020 Terminator. Turkish F-4s, and more modern F-16s, have been used to strike Kurdish PKK bases in ongoing military operations in northern Iraq. On 22 June 2012, a Turkish RF-4E was shot down by Syrian air defenses while flying a reconnaissance flight near the Turkish–Syrian border. Turkey has stated the reconnaissance aircraft was in international airspace when it was shot down, while Syrian authorities stated it was inside Syrian airspace. Turkish F-4s remained in use as of 2020. On 24 February 2015, two RF-4Es crashed in the Malatya region in the southeast of Turkey, under as-yet-unknown circumstances, killing both two-man crews. On 5 March 2015, an F-4E 2020 crashed in central Anatolia, killing both crew members. After these accidents, the TAF withdrew the RF-4E from active service. Turkey was reported to have used F-4 jets to attack PKK separatists and the ISIS capital on 19 September 2015.
The Turkish Air Force has reportedly used the F-4E 2020s against the more recent third phase of the PKK conflict, on heavy bombardment missions into Iraq on 15 November 2015, 12 January 2016, and 12 March 2016.

United Kingdom
The United Kingdom bought versions based on the U.S. Navy's F-4J for use with the Royal Air Force and the Royal Navy's Fleet Air Arm. The UK was the only country outside the United States to operate the Phantom at sea, flying them from . The main differences were the use of British Rolls-Royce Spey engines and of British-made avionics. The RN and RAF versions were given the designations F-4K and F-4M respectively, and entered service under the British military aircraft designations Phantom FG.1 (fighter/ground attack) and Phantom FGR.2 (fighter/ground attack/reconnaissance). Initially, the FGR.2 was used in the ground attack and reconnaissance role, primarily with RAF Germany, while 43 Squadron was formed in the air defence role using the FG.1s that had been intended for the Fleet Air Arm for use aboard . The superiority of the Phantom over the English Electric Lightning in terms of both range and weapons system capability, combined with the successful introduction of the SEPECAT Jaguar, meant that, during the mid-1970s, most of the ground attack Phantoms in Germany were redeployed to the UK to replace air defence Lightning squadrons. A second RAF squadron, 111 Squadron, was formed on the FG.1 in 1979 after the disbandment of 892 NAS. In 1982, during the Falklands War, three Phantom FGR.2s of No. 29 Squadron were on active Quick Reaction Alert duty on Ascension Island to protect the base from air attack. After the Falklands War, 15 upgraded ex-USN F-4Js, known as the F-4J(UK), entered RAF service to compensate for one interceptor squadron redeployed to the Falklands. Around 15 RAF squadrons received various marks of Phantom, many of them based in Germany. The first to be equipped was No.
228 Operational Conversion Unit at RAF Coningsby in August 1968. One noteworthy operator was No. 43 Squadron, whose Phantom FG.1s remained the squadron's equipment for 20 years, arriving in September 1969 and departing in July 1989. During this period the squadron was based at Leuchars. The interceptor Phantoms were replaced by the Panavia Tornado F3 from the late 1980s onwards, and the last British combat Phantoms were retired in October 1992 when No. 74(F) Squadron was disbanded. Phantom FG.1 XT597 was the last British Phantom to be retired, on 28 January 1994; it had been used as a test jet by the Aeroplane and Armament Experimental Establishment for its whole service life.

Civilian use
Sandia National Laboratories expended an F-4 mounted on a "rocket sled" in a crash test to record the results of an aircraft impacting a reinforced concrete structure, such as a nuclear power plant. One aircraft, an F-4D (civilian registration N749CF), is operated by the Massachusetts-based non-profit organization Collings Foundation as a "living history" exhibit. Funds to maintain and operate the aircraft, which is based in Houston, Texas, are raised through donations and sponsorships from public and commercial parties. After finding the Lockheed F-104 Starfighter inadequate, NASA used the F-4 to photograph and film Titan II missiles after launch from Cape Canaveral during the 1960s. Retired U.S. Air Force colonel Jack Petry described how he put his F-4 into a Mach 1.2 dive synchronized to the launch countdown, then "walked the (rocket's) contrail". Petry's Phantom stayed with the Titan for 90 seconds, reaching 68,000 feet, then broke away as the missile continued into space. NASA's Dryden Flight Research Center acquired an F-4A on 3 December 1965. It made 55 flights in support of short programs, chase on X-15 missions, and lifting body flights.
The F-4 also supported a biomedical monitoring program involving 1,000 flights by NASA Flight Research Center aerospace research pilots and students of the USAF Aerospace Research Pilot School flying high-performance aircraft. The pilots were instrumented to record accurate and reliable electrocardiogram, respiration rate, and normal acceleration data. In 1967, the Phantom supported a brief military-inspired program to determine whether an airplane's sonic boom could be directed and whether it could be used as a weapon of sorts, or at least an annoyance. NASA also flew an F-4C in a spanwise blowing study from 1983 to 1985, after which it was returned.

Variants
F-4A, B, J, N and S – Variants for the U.S. Navy and the U.S. Marine Corps. The F-4B was upgraded to the F-4N, and the F-4J was upgraded to the F-4S.
F-110 (original USAF designation for the F-4C), F-4C, D and E – Variants for the U.S. Air Force. The F-4E introduced an internal M61 Vulcan cannon. The F-4D and E were the most numerously produced, widely exported, and also extensively used under the Semi-Automatic Ground Environment (SAGE) U.S. air defense system.
F-4G Wild Weasel V – A dedicated SEAD variant for the U.S. Air Force with updated radar and avionics, converted from the F-4E. The designation F-4G was applied earlier to an entirely different U.S. Navy Phantom.
F-4K and M – Variants for the Royal Navy and Royal Air Force, respectively, re-engined with Rolls-Royce Spey turbofan engines.
F-4EJ and RF-4EJ – Simplified F-4E exported to and license-built in Japan. Some were modified for the reconnaissance role, carrying photographic and/or electronic reconnaissance pods, and designated RF-4EJ.
F-4F – Simplified F-4E exported to Germany.
QRF-4C, QF-4B, E, G, N and S – Retired aircraft converted into remote-controlled target drones used for weapons and defensive-systems research by the USAF and USN/USMC.
RF-4B, C, and E – Tactical reconnaissance variants.
Operators
Hellenic Air Force – 18 F-4E AUPs in service
  Andravida Air Base, Elis
    338 MDV
Islamic Republic of Iran Air Force – 62 F-4D, F-4E, and RF-4Es in service
  Bandar Abbas Air Base, Hormozgan Province
    91st Tactical Fighter Squadron (F-4E)
  Bushehr Air Base, Bushehr Province
    61st Tactical Fighter Squadron (F-4E)
  Chabahar Konarak Air Base, Sistan and Baluchestan Province
    101st Tactical Fighter Squadron (F-4D)
  Hamadan Air Base, Hamadan Province
    31st Tactical Reconnaissance Squadron (RF-4E)
    31st Tactical Fighter Squadron (F-4E)
Republic of Korea Air Force – 27 F-4Es in service
  Suwon Air Base, Gyeonggi Province
    153rd Fighter Squadron
Turkish Air Force – 26 F-4E 2020 Terminators in service
  Eskişehir Air Base, Eskişehir Province
    111 Filo

Former operators
Royal Australian Air Force (F-4E 1970 to 1973)
Egyptian Air Force (F-4E 1977 to 2020)
German Air Force (RF-4E 1971 to 1994; F-4F 1973 to 2013; F-4E 1978 to 1992)
Hellenic Air Force (RF-4E 1978 to 2017)
Imperial Iranian Air Force (F-4D 1968 to 1979; F-4E 1971 to 1979; RF-4E 1971 to 1979)
Israeli Air Force (F-4E 1969 to 2004; RF-4C 1970 to 1971; RF-4E 1971 to 2004)
Japan Air Self-Defense Force (F-4EJ 1971 to 2021; RF-4E 1974 to 2020; RF-4EJ 1992 to 2020)
Republic of Korea Air Force (F-4D 1969 to 2010; RF-4C 1989 to 2014)
Spanish Air Force (F-4C 1971 to 1990; RF-4C 1978 to 2002)
Turkish Air Force (RF-4E 1980 to 2015)
Aeroplane and Armament Experimental Establishment (F-4K 1970 to 1994)
Fleet Air Arm (F-4K 1968 to 1978)
Royal Air Force (F-4M 1968 to 1992; F-4K 1969 to 1990; F-4J(UK) 1984 to 1991)
NASA (F-4A 1965 to 1967; F-4C 1983 to 1985)
United States Air Force (F-4B 1963 to 1964; F-4C 1964 to 1989; RF-4C 1964 to 1995; F-4D 1965 to 1992; F-4E 1967 to 1991; F-4G 1978 to 1996; QF-4 1996 to 2016)
United States Marine Corps (F-4B 1962 to 1979; RF-4B 1965 to 1990; F-4J 1967 to 1984; F-4N 1973 to 1985; F-4S 1978 to 1992)
United States Navy (F-4A 1960 to 1968; F-4B 1961 to 1974; F-4J 1966 to 1982; F-4N 1973 to 1984; F-4S 1979 to 1987; QF-4 1983 to 2004)

Culture
Nicknames
The Phantom gathered a number of nicknames during its career. Some of these names included "Snoopy", "Rhino", "Double Ugly", "Old Smokey", the "Flying Anvil", "Flying Footlocker", "Flying Brick", "Lead Sled", the "Big Iron Sled", and the "St. Louis Slugger". In recognition of its record of downing large numbers of Soviet-built MiGs, it was called the "World's Leading Distributor of MiG Parts". As a reflection of its excellent performance in spite of its bulk, the F-4 was dubbed "the triumph of thrust over aerodynamics". German Luftwaffe crews called their F-4s the Eisenschwein ("Iron Pig"), Fliegender Ziegelstein ("Flying Brick") and Luftverteidigungsdiesel ("Air Defense Diesel").

Reputation
Imitating the spelling of the aircraft's name, McDonnell issued a series of patches. Pilots became "Phantom Phlyers", backseaters became "Phantom Pherrets", and fans of the F-4 became "Phantom Phanatics" who called it the "Phabulous Phantom". Ground crewmen who worked on
SAMs and 65 to AAA). By war's end, the U.S. Air Force had lost a total of 528 F-4 and RF-4C Phantoms. When combined with U.S. Navy and Marine Corps losses of 233 Phantoms, 761 F-4/RF-4 Phantoms were lost in the Vietnam War. On 28 August 1972, Captain Steve Ritchie became the first USAF ace of the war. On 9 September 1972, WSO Capt Charles B. DeBellevue became the highest-scoring American ace of the war, with six victories, and WSO Capt Jeffrey Feinstein became the last USAF ace of the war on 13 October 1972. Upon return to the United States, DeBellevue and Feinstein were assigned to undergraduate pilot training (Feinstein was given a vision waiver) and requalified as USAF pilots in the F-4. USAF F-4C/D/E crews claimed MiG kills in Southeast Asia (50 by Sparrow, 31 by Sidewinder, five by Falcon, 15.5 by gun, and six by other means). On 31 January 1972, the 170th Tactical Fighter Squadron/183d Tactical Fighter Group of the Illinois Air National Guard became the first Air National Guard unit to transition to Phantoms, from Republic F-84F Thunderstreaks, which were found to have corrosion problems. Phantoms would eventually equip numerous tactical fighter and tactical reconnaissance units in the USAF active, National Guard, and reserve components. On 2 June 1972, a Phantom flying at supersonic speed shot down a MiG-19 over Thud Ridge in Vietnam with its cannon. At a recorded speed of Mach 1.2, Major Phil Handley's shoot-down was the first and only recorded gun kill while flying at supersonic speed. On 15 August 1990, 24 F-4G Wild Weasel Vs and six RF-4Cs were deployed to Shaikh Isa AB, Bahrain, for Operation Desert Storm. The F-4G was the only aircraft in the USAF inventory equipped for the Suppression of Enemy Air Defenses (SEAD) role, and was needed to protect coalition aircraft from Iraq's extensive air defense system.
The RF-4C was the only aircraft equipped with the ultra-long-range KS-127 LOROP (long-range oblique photography) camera, and was used for a variety of reconnaissance missions. In spite of flying almost daily missions, only one RF-4C was lost in a fatal accident before the start of hostilities. One F-4G was lost when enemy fire damaged the fuel tanks and the aircraft ran out of fuel near a friendly airbase. The last USAF Phantoms, F-4G Wild Weasel Vs from 561st Fighter Squadron, were retired on 26 March 1996. The last operational flight of the F-4G Wild Weasel was from the 190th Fighter Squadron, Idaho Air National Guard, in April 1996. The last operational USAF/ANG F-4 to land was flown by Maj Mike Webb and Maj Gary Leeder of the Idaho ANG. Like the Navy, the Air Force has operated QF-4 target drones, serving with the 82d Aerial Targets Squadron at Tyndall Air Force Base, Florida, and Holloman Air Force Base, New Mexico. It was expected that the F-4 would remain in the target role with the 82d ATRS until at least 2015, when they would be replaced by early versions of the F-16 Fighting Falcon converted to a QF-16 configuration. Several QF-4s also retain capability as manned aircraft and are maintained in historical color schemes, being displayed as part of Air Combat Command's Heritage Flight at air shows, base open houses, and other events while serving as non-expendable target aircraft during the week. On 19 November 2013, BAE Systems delivered the last QF-4 aerial target to the Air Force. The example had been in storage for over 20 years before being converted. Over 16 years, BAE had converted 314 F-4 and RF-4 Phantom IIs into QF-4s and QRF-4s, with each aircraft taking six months to adapt. As of December 2013, QF-4 and QRF-4 aircraft had flown over 16,000 manned and 600 unmanned training sorties, with 250 unmanned aircraft being shot down in firing exercises. 
The remaining QF-4s and QRF-4s held their training role until the first of 126 QF-16s were delivered by Boeing. The final flight of an Air Force QF-4 from Tyndall AFB took place on 27 May 2015 to Holloman AFB. After Tyndall AFB ceased operations, the 53d Weapons Evaluation Group at Holloman became the last remaining operator of the fleet of 22 QF-4s. The base continued using them to fly manned test and unmanned live-fire test support and Foreign Military Sales testing, with the final unmanned flight taking place in August 2016. The type was officially retired from US military service with a four-ship flight at Holloman during an event on 21 December 2016. The remaining QF-4s were to be demilitarized after 1 January 2017. United States Navy On 30 December 1960, the VF-121 "Pacemakers" at NAS Miramar became the first Phantom operator with its F4H-1Fs (F-4As). The VF-74 "Be-devilers" at NAS Oceana became the first deployable Phantom squadron when it received its F4H-1s (F-4Bs) on 8 July 1961. The squadron completed carrier qualifications in October 1961 and made the Phantom's first full carrier deployment between August 1962 and March 1963. The second deployable U.S. Atlantic Fleet squadron to receive F-4Bs was the VF-102 "Diamondbacks", who promptly took their new aircraft on a carrier shakedown cruise. The first deployable U.S. Pacific Fleet squadron to receive the F-4B was the VF-114 "Aardvarks", which participated in a September 1962 carrier cruise. By the time of the Tonkin Gulf incident, 13 of 31 deployable Navy squadrons were armed with the type. Carrier-based F-4Bs made the first Phantom combat sortie of the Vietnam War on 5 August 1964, flying bomber escort in Operation Pierce Arrow. Navy fighter pilots were unused to flying with a non-pilot RIO, but learned from air combat in Vietnam the benefits of the GiB, the "guy in back" or "voice in the luggage compartment", helping with the workload.
The first Phantom air-to-air victory of the war took place on 9 April 1965, when an F-4B from VF-96 "Fighting Falcons" piloted by Lieutenant (junior grade) Terence M. Murphy and his RIO, Ensign Ronald Fegan, shot down a Chinese MiG-17 "Fresco". The Phantom itself was then shot down; there continues to be controversy over whether it fell to MiG guns or, as enemy reports later indicated, to an AIM-7 Sparrow III fired by one of Murphy's and Fegan's wingmen. On 17 June 1965, an F-4B from VF-21 "Freelancers" piloted by Commander Louis Page and Lieutenant John C. Smith shot down the first North Vietnamese MiG of the war. On 10 May 1972, Lieutenant Randy "Duke" Cunningham and Lieutenant (junior grade) William P. Driscoll, flying an F-4J with the call sign "Showtime 100", shot down three MiG-17s to become the first American flying aces of the war. Their fifth victory was believed at the time to be over a mysterious North Vietnamese ace, Colonel Nguyen Toon, now considered mythical. On the return flight, the Phantom was damaged by an enemy surface-to-air missile. To avoid being captured, Cunningham and Driscoll flew their burning aircraft using only the rudder and afterburner (the damage had rendered conventional control nearly impossible) until they could eject over water. During the war, U.S. Navy F-4 Phantom squadrons participated in 84 combat tours with F-4Bs, F-4Js, and F-4Ns. The Navy claimed 40 air-to-air victories at a cost of 73 Phantoms lost in combat (seven to enemy aircraft, 13 to SAMs, and 53 to AAA). An additional 54 Phantoms were lost in mishaps. In 1984 the F-4N was retired from deployable USN squadrons, and by 1987 the last F-4Ss had followed. On 25 March 1986, an F-4S belonging to the VF-151 "Vigilantes" became the last active-duty U.S. Navy Phantom to launch from an aircraft carrier.
On 18 October 1986, an F-4S from the VF-202 "Superheats", a Naval Reserve fighter squadron, made the last-ever Phantom carrier landing. In 1987, the last of the Naval Reserve-operated F-4S aircraft were replaced by F-14As. The last Phantoms in service with the Navy were QF-4N and QF-4S target drones operated by the Naval Air Warfare Center at NAS Point Mugu, California. These airframes were subsequently retired in 2004. United States Marine Corps The Marine Corps received its first F-4Bs in June 1962, with the "Black Knights" of VMFA-314 at Marine Corps Air Station El Toro, California, becoming the first operational squadron. Marine Phantoms from VMFA-531 "Grey Ghosts" were assigned to Da Nang airbase on South Vietnam's northeast coast on 10 May 1965, initially to provide air defense for the USMC. They soon began close air support (CAS) missions, and VMFA-314 "Black Knights", VMFA-232 "Red Devils", VMFA-323 "Death Rattlers", and VMFA-542 "Bengals" soon arrived at the primitive airfield. Marine F-4 pilots claimed three enemy MiGs (two while on exchange duty with the USAF) at the cost of 75 aircraft lost in combat, mostly to ground fire, and four in accidents. The VMCJ-1 "Golden Hawks" (later VMAQ-1 and VMAQ-4, which had the old RM tailcode) flew the first photo-reconnaissance mission with an RF-4B variant on 3 November 1966 from Da Nang AB, South Vietnam, and remained there until 1970 with no RF-4B losses and only one aircraft damaged by anti-aircraft artillery (AAA) fire. VMCJ-2 and VMCJ-3 (now VMAQ-3) provided aircraft for VMCJ-1 in Da Nang, and VMFP-3 was formed in 1975 at MCAS El Toro, CA, consolidating all USMC RF-4Bs in one unit that became known as "The Eyes of the Corps." VMFP-3 disestablished in August 1990 after the Advanced Tactical Airborne Reconnaissance System was introduced for the F/A-18D Hornet.
The F-4 continued to equip fighter-attack squadrons in both active and reserve Marine Corps units throughout the 1960s, 1970s, and 1980s and into the early 1990s. In the early 1980s, these squadrons began to transition to the F/A-18 Hornet, starting with the same squadron that introduced the F-4 to the Marine Corps, VMFA-314 at MCAS El Toro, California. On 18 January 1992, the last Marine Corps Phantom, an F-4S in the Marine Corps Reserve, was retired by the "Cowboys" of VMFA-112 at NAS Dallas, Texas, after which the squadron was re-equipped with F/A-18 Hornets. Aerial combat in the Vietnam War The USAF and the US Navy had high expectations of the F-4 Phantom, assuming that its massive firepower, the best available on-board radar, and superior speed and acceleration, coupled with new tactics, would give the Phantom an advantage over the MiGs. However, in confrontations with the lighter MiG-21, F-4s did not always succeed and began to suffer losses. Over the course of the air war in Vietnam, between 3 April 1965 and 8 January 1973, each side would ultimately claim favorable kill ratios. During the war, U.S. Navy F-4 Phantoms claimed 40 air-to-air victories at a loss of seven Phantoms to enemy aircraft. USMC F-4 pilots claimed three enemy MiGs at the cost of one aircraft in air combat. USAF F-4 Phantom crews scored 107.5 MiG kills (including 33.5 MiG-17s, eight MiG-19s, and 66 MiG-21s) at a cost of 33 Phantoms in air combat. Overall, F-4 pilots were credited with a total of 150.5 MiG kills at a cost of 42 Phantoms in air combat. According to the VPAF, 103 F-4 Phantoms were shot down by MiG-21s at a cost of 54 MiG-21s downed by F-4s. During the war, the VPAF lost 131 MiGs in air combat (63 MiG-17s, eight MiG-19s, and 60 MiG-21s), of which about one half were downed by F-4s. From 1966 to November 1968, in 46 air battles conducted over North Vietnam between F-4s and MiG-21s, the VPAF claimed 27 F-4s shot down by MiG-21s at a cost of 20 MiG-21s. In 1970, one F-4 Phantom was shot down by a MiG-21.
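As an arithmetic cross-check, the per-weapon and per-service claim figures quoted above can be tallied directly (a minimal sketch; variable names are illustrative, and these are wartime claims rather than verified results):

```python
# Tally of claimed F-4 air-to-air results in Vietnam, using only
# the figures quoted in the text.

# USAF kill claims broken down by weapon (from the USAF section)
usaf_by_weapon = {"Sparrow": 50, "Sidewinder": 31, "Falcon": 5,
                  "gun": 15.5, "other": 6}
usaf_kills = sum(usaf_by_weapon.values())
print(usaf_kills)  # 107.5

# Per-service tuples: (claimed kills, losses in air combat)
claims = {"USN": (40, 7), "USMC": (3, 1), "USAF": (usaf_kills, 33)}
total_kills = sum(kills for kills, _ in claims.values())
print(total_kills)  # 150.5

# Note: the text quotes 42 total Phantoms lost in air combat,
# while the per-service figures broken out here sum to 41.
total_losses = sum(losses for _, losses in claims.values())
print(total_losses)  # 41
```

The small mismatch between the quoted 42 total air-combat losses and the per-service sum of 41 is present in the source figures themselves.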
The struggle culminated on 10 May 1972, with VPAF aircraft completing 64 sorties, resulting in 15 air battles. The VPAF claimed seven F-4s shot down, while the U.S. confirmed five F-4 losses. The Phantoms, in turn, managed to destroy two MiG-21s, three MiG-17s, and one MiG-19. On 11 May, two MiG-21s playing the role of "bait" lured four F-4s toward two more MiG-21s circling at low altitude. The MiGs quickly engaged and shot down two F-4s. On 18 May, Vietnamese aircraft made 26 sorties in eight air engagements, which cost four F-4 Phantoms; Vietnamese fighters that day suffered no losses. Non-U.S. users The Phantom has served with the air forces of many countries, including Australia, Egypt, Germany, the United Kingdom, Greece, Iran, Israel, Japan, Spain, South Korea, and Turkey. Australia The Royal Australian Air Force (RAAF) leased 24 USAF F-4Es from 1970 to 1973 while waiting for its order of General Dynamics F-111Cs to be delivered. They were so well liked that the RAAF considered retaining the aircraft after the F-111Cs were delivered. They were operated from RAAF Amberley by No. 1 Squadron and No. 6 Squadron. Egypt In 1979, the Egyptian Air Force purchased 35 former USAF F-4Es along with a number of Sparrow, Sidewinder, and Maverick missiles from the U.S. for $594 million as part of the "Peace Pharaoh" program. An additional seven surplus USAF aircraft were purchased in 1988, and three attrition replacements had been received by the end of the 1990s. Egyptian F-4Es were retired in 2020, with their former base at Cairo West Airport being reconfigured for the operation of F-16C/D Fighting Falcons. Germany The German Air Force (Luftwaffe) initially ordered the reconnaissance RF-4E in 1969, receiving a total of 88 aircraft from January 1971. In 1982, the initially unarmed RF-4Es were given a secondary ground-attack capability; these aircraft were retired in 1994.
In 1973, under the "Peace Rhine" program, the Luftwaffe purchased the F-4F (a lightened and simplified version of the F-4E), which was upgraded in the mid-1980s. 24 German F-4F Phantom IIs were operated by the 49th Tactical Fighter Wing of the USAF at Holloman AFB to train Luftwaffe crews until December 2004. In 1975, Germany also received 10 F-4Es for training in the U.S. In the late 1990s, these were withdrawn from service after being replaced by F-4Fs. Germany also initiated the Improved Combat Efficiency (ICE) program in 1983. The 110 ICE-upgraded F-4Fs entered service in 1992, and were expected to remain in service until 2012. All the remaining Luftwaffe Phantoms were based at Wittmund with Jagdgeschwader 71 (fighter wing 71) in Northern Germany and WTD61 at Manching. Phantoms were deployed to NATO states under Baltic Air Policing in 2005, 2008, 2009, 2011, and 2012. The German Air Force retired its last F-4Fs on 29 June 2013. German F-4Fs flew 279,000 hours from entering service on 31 August 1973 until retirement. Greece In 1971, the Hellenic Air Force ordered brand new F-4E Phantoms, with deliveries starting in 1974. In the early 1990s, the Hellenic AF acquired surplus RF-4Es and F-4Es from the Luftwaffe and U.S. ANG. Following the success of the German ICE program, on 11 August 1997, a contract was signed between DASA of Germany and Hellenic Aerospace Industry for the upgrade of 39 aircraft to the very similar "Peace Icarus 2000" standard. The Hellenic AF operated 34 upgraded F-4E-PI2000 (338 and 339 Squadrons) and 12 RF-4E aircraft (348 Squadron) as of September 2013. On 5 May 2017, the Hellenic Air Force officially retired the RF-4E Phantom II during a public ceremony. Iran In the 1960s and 1970s, when the U.S. and Iran were on friendly terms, the U.S. sold 225 F-4D, F-4E, and RF-4E Phantoms to Iran.
The Imperial Iranian Air Force saw at least one engagement, resulting in a loss, after an RF-4C was rammed by a Soviet MiG-21 during Project Dark Gene, an ELINT operation during the Cold War. The Islamic Republic of Iran Air Force Phantoms saw heavy action in the Iran–Iraq War in the 1980s and are kept operational by overhaul and servicing from Iran's aerospace industry. Notable operations of Iranian F-4s during the war included Operation Scorch Sword, an attack by two F-4s against the Iraqi Osirak nuclear reactor site near Baghdad on 30 September 1980, and the attack on H3, a 4 April 1981 strike by eight Iranian F-4s against the H-3 complex of air bases in the far west of Iraq, which resulted in many Iraqi aircraft being destroyed or damaged for no Iranian losses. On 5 June 1984, two Saudi Arabian fighter pilots shot down two Iranian F-4 fighters. The Royal Saudi Air Force pilots were flying American-built F-15s and fired air-to-air missiles to bring down the Iranian planes. The Saudi pilots were assisted in the encounter by KC-135 aerial tankers and Boeing E-3 Sentry AWACS surveillance aircraft. The aerial fight occurred in Saudi airspace over the Persian Gulf near the Saudi island Al Arabiyah, about 60 miles northeast of Jubail. Iranian F-4s were in use as of late 2014; the aircraft reportedly conducted air strikes on ISIS targets in the eastern Iraqi province of Diyala. Israel The Israeli Air Force was the largest foreign operator of the Phantom, flying both newly built and ex-USAF aircraft, as well as several one-off special reconnaissance variants. The first F-4Es, nicknamed "Kurnass" (Sledgehammer), and RF-4Es, nicknamed "Orev" (Raven), were delivered in 1969 under the "Peace Echo I" program. Additional Phantoms arrived during the 1970s under the "Peace Echo II" through "Peace Echo V" and "Nickel Grass" programs. Israeli Phantoms saw extensive combat during Arab–Israeli conflicts, first seeing action during the War of Attrition.
In the 1980s, Israel began the "Kurnass 2000" modernization program, which significantly updated avionics. The last Israeli F-4s were retired in 2004. Japan From 1968, the Japan Air Self-Defense Force (JASDF) purchased a total of 140 F-4EJ Phantoms without aerial refueling capability, the AGM-12 Bullpup missile system, nuclear control systems, or ground-attack capabilities. Mitsubishi built 138 under license in Japan, and 14 unarmed reconnaissance RF-4Es were imported. One of the aircraft (17-8440) was the last of the 5,195 F-4 Phantoms to be produced. It was manufactured by Mitsubishi Heavy Industries on 21 May 1981. "The Final Phantom" served with the 306th Tactical Fighter Squadron and was later transferred to the 301st Tactical Fighter Squadron. Of the F-4EJ fleet, 96 were modified to the F-4EJ Kai standard. Fifteen F-4EJ and F-4EJ Kai airframes were converted to reconnaissance aircraft designated RF-4EJ. Japan had a fleet of 90 F-4s in service in 2007. After several replacement fighters were studied, the F-35A Lightning II was chosen in 2011. The 302nd Tactical Fighter Squadron became the first JASDF F-35 squadron at Misawa Air Base when it converted from the F-4EJ Kai on 29 March 2019. The JASDF's sole aerial reconnaissance unit, the 501st Tactical Reconnaissance Squadron, retired their RF-4Es and RF-4EJs on 9 March 2020, and the unit itself dissolved on 26 March. The 301st Tactical Fighter Squadron then became the sole user of the F-4EJ in the Air Defense Command, with retirement originally scheduled in 2021 along with the unit's transition to the F-35A. However, on 20 November 2020, the 301st Tactical Fighter Squadron announced the earlier retirement of their remaining F-4EJs, concluding the Phantom's long-running career in the JASDF Air Defense Command. Although retirement was announced, the 301st TFS continued operations up until 10 December 2020, with the squadron's Phantoms being decommissioned on 14 December.
Two F-4EJs and an F-4EJ Kai continued to be operated by the Air Development and Test Wing in Gifu Prefecture until their retirement on 17 March 2021, marking an end of Phantom operations in Japan. South Korea The Republic of Korea Air Force purchased its first batch of secondhand USAF F-4D Phantoms in 1968 under the "Peace Spectator" program. F-4Ds continued to be delivered until 1988. The "Peace Pheasant II" program also provided new-built and former USAF F-4Es. Spain The Spanish Air Force acquired its first batch of ex-USAF F-4C Phantoms in 1971 under the "Peace Alfa" program. Designated C.12, the aircraft were retired in 1989. At the same time, the air arm received a number of ex-USAF RF-4Cs, designated CR.12. In 1995–1996, these aircraft received extensive avionics upgrades. Spain retired its RF-4s in 2002. Turkey The Turkish Air Force (TAF) received 40 F-4Es in 1974, with a further 32 F-4Es and eight RF-4Es in 1977–78 under the "Peace Diamond III" program, followed by 40 ex-USAF aircraft in "Peace Diamond IV" in 1987, and a further 40 ex-U.S. Air National Guard aircraft in 1991. A further 32 RF-4Es were transferred to Turkey after being retired by the Luftwaffe between 1992 and 1994. In 1995, Israel Aerospace Industries (IAI) implemented an upgrade similar to Kurnass 2000 on 54 Turkish F-4Es, which were dubbed the F-4E 2020 Terminator. Turkish F-4s, along with more modern F-16s, have been used to strike Kurdish PKK bases in ongoing military operations in Northern Iraq. On 22 June 2012, a Turkish RF-4E was shot down by Syrian air defenses while flying a reconnaissance flight near the Turkish-Syrian border. Turkey has stated the reconnaissance aircraft was in international airspace when it was shot down, while Syrian authorities stated it was inside Syrian airspace. Turkish F-4s remained in use as of 2020. On 24 February 2015, two RF-4Es crashed in the Malatya region in the southeast of Turkey, under circumstances that remain unknown, killing both two-man crews.
On 5 March 2015, an F-4E-2020 crashed in central Anatolia, killing both crew members. Following these accidents, the TAF withdrew RF-4Es from active service. Turkey was reported to have used F-4 jets to attack PKK separatists and the ISIS capital on 19 September 2015. The Turkish Air Force has reportedly used the F-4E 2020s against the more recent Third Phase of the PKK conflict on heavy bombardment missions into Iraq on 15 November 2015, 12 January 2016, and 12 March 2016. United Kingdom The United Kingdom bought versions based on the U.S. Navy's F-4J for use with the Royal Air Force and the Royal Navy's Fleet Air Arm. The UK was the only country outside the United States to operate the Phantom at sea. The main differences were the use of British Rolls-Royce Spey engines and of British-made avionics. The RN and RAF versions were given the designations F-4K and F-4M respectively, and entered service with the British military aircraft designations Phantom FG.1 (fighter/ground attack) and Phantom FGR.2 (fighter/ground attack/reconnaissance). Initially, the FGR.2 was used in the ground attack and reconnaissance role, primarily with RAF Germany, while 43 Squadron was formed in the air defence role using the FG.1s that had been intended for the Fleet Air Arm. The superiority of the Phantom over the English Electric Lightning in terms of both range and weapons system capability, combined with the successful introduction of the SEPECAT Jaguar, meant that, during the mid-1970s, most of the ground attack Phantoms in Germany were redeployed to the UK to replace air defence Lightning squadrons. A second RAF squadron, 111 Squadron, was formed on the FG.1 in 1979 after the disbandment of 892 NAS. In 1982, during the Falklands War, three Phantom FGR.2s of No. 29 Squadron were on active Quick Reaction Alert duty on Ascension Island to protect the base from air attack.
After the Falklands War, 15 upgraded ex-USN F-4Js, known as the F-4J(UK), entered RAF service to compensate for one interceptor squadron redeployed to the Falklands. Around 15 RAF squadrons received various marks of Phantom, many of them based in Germany. The first to be equipped was No. 228 Operational Conversion Unit at RAF Coningsby in August 1968. One noteworthy operator was No. 43 Squadron, whose Phantom FG.1s remained the squadron's equipment for 20 years, arriving in September 1969 and departing in July 1989. During this period the squadron was based at Leuchars. The interceptor Phantoms were replaced by the Panavia Tornado F3 from the late 1980s onwards, and the last British combat Phantoms were retired in October 1992, when No. 74(F) Squadron was disbanded. The last British Phantom to be retired was Phantom FG.1 XT597, on 28 January 1994; it had been used as a test jet by the Aeroplane and Armament Experimental Establishment for its whole service life. Civilian use Sandia National Laboratories expended an F-4 mounted on a "rocket sled" in a crash test to record the results of an aircraft impacting a reinforced concrete structure, such as a nuclear power plant. One aircraft, an F-4D (civilian registration N749CF), is operated by the Massachusetts-based non-profit organization Collings Foundation as a "living history" exhibit. Funds to maintain and operate the aircraft, which is based in Houston, Texas, are raised through donations and sponsorships from public and commercial parties. After finding the Lockheed F-104 Starfighter inadequate, NASA used the F-4 to photograph and film Titan II missiles after launch from Cape Canaveral during the 1960s. Retired U.S. Air Force colonel Jack Petry described how he put his F-4 into a Mach 1.2 dive synchronized to the launch countdown, then "walked the (rocket's) contrail". Petry's Phantom stayed with the Titan for 90 seconds, reaching 68,000 feet, then broke away as the missile continued into space.
NASA's Dryden Flight Research Center acquired an F-4A on 3 December 1965. It made 55 flights in support of short programs, chase on X-15 missions, and lifting body flights. The F-4 also supported a biomedical monitoring program involving 1,000 flights by NASA Flight Research Center aerospace research pilots and students of the USAF Aerospace Research Pilot School flying high-performance aircraft. The pilots were instrumented to record accurate and reliable electrocardiogram, respiration rate, and normal acceleration data. In 1967, the Phantom supported a brief military-inspired program to determine whether an airplane's sonic boom could be directed and whether it could be used as a weapon of sorts, or at least an annoyance. NASA also flew an F-4C in a spanwise blowing study from 1983 to 1985, after which it was returned.
Variants
F-4A, B, J, N and S: Variants for the U.S. Navy and the U.S. Marine Corps. The F-4B was upgraded to the F-4N, and the F-4J was upgraded to the F-4S.
F-110 (original USAF designation for F-4C), F-4C, D and E: Variants for the U.S. Air Force. The F-4E introduced an internal M61 Vulcan cannon. The F-4D and E were the most numerously produced, widely exported, and also extensively used under the Semi-Automatic Ground Environment (SAGE) U.S. air defense system.
F-4G Wild Weasel V: A dedicated SEAD variant for the U.S. Air Force with updated radar and avionics, converted from the F-4E. The designation F-4G had been applied earlier to an entirely different U.S. Navy Phantom.
F-4K and M: Variants for the Royal Navy and Royal Air Force, respectively, re-engined with Rolls-Royce Spey turbofan engines.
F-4EJ and RF-4EJ: Simplified F-4E exported to and license-built in Japan. Some were modified for the reconnaissance role, carrying photographic and/or electronic reconnaissance pods, and designated RF-4EJ.
F-4F: Simplified F-4E exported to Germany.
QRF-4C, QF-4B, E, G, N and S: Retired
system, the letter "D" before the dash designated the aircraft's manufacturer. The Douglas Aircraft Company had previously been assigned this letter, but the USN elected to reassign it to McDonnell because Douglas had not provided any fighters for navy service in years. McDonnell engineers evaluated a number of engine combinations, varying from eight 9.5 in (24 cm) diameter engines down to two engines of 19 in (48 cm) diameter. The final design used the two 19 in (48 cm) engines after it was found to be the lightest and simplest configuration. The engines were buried in the wing root to keep intake and exhaust ducts short, offering greater aerodynamic efficiency than underwing nacelles, and the engines were angled slightly outwards to protect the fuselage from the hot exhaust blast. Placement of the engines in the middle of the airframe allowed the cockpit with its bubble-style canopy to be placed ahead of the wing, granting the pilot excellent visibility in all directions. This engine location also freed up space under the nose, allowing designers to use tricycle gear, thereby elevating the engine exhaust path and reducing the risk that the hot blast would damage the aircraft carrier deck. The construction methods and aerodynamic design of the Phantom were fairly conventional for the time; the aircraft had unswept wings, a conventional empennage, and an aluminum monocoque structure with flush riveted aluminum skin. Folding wings were used to reduce the width of the aircraft in storage configuration. Provisions for four .50-caliber (12.7 mm) machine guns were made in the nose, while racks for eight 5 in (127 mm) High Velocity Aircraft Rockets could be fitted under the wings, although these were seldom used in service. Adapting a jet to carrier use was a much greater challenge than producing a land-based fighter because of slower landing and takeoff speeds required on a small carrier deck.
The Phantom used split flaps on both the folding and fixed wing sections to enhance low-speed landing performance, but no other high-lift devices were used. Provisions were also made for Rocket Assisted Take Off (RATO) bottles to improve takeoff performance. When the first XFD-1, serial number 48235, was completed in January 1945, only one Westinghouse 19XB-2B engine was available for installation. Ground runs and taxi tests were conducted with the single engine, and such was the confidence in the aircraft that the first flight on 26 January 1945 was made with only the one turbojet engine. During flight tests, the Phantom became the first U.S. Navy aircraft to exceed 500 mph (434 kn, 805 km/h). With successful completion of tests, a production contract was awarded on 7 March 1945 for 100 FD-1 aircraft. With the end of the war, the Phantom production contract was reduced to 30 aircraft, but was soon increased back to 60. The first prototype was lost in a fatal crash on 1 November 1945, but the second and final Phantom prototype (serial number 48236) was completed early the next year and became the first purely jet-powered aircraft to operate from an American aircraft carrier, completing four successful takeoffs and landings on 21 July 1946 from a carrier near Norfolk, Virginia. At the time, the ship was the largest carrier serving with the U.S. Navy, allowing the aircraft to take off without assistance from a catapult. The second prototype crashed on 26 August 1946. Production Phantoms incorporated a number of design improvements. These included provisions for a flush-fitting centerline drop tank, an improved gunsight, and the addition of speed brakes. Production models used Westinghouse J30-WE-20 engines with 1,600 lbf (7.1 kN) of thrust per engine. The top of the vertical tail had a more square shape than the rounder tail used on the prototypes, and a smaller rudder was used to resolve problems with control surface clearance discovered during test flights.
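The three speed figures quoted for the record run can be checked against one another (a quick arithmetic sketch; the constants are the standard exact definitions of the international mile and knot):

```python
# Check the quoted 500 mph record against its knot and km/h equivalents.
KMH_PER_MPH = 1.609344   # exact, by definition of the international mile
KMH_PER_KNOT = 1.852     # exact, by definition of the international knot

mph = 500
kmh = mph * KMH_PER_MPH      # 804.672 km/h
knots = kmh / KMH_PER_KNOT   # about 434.4 kn

print(round(kmh))    # 805
print(round(knots))  # 434
```

Both rounded values agree with the parenthetical conversions given in the text.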
The horizontal tail surfaces were shortened slightly, while the fuselage was stretched by 19 in (48 cm). The amount of framing in the windshield was reduced to enhance pilot visibility. Halfway through the production run, the navy reassigned the designation letter "D" back to Douglas, with the Phantom being redesignated FH-1. Including the two prototypes, a total of 62 Phantoms were finally produced, with the last FH-1 rolling off the assembly line in May 1948. Realizing that the production of more powerful jet engines was imminent, McDonnell engineers proposed a more powerful variant of the Phantom while the original aircraft was still under development – a proposal that would lead to the design of the Phantom's replacement, the F2H Banshee. Although the new aircraft was originally envisioned as a modified Phantom, the need for heavier armament, greater internal fuel capacity, and other improvements eventually led to a substantially heavier and bulkier aircraft that shared few parts with its agile predecessor. Despite this, the two aircraft were similar enough that McDonnell was able to complete its first F2H-1 in August 1948, a mere three months
the cockpit with its bubble-style canopy to be placed ahead of the wing, granting the pilot excellent visibility in all directions. This engine location also freed up space under the nose, allowing designers to use tricycle gear, thereby elevating the engine exhaust path and reducing the risk that the hot blast would damage the aircraft carrier deck. The construction methods and aerodynamic design of the Phantom were fairly conventional for the time; the aircraft had unswept wings, a conventional empennage, and an aluminum monocoque structure with flush riveted aluminum skin. Folding wings were used to reduce the width of the aircraft in storage configuration. Provisions for four .50-caliber (12.7 mm) machine guns were made in the nose, while racks for eight 5 in (127 mm) High Velocity Aircraft Rockets could be fitted under the wings, although these were seldom used in service. Adapting a jet to carrier use was a much greater challenge than producing a land-based fighter because of slower landing and takeoff speeds required on a small carrier deck. The Phantom used split flaps on both the folding and fixed wing sections to enhance low-speed landing performance, but no other high-lift devices were used. Provisions were also made for Rocket Assisted Take Off (RATO) bottles to improve takeoff performance. When the first XFD-1, serial number 48235, was completed in January 1945, only one Westinghouse 19XB-2B engine was available for installation. Ground runs and taxi tests were conducted with the single engine, and such was the confidence in the aircraft that the first flight on 26 January 1945 was made with only the one turbojet engine. During flight tests, the Phantom became the first U.S. Navy aircraft to exceed 500 mph (434 kn, 805 km/h). With successful completion of tests, a production contract was awarded on 7 March 1945 for 100 FD-1 aircraft. With the end of the war, the Phantom production contract was reduced to 30 aircraft, but was soon increased back to 60. 
The first prototype was lost in a fatal crash on 1 November 1945, but the second and final Phantom prototype (serial number 48236) was completed early the next year and became the first purely jet-powered aircraft to operate from an American aircraft carrier, completing four successful takeoffs and landings on 21 July 1946, from near Norfolk, Virginia. At the time, she was the largest carrier serving with the U.S. Navy, allowing the aircraft to take off without assistance from a catapult. The second prototype crashed on 26 August 1946. Production Phantoms incorporated a number of design improvements. These included provisions for a flush-fitting centerline drop tank, an improved gunsight, and the addition of speed brakes. Production models used Westinghouse J30-WE-20 engines with 1,600 lbf (7.1 kN) of thrust per engine. The top of the vertical tail had a more square shape than the rounder tail used on the prototypes, and a smaller rudder was used to resolve problems with control surface clearance discovered during test flights. The horizontal tail surfaces were shortened slightly, while the fuselage was stretched by 19 in (48 cm). The amount of framing in the windshield was reduced to enhance pilot visibility. Halfway through the production run, the navy reassigned the designation letter "D" back to Douglas, with the Phantom being redesignated FH-1. Including the two prototypes, a total of 62 Phantoms were finally produced, with the last FH-1 rolling off the assembly line in May 1948. Realizing that the production of more powerful jet engines was imminent, McDonnell engineers proposed a more powerful variant of the Phantom while the original aircraft was still under development – a proposal that would lead to the design of the Phantom's replacement, the F2H Banshee. 
Although the new aircraft was originally envisioned as a modified Phantom, the need for heavier armament, greater internal fuel capacity, and other improvements eventually led to a substantially heavier and bulkier aircraft that shared few parts with its agile predecessor. Despite this, the two aircraft were similar enough that McDonnell was able to complete its first F2H-1 in August 1948, a mere three months after the last FH-1 had rolled off the assembly line. Operational history The first Phantoms were delivered to USN fighter squadron VF-17A (later redesignated VF-171) in August 1947; the squadron received a full complement of 24 aircraft on 29 May 1948. Beginning in November 1947, Phantoms were delivered to United States Marine Corps squadron VMF-122, making it the first USMC combat squadron to deploy jets. VF-17A became the USN's first fully operational jet carrier squadron when it deployed aboard USS Saipan on 5 May 1948. The Phantom was one of the first jets used by the U.S. military for exhibition flying. Three Phantoms flown by the Naval Air Test Center made up a unique demonstration team called the Gray Angels,
is called frication. A particular subset of fricatives are the sibilants. When forming a sibilant, one is still forcing air through a narrow channel, but in addition the tongue is curled lengthwise to direct the air over the edge of the teeth. English , , , and are examples of sibilants. The usage of two other terms is less standardized: "Spirant" is an older term for fricatives used by some American and European phoneticians and phonologists. "Strident" could mean just "sibilant", but some authors also include labiodental and uvular fricatives in the class.

Types
The airflow is not completely stopped in the production of fricative consonants; instead, the constriction causes audible friction.

Sibilants
voiceless coronal sibilant, as in English sip
voiced coronal sibilant, as in English zip
voiceless dental sibilant
voiced dental sibilant
voiceless apical sibilant
voiced apical sibilant
voiceless predorsal sibilant (laminal, with tongue tip at lower teeth)
voiced predorsal sibilant (laminal)
voiceless postalveolar sibilant (laminal)
voiced postalveolar sibilant (laminal)
voiceless palato-alveolar sibilant (domed, partially palatalized), as in English ship
voiced palato-alveolar sibilant (domed, partially palatalized), as the si in English vision
voiceless alveolo-palatal sibilant (laminal, palatalized)
voiced alveolo-palatal sibilant (laminal, palatalized)
voiceless retroflex sibilant (apical or subapical)
voiced retroflex sibilant (apical or subapical)

All sibilants are coronal, but may be dental, alveolar, postalveolar, or palatal (retroflex) within that range. However, at the postalveolar place of articulation, the tongue may take several shapes: domed, laminal, or apical, and each of these is given a separate symbol and a separate name. Prototypical retroflexes are subapical and palatal, but they are usually written with the same symbol as the apical postalveolars.
The alveolars and dentals may also be either apical or laminal, but this difference is indicated with diacritics rather than with separate symbols.

Central non-sibilant fricatives
voiceless bilabial fricative
voiced bilabial fricative
voiceless labiodental fricative, as in English fine
voiced labiodental fricative, as in English vine
voiceless linguolabial fricative
voiced linguolabial fricative
voiceless dental non-sibilant fricative, as in English thing
voiced dental non-sibilant fricative, as in English that
voiceless alveolar non-sibilant fricative
voiced alveolar non-sibilant fricative
voiceless trilled fricative
voiced trilled fricative
voiceless palatal fricative
voiced palatal fricative
voiceless velar fricative
voiced velar fricative
voiceless palatal-velar fricative (articulation disputed)
voiceless uvular fricative
voiceless pharyngeal fricative

The IPA also has letters for epiglottal fricatives,
voiceless epiglottal fricative
voiced epiglottal fricative with allophonic trilling,
but these might be better analyzed as pharyngeal trills.

voiceless velopharyngeal fricative (often occurs with a cleft palate)
voiced velopharyngeal fricative

Lateral fricatives
voiceless dental lateral fricative
voiced dental lateral fricative
voiceless alveolar lateral fricative
voiced alveolar lateral fricative
voiceless postalveolar lateral fricative (Mehri) or extIPA
voiceless retroflex lateral fricative or extIPA
voiced retroflex lateral fricative (in Ao) or or extIPA
voiceless palatal lateral fricative (PUA ) or extIPA
voiced palatal lateral fricative (allophonic in Jebero) or extIPA
voiceless velar lateral fricative (PUA ) or extIPA
voiced velar lateral fricative

The lateral fricative occurs as the ll of Welsh, as in Lloyd, Llewelyn, and Machynlleth (, a town), as the unvoiced 'hl' and voiced 'dl' or 'dhl' in the several languages of Southern Africa (such as Xhosa and Zulu), and in Mongolian.
or and voiceless grooved lateral alveolar fricative (a laterally lisped or ) (Modern South Arabian)
or and voiced grooved lateral alveolar fricative (a laterally lisped or ) (Modern South Arabian)
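The three classificatory dimensions used throughout the inventory above (place of articulation, voicing, sibilance) can be illustrated with a small lookup table over the English fricatives named in the text. This is only an illustrative sketch: the ASCII keys are ad-hoc stand-ins for IPA symbols, and English /h/ is left out because the text later classes it as a pseudo-fricative.

```python
# Toy inventory of English fricative phonemes, classified along the
# dimensions discussed in the text. Keys are ad-hoc ASCII stand-ins
# for IPA symbols (e.g. "sh" for the initial sound of "ship").
ENGLISH_FRICATIVES = {
    "f":  {"place": "labiodental",     "voiced": False, "sibilant": False},  # fine
    "v":  {"place": "labiodental",     "voiced": True,  "sibilant": False},  # vine
    "th": {"place": "dental",          "voiced": False, "sibilant": False},  # thing
    "dh": {"place": "dental",          "voiced": True,  "sibilant": False},  # that
    "s":  {"place": "alveolar",        "voiced": False, "sibilant": True},   # sip
    "z":  {"place": "alveolar",        "voiced": True,  "sibilant": True},   # zip
    "sh": {"place": "palato-alveolar", "voiced": False, "sibilant": True},   # ship
    "zh": {"place": "palato-alveolar", "voiced": True,  "sibilant": True},   # vision
}

def sibilants(inventory):
    """Return, sorted, the symbols classified as sibilants."""
    return sorted(k for k, v in inventory.items() if v["sibilant"])

def voiceless_counterpart(inventory, symbol):
    """Find the voiceless fricative sharing a place of articulation."""
    place = inventory[symbol]["place"]
    for k, v in inventory.items():
        if v["place"] == place and not v["voiced"]:
            return k

print(sibilants(ENGLISH_FRICATIVES))                    # ['s', 'sh', 'z', 'zh']
print(voiceless_counterpart(ENGLISH_FRICATIVES, "zh"))  # sh
```

The table deliberately mirrors the voiced/voiceless pairing visible in the lists above: every fricative slot comes in a pair differing only in voicing.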
IPA letters used for both fricatives and approximants
voiced uvular fricative
voiced pharyngeal fricative
No language distinguishes voiced fricatives from approximants at these places, so the same symbol is used for both. For the pharyngeal, approximants are more numerous than fricatives. A fricative realization may be specified by adding the uptack to the letters, . Likewise, the downtack may be added to specify an approximant realization, . (The bilabial approximant and dental approximant do not have dedicated symbols either and are transcribed in a similar fashion: .
However, the base letters are understood to specifically refer to the fricatives.)

Pseudo-fricatives
voiceless glottal transition, as in English hat
breathy-voiced glottal transition
In many languages, such as English, the glottal "fricatives" are unaccompanied phonation states of the glottis, without any accompanying manner, fricative or otherwise. However, in languages such as Arabic, they are true fricatives. In addition, is usually called a "voiceless labial-velar fricative", but it is actually an approximant. True doubly articulated fricatives may not occur in any language; but see voiceless palatal-velar fricative for a putative (and rather controversial) example.

Aspirated fricatives
Fricatives are very commonly voiced, though cross-linguistically voiced fricatives are not nearly as common as tenuis ("plain") fricatives. Other phonations are common in languages that have those phonations in their stop consonants. However, phonemically aspirated fricatives are rare. contrasts with a tense, unaspirated in Korean; aspirated fricatives are also found in a few Sino-Tibetan languages, in some Oto-Manguean languages, in the Siouan language Ofo ( and ), and in the (central?) Chumash languages ( and ). The record may be Cone Tibetan, which has four contrastive aspirated fricatives: , , and .

Nasalized fricatives
Phonemically nasalized fricatives are rare. Umbundu has and Kwangali and Souletin Basque have . In Coatzospan Mixtec, appear allophonically before a nasal vowel, and in Igbo nasality is a feature of the syllable; when occur in nasal syllables they are themselves nasalized.

Occurrence
Until its extinction, Ubykh may have been the language with the most fricatives (29 not including ), some of which did not have dedicated symbols or diacritics in the IPA. This number actually outstrips the number of all consonants in English (which has 24 consonants). By contrast, approximately 8.7% of the world's languages have no phonemic fricatives at all.
This is a typical feature of Australian Aboriginal languages, where the few fricatives that exist result from changes to plosives or approximants, but also occurs in some indigenous languages of New Guinea and South America that have especially small numbers of consonants. However, whereas is entirely unknown in indigenous Australian languages, most of the other
On unheated motor vehicles, the frost usually forms on the outside surface of the glass first. The glass surface influences the shape of crystals, so imperfections, scratches, or dust can modify the way ice nucleates. The patterns in window frost form a fractal with a fractal dimension greater than one, but less than two. This is a consequence of the nucleation process being constrained to unfold in two dimensions, unlike a snowflake, which is shaped by a similar process, but forms in three dimensions and has a fractal dimension greater than two. If the indoor air is very humid, rather than moderately so, water first condenses in small droplets, and then freezes into clear ice. Similar patterns of freezing may occur on other smooth vertical surfaces, but they seldom are as obvious or spectacular as on clear glass. White frost White frost is a solid deposition of ice that forms directly from water vapour contained in air. White frost forms when relative humidity is above 90% and the temperature below −8 °C (18 °F), and it grows against the wind direction, since air arriving from windward has a higher humidity than leeward air, but the wind must not be strong, else it damages the delicate icy structures as they begin to form. White frost resembles a heavy coating of hoar frost with big, interlocking crystals, usually needle-shaped. Rime Rime is a type of ice deposition that occurs quickly, often under heavily humid and windy conditions. Technically speaking, it is not a type of frost, since usually supercooled water drops are involved, in contrast to the formation of hoar frost, in which water vapour desublimates slowly and directly. Ships travelling through Arctic seas may accumulate large quantities of rime on the rigging. Unlike hoar frost, which has a feathery appearance, rime generally has an icy, solid appearance. 
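The fractal dimension between one and two claimed above for window frost can be estimated numerically by box counting. The sketch below is an illustration, not from the source: it counts how many grid cells of several sizes contain part of a 2-D pattern and takes the slope of the log-log fit, which recovers the expected dimensions for two simple test shapes.

```python
import numpy as np

def box_counting_dimension(mask, scales=(1, 2, 4, 8, 16)):
    """Estimate the box-counting (fractal) dimension of a 2-D boolean pattern.

    N(s) ~ s**(-D), where N(s) is the number of s-by-s boxes containing
    any part of the pattern; D is the slope of log N against log(1/s).
    """
    n = mask.shape[0]
    counts = []
    for s in scales:
        # Partition the grid into s-by-s boxes and count the occupied ones.
        trimmed = mask[: n - n % s, : n - n % s]
        boxes = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    # Slope of the straight-line fit in log-log space is the dimension estimate.
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return slope

# Sanity checks: a filled square is 2-dimensional, a straight line 1-dimensional.
square = np.ones((64, 64), dtype=bool)
line = np.zeros((64, 64), dtype=bool)
line[32, :] = True
print(round(box_counting_dimension(square), 2))  # 2.0
print(round(box_counting_dimension(line), 2))    # 1.0
```

A binarized photograph of window frost fed to the same function would, per the text, be expected to yield a value strictly between these two extremes.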
Black frost Black frost (or "killing frost") is not strictly speaking frost at all; it is the condition seen in crops when the humidity is too low for frost to form, but the temperature falls so low that plant tissues freeze and die, becoming blackened, hence the term "black frost". Black frost often is called "killing frost" because white frost tends to be less cold, partly because the latent heat of freezing of the water reduces the temperature drop. Effect on plants Damage Many plants can be damaged or killed by freezing temperatures or frost. This varies with the type of plant, the tissue exposed, and how low temperatures get; a "light frost" of damages fewer types of plants than a "hard frost" below . Plants likely to be damaged even by a light frost include vines—such as beans, cucumbers, grapes, squashes, melons—along with nightshades such as tomatoes, eggplants, and peppers. Plants that may tolerate (or even benefit from) frosts include:
root vegetables (e.g. beets, carrots, parsnips, onions)
leafy greens (e.g. lettuces, spinach, chard)
cruciferous vegetables (e.g. cabbages, cauliflower, bok choy, broccoli, Brussels sprouts, radishes, kale, collard, mustard, turnips, rutabagas)
Even those plants that tolerate frost may be damaged once temperatures drop even lower (below ). Hardy perennials, such as Hosta, become dormant after the first frosts and regrow when spring arrives. The entire visible plant may turn completely brown until the spring warmth, or may drop all of its leaves and flowers, leaving the stem and stalk only. Evergreen plants, such as pine trees, withstand frost although all or most growth stops. Frost crack is a bark defect caused by a combination of low temperatures and heat from the winter sun. Vegetation is not necessarily damaged when leaf temperatures drop below the freezing point of their cell contents.
In the absence of a site nucleating the formation of ice crystals, the leaves remain in a supercooled liquid state, safely reaching temperatures of . However, once frost forms, the leaf cells may be damaged by sharp ice crystals. Hardening is the process by which a plant becomes tolerant to low temperatures. See also Cryobiology. Certain bacteria, notably Pseudomonas syringae, are particularly effective at triggering frost formation, raising the nucleation temperature to about . Bacteria lacking ice nucleation-active proteins (ice-minus bacteria) result in greatly reduced frost damage. Protection methods Typical measures to prevent frost or reduce its severity include one or more of: deploying powerful blowers to simulate wind, thereby preventing the formation of accumulations of cold air. There are variations on this theme. One variety is the wind machine, an engine-driven propeller mounted on a vertical pole that blows air almost horizontally. Wind machines were introduced as a method for frost protection in California during the 1920s, but they were not widely accepted until the 1940s and 1950s. Now, they are commonly used in many parts of the world. Another is the selective inverted sink, a device which prevents frost by drawing cold air from the ground and blowing it up through a chimney. It was originally developed to prevent frost damage to citrus fruits in Uruguay. In New Zealand, helicopters are used in similar fashion, especially in the vineyard regions such as Marlborough. By dragging down warmer air from the inversion layers, and preventing the ponding of colder air on the ground, the low-flying helicopters prevent damage to the fruit buds. As the operations are conducted at night, and have in the past involved up to 130 aircraft per night in one region, safety rules are strict. Although not a dedicated method, wind turbines have similar (small) effect of
frost crystals scatters light in all directions, the coating of frost appears white. Types of frost include crystalline frost (hoar frost or radiation frost) from deposition of water vapor from air of low humidity, white frost in humid conditions, window frost on glass surfaces, advection frost from cold wind over cold surfaces, black frost without visible ice at low temperatures and very low humidity, and rime under supercooled wet conditions. Plants that have evolved in warmer climates suffer damage when the temperature falls low enough to freeze the water in the cells that make up the plant tissue. The tissue damage resulting from this process is known as "frost damage". Farmers in those regions where frost damage is known to affect their crops often invest in substantial means to protect their crops from such damage. Formation If a solid surface is chilled below the dew point of the surrounding humid air, and the surface itself is colder than freezing, ice will form on it. If the water deposits as a liquid that then freezes, it forms a coating that may look glassy, opaque, or crystalline, depending on its type. Depending on context, that process also may be called atmospheric icing. The ice it produces differs in some ways from crystalline frost, which consists of spicules of ice that typically project from the solid surface on which they grow. The main difference between the ice coatings and frost spicules arises because the crystalline spicules grow directly from desublimation of water vapour from air, and desublimation is not a factor in icing of freezing surfaces. For desublimation to proceed, the surface must be below the frost point of the air, meaning that it is sufficiently cold for ice to form without passing through the liquid phase. The air must be humid, but not sufficiently humid to permit the condensation of liquid water, or icing will result instead of desublimation. 
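The distinction drawn here between condensation-then-freezing (icing) and direct desublimation (frost) can be made concrete with a small sketch. The Magnus-formula coefficients below are standard published approximations for the dew point (over water) and frost point (over ice); the classification function itself is my own illustrative reading of the conditions described in the text, not a formula from the source.

```python
import math

# Magnus-formula coefficients: over water for the dew point,
# over ice for the frost point (standard approximations, degC).
A_WATER, B_WATER = 17.62, 243.12
A_ICE, B_ICE = 22.46, 272.62

def _magnus_point(temp_c, rh_pct, a, b):
    gamma = math.log(rh_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

def dew_point(temp_c, rh_pct):
    return _magnus_point(temp_c, rh_pct, A_WATER, B_WATER)

def frost_point(temp_c, rh_pct):
    # Below 0 degC the frost point lies slightly above the dew point,
    # because saturation vapour pressure over ice is lower than over water.
    return _magnus_point(temp_c, rh_pct, A_ICE, B_ICE)

def surface_deposit(air_temp_c, rh_pct, surface_temp_c):
    """Classify what forms on a surface, following the conditions in the text."""
    if surface_temp_c < 0 and surface_temp_c <= frost_point(air_temp_c, rh_pct):
        if surface_temp_c <= dew_point(air_temp_c, rh_pct):
            return "icing"   # humid enough for liquid to condense, then freeze
        return "frost"       # desublimation: below the frost point only
    if surface_temp_c <= dew_point(air_temp_c, rh_pct):
        return "dew"
    return "nothing"

print(surface_deposit(0.0, 80.0, -2.9))  # frost: between dew point and frost point
print(surface_deposit(0.0, 80.0, -8.0))  # icing: below both dew point and freezing
```

Note how narrow the pure-desublimation window is in this example (roughly -3.0 to -2.7 degC for air at 0 degC and 80% relative humidity), which matches the text's point that frost requires air that is humid, but not humid enough for liquid condensation.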
The size of the crystals depends largely on the temperature, the amount of water vapor available, and how long they have been growing undisturbed. As a rule, except in conditions where supercooled droplets are present in the air, frost will form only if the deposition surface is colder than the surrounding air. For instance, frost may be observed around cracks in cold wooden sidewalks when humid air escapes from the warmer ground beneath. Other objects on which frost commonly forms are those with low specific heat or high thermal emissivity, such as blackened metals, hence the accumulation of frost on the heads of rusty nails. The apparently erratic occurrence of frost in adjacent localities is due partly to differences of elevation, the lower areas becoming colder on calm nights. Where static air settles above an area of ground in the absence of wind, the absorptivity and specific heat of the ground strongly influence the temperature that the trapped air attains. Types Hoar frost Hoar frost, also hoarfrost, radiation frost, or pruina, refers to white ice crystals deposited on the ground or loosely attached to exposed objects, such as wires or leaves. They form on cold, clear nights when conditions are such that heat radiates out to the open air faster than it can be replaced from nearby sources, such as wind or warm objects. Under suitable circumstances, objects cool to below the frost point of the surrounding air, well below the freezing point of water. Such freezing may be promoted by effects such as flood frost or frost pocket. These occur when ground-level radiation losses cool air until it flows downhill and accumulates in pockets of very cold air in valleys and hollows. Hoar frost may freeze in such low-lying cold air even when the air temperature a few feet above ground is well above freezing. The word "hoar" comes from an Old English adjective that means "showing signs of old age". 
In this context, it refers to the frost that makes trees and bushes look like white hair. Hoar frost may have different names depending on where it forms: Air hoar is a deposit of hoar frost on objects above the surface, such as tree branches, plant stems, and wires. Surface hoar refers to fern-like ice crystals directly deposited on snow, ice, or already frozen surfaces. Crevasse hoar consists of crystals that form in glacial crevasses where water vapour can accumulate under calm weather conditions. Depth hoar refers to faceted crystals that have slowly grown large within cavities beneath the surface of banks of dry snow. Depth hoar crystals grow continuously at the expense of neighbouring smaller crystals, so typically are visibly stepped and have faceted hollows. When surface hoar covers sloping snowbanks, the layer of frost crystals may create an avalanche risk; when heavy layers of new snow cover the frosty surface, furry crystals standing out from the old snow hold off the falling flakes, forming a layer of voids that prevents the new snow layers from bonding strongly to the old snow beneath. Ideal conditions for hoarfrost to form on snow are cold, clear nights, with very light, cold air currents conveying humidity at the right rate for growth of frost crystals. Wind that is too strong or warm destroys the furry crystals, and thereby may permit a stronger bond between the old and new snow layers. However, if the winds are strong enough and cold enough to lay the crystals flat and dry, carpeting the snow with cold, loose crystals without removing or destroying them or letting them warm up and become sticky, then the frost interface between the snow layers may still present an avalanche danger, because the texture of the frost crystals differs from the snow texture, and the dry crystals will not stick to fresh snow. Such conditions still prevent a strong bond between the snow layers. 
In very low temperatures where fluffy surface hoar crystals form without subsequently being covered with snow, strong winds may break them off, forming a dust of ice particles and blowing them over the surface. The ice dust then may form yukimarimo, as has been observed in parts of Antarctica, in a process similar to the formation of dust bunnies and similar structures. Hoar frost and white frost also occur in man-made environments such as in freezers or industrial cold-storage facilities. If such cold spaces or the pipes serving them are not well insulated and are exposed to ambient humidity, the moisture condenses and freezes on them, at a rate depending on the freezer temperature. The frost may coat pipes thickly, partly insulating them, but such inefficient insulation still is a source of heat loss. Advection frost Advection frost (also called wind frost) refers to tiny ice spikes that form when very cold wind is blowing over tree branches, poles, and other surfaces. It looks like rimming on the edges of flowers and leaves, and usually forms against the direction of the wind. It can occur at any hour, day or night. Window frost Window frost (also called fern frost or ice flowers) forms when a glass pane is exposed to very cold air on the outside and warmer, moderately moist air on the inside. If the pane is a bad insulator (for example, if it is a single-pane window), water vapour condenses on the glass,
Life Schmidt was born in Pozsony/Pressburg, in the Hungarian part of Austria-Hungary (today Bratislava, Slovakia), to a half-Hungarian father – with the same name, born in the same city – and to a Hungarian mother, Mária Ravasz. He was a Roman Catholic. His earliest teacher was his mother, an accomplished pianist, who gave him systematic instruction in the keyboard works of J. S. Bach. He received a foundation in theory from , the organist at the Franciscan church in Pressburg. He studied piano briefly with Theodor Leschetizky, with whom he clashed. He moved to Vienna with his family in 1888, and studied at the Vienna Conservatory (composition with Robert Fuchs, cello with Ferdinand Hellmesberger, and, for a few lessons, counterpoint with Anton Bruckner, who was already seriously ill at that time), graduating "with excellence" in 1896. He obtained a post as cellist with the Vienna Court Opera Orchestra, where he played until 1914, often under Gustav Mahler. Mahler habitually had Schmidt play all the cello solos, even though Friedrich Buxbaum was the principal cellist. Schmidt was also in demand as a chamber musician. Schmidt and Arnold Schoenberg maintained cordial relations despite their vast differences in eventual outlook and style. (Schmidt certainly shows a perceptible influence from Schoenberg's early, tonal works such as Verklärte Nacht, Op. 4, in whose Viennese première he participated as cellist, the Chamber Symphony No. 1, Op. 9, and the gigantic cantata Gurre-Lieder. Unable to procure a teaching position for Schoenberg at the Academy, Schmidt rehearsed his students in a performance of Pierrot Lunaire, Op. 21, which Schoenberg warmly praised.) Also a brilliant pianist, in 1914 Schmidt took up a professorship in piano at the Vienna Conservatory, which had been recently renamed Imperial Academy of Music and the Performing Arts.
(Apparently, when asked who the greatest living pianist was, Leopold Godowsky replied, "The other one is Franz Schmidt.") In 1925 he became Director of the Academy, and from 1927 to 1931 its Rector. As a teacher of piano, cello, counterpoint, and composition at the Academy, Schmidt trained numerous instrumentalists, conductors, and composers who later achieved fame. Among his best-known students were the pianist Friedrich Wührer and Alfred Rosé (son of Arnold Rosé, the founder of the Rosé Quartet, Konzertmeister of the Vienna Philharmonic and brother-in-law of Gustav Mahler). Among the composers were Walter Bricht (his favourite student), Theodor Berger, Marcel Rubin, Alfred Uhl and Ľudovít Rajter. He received many tokens of the high esteem in which he was held, notably the Order of Franz Joseph and an honorary doctorate from the University of Vienna. Schmidt's private life was in stark contrast to the success of his distinguished professional career. His first wife, Karoline Perssin (c. 1880–1943), was confined in the Vienna mental hospital Am Steinhof in 1919, and three years after his death was murdered under the Nazi euthanasia program. Their daughter Emma Schmidt Holzschuh (1902–1932, married 1929) died unexpectedly after the birth of her first child. Schmidt experienced a spiritual and physical breakdown after this, and achieved an artistic revival and resolution in his Fourth Symphony of 1933 (which he inscribed as "Requiem for my Daughter") and, especially, in his oratorio The Book with Seven Seals. His second marriage in 1923, to a successful young piano student, Margarethe Jirasek (1891–1964), for the first time brought some desperately needed stability into the private life of the artist, who was plagued by many serious health problems. Schmidt's worsening health forced his retirement from the Academy in early 1937.
In the last year of his life Austria was brought into the German Reich by the Anschluss, and Schmidt was feted by the Nazi authorities as the greatest living composer of the so-called Ostmark. He was given a commission to write a cantata entitled The German Resurrection, which, after 1945, was taken by many as a reason to brand him as having been tainted by Nazi sympathy. However, Schmidt left this composition unfinished, and in the summer and autumn of 1938, a few months before his death, set it aside to devote himself to two other commissioned works for the one-armed pianist Paul Wittgenstein: the Quintet in A major for piano left-hand, clarinet, and string trio; and the Toccata in D minor for solo piano. Schmidt died on 11 February 1939. Musical works As a composer, Schmidt was slow to develop, but his reputation, at least in Austria, saw a steady growth from the late 1890s until his death in 1939. In his music, Schmidt continued to develop the Viennese classic-romantic traditions he inherited from Schubert, Brahms, and Bruckner. He also takes forward the "gypsy" style of Liszt and Brahms. His works are monumental in form and firmly tonal in language, though quite often innovative in their designs and clearly open to some of the new developments in musical syntax initiated by Mahler and Schoenberg. Although Schmidt did not write a lot of chamber music, what he did write, in the opinion of such critics as Wilhelm Altmann, was important and of high quality. Although Schmidt's organ works may resemble others of the era in terms of length, complexity, and difficulty, they are forward-looking in being conceived for the smaller, clearer, classical-style instruments of the Orgelbewegung, which he advocated. Schmidt worked mainly in large forms, including four symphonies (1899, 1913, 1928 and 1933) and two operas: Notre Dame (1904–6) and Fredigundis (1916–21). A CD recording of Notre Dame has been available for many years, starring Dame Gwyneth Jones and James King. 
Fredigundis No fully adequate recording has been made of Schmidt's second and last opera, Fredigundis; the only release has been an "unauthorized" one in the early 1980s on the Voce label, taken from an Austrian Radio broadcast of a 1979 Vienna performance under the direction of Ernst Märzendorfer. Aside from numerous "royal fanfares" (Fredigundis held the French throne in the sixth century), the score contains some fine examples of Schmidt's transitional style between his earlier and later manner. Schmidt seldom ventured so far from traditional tonality again, and his third and final period (in the last decade-and-a-half of his life) was generally one of (at least partial) retrenchment, consolidation and the integration of the style of his opulently scored and melodious early compositions (the First Symphony, "Notre Dame") with elements of the overt experimentation seen in "Fredigundis", combined with an economy of utterance born of artistic maturity. The New Grove encyclopaedia states that Fredigundis was a critical and popular failure, which may be partly attributable to the fact that Fredigundis (Fredegund, the widow of Chilperic I) is presented as a murderous and sadistic feminine monster. Add to this some structural problems with the libretto, and the opera's failure to make headway – despite an admirable and impressive score – becomes comprehensible. The Book with Seven Seals Aside from the mature symphonies (Nos. 2–4), Schmidt's crowning achievement was the oratorio The Book with Seven Seals (1935–37), a setting of passages from the Book of Revelation. His choice of subject was prophetic: with hindsight the work appears to foretell, in the most powerful terms, the disasters that were shortly to be visited upon Europe in the Second
World War. Here his invention rises to a sustained pitch of genius. A narrative upon the text of the oratorio was provided by the composer. Schmidt's oratorio stands in the Austro-German tradition stretching back to the time of J. S. Bach and Handel. He was one of relatively few composers to write an oratorio fully on the subject of the Book of Revelation (earlier works include Georg Philipp Telemann: Der Tag des Gerichts, Schneider: Das Weltgericht, Louis Spohr: Die letzten Dinge, Joachim Raff: Weltende, and Ralph Vaughan Williams: Sancta Civitas). Far from glorifying its subject, it is a mystical contemplation, a horrified warning, and a prayer for salvation. The premiere was held in Vienna on 15 June 1938, with the Vienna Symphony Orchestra under Oswald Kabasta: the soloists were Rudolf Gerlach (John), Erika Rokyta, Enid Szánthó, Anton Dermota, Josef von Manowarda and Franz Schütz at the organ. Symphonies Schmidt is generally regarded as a conservative composer, but the rhythmic subtlety and harmonic complexity of much of his music belie this. His music combines a reverence for the Austro-German lineage of composers with innovations in harmony and orchestration (showing an awareness of the output of composers such as Debussy and Ravel, whose piano music he greatly admired, along with a knowledge of more recent composers in his own German-speaking realm, such as Schoenberg, Berg, Hindemith, etc.). Symphony No. 1 in E major.
Written in 1896 at age 22. The scherzo (which shows a mature absorption of Bruckner and Richard Strauss) is especially noteworthy, while Schmidt demonstrates his contrapuntal skills in the Finale.
Symphony No. 2 in E-flat major. Written in 1913 in a style reminiscent of Strauss and Reger, with homage to the grandiosity of Bruckner. This is Schmidt's longest symphony and it employs a huge orchestra. The central movement (of three) is an ingenious set of variations, which are grouped to suggest the characters of slow movement and scherzo. The complex scoring renders it a considerable challenge for most orchestras.
Symphony No. 3 in A major. A sunny, melodic work in the Schubert vein (although its lyricism and superb orchestration do much to conceal the fact that it is one of the composer's most harmonically advanced works). Winner of the Austrian section of the 1928 International Columbia Graphophone Competition (the overall winner was Swedish composer Kurt Atterberg with his 6th Symphony), it enjoyed some popularity at the time (1928).
Symphony No. 4 in C major. Written in 1933, this is the best-known work of his entire oeuvre. The composer called it "A requiem for my daughter". It begins with a long 23-bar melody on an unaccompanied solo trumpet (which returns at the symphony's close, "transfigured" by all that has intervened). The Adagio is an immense ABA ternary structure.
The first A is an expansive threnody on solo cello (Schmidt's own instrument) whose seamless lyricism predates Strauss's Metamorphosen by more than a decade (its theme is later adjusted to form the scherzo of the symphony); the B section is an equally expansive funeral march (unmistakably referencing the Marcia Funebre from Beethoven's Eroica in its texture) whose dramatic climax is marked by an orchestral crescendo culminating in a gong and cymbal crash (again, a clear allusion to similar climaxes in the later symphonies of Bruckner, and followed by what Harold Truscott has described as a "reverse climax", leading back to a repeat of the A section). Schmidt and Nazism Schmidt's premiere of The Book with Seven Seals was made much of by the Nazis (who had annexed Austria shortly before in the Anschluss), and Schmidt was seen to give the Nazi salute (according to a report by Georg Tintner, who revered Schmidt and whose intent to record his symphonies was never realised). His conductor Oswald Kabasta was apparently an enthusiastic Nazi who, being prohibited from conducting in 1946 during de-nazification, committed suicide. These facts long placed Schmidt's posthumous reputation under a cloud. His lifelong friend and colleague Oskar Adler, who fled the Nazis in 1938, wrote afterwards that Schmidt was never a Nazi and never antisemitic but was extremely naive about politics. Hans Keller gave a similar endorsement. Regarding Schmidt's political naivety, Michael Steinberg, in his book The Symphony, tells of Schmidt's recommending Variations on a Hebrew Theme by his student Israel Brandmann to a musical group associated with the proto-Nazi German National Party. Most of Schmidt's principal musical friends were Jews, and they benefited from his generosity. Schmidt's last listed work, the cantata Deutsche Auferstehung (German Resurrection), was composed to a Nazi text. 
As one of the most famous living Austrian composers, Schmidt was well known to Hitler and received this commission after the Anschluss. He left it unfinished, to be completed later by Robert Wagner. Already seriously ill, Schmidt worked instead on other compositions such as the Quintet in A major for piano (left hand), clarinet and string trio, intended for Paul Wittgenstein and incorporating a variation set based on a theme by Wittgenstein's old teacher, Josef Labor. His failure to complete the cantata is likely to be a further indication that he was not committed to the Nazi cause; such, at any rate, was the opinion of his friend Oskar Adler. Listing of works Operas Notre Dame, romantic Opera in two acts, text after Victor Hugo by Franz Schmidt and Leopold Wilk; comp. 1902–4, premiered Vienna 1914 Fredigundis, Opera in three acts, text after Felix Dahn; comp. 1916–21, premiered Berlin 1922 Oratorio The Book with Seven Seals (Das Buch mit sieben Siegeln) for Soli, Chorus, Organ and Orchestra, Text after the Revelation of St John; comp. 1935–37; premiered Vienna, 1938 Cantata Deutsche Auferstehung, a Festival Song for Soli, Chorus, Organ and Orchestra, Text by Oskar Dietrich; comp. 1938–39, unfinished, prepared for performance by Dr. Robert Wagner; premiered Vienna, 1940 Symphonies Symphony No. 1 in E major; comp. 1896–99, premiered Vienna 1902 Symphony No. 2 in E-flat major; comp. 1911–13, premiered Vienna 1913 Symphony No. 3 in A major; comp. 1927–28, premiered Vienna 1928 Symphony No. 4 in C major; comp. 1932–33, premiered Vienna 1934 Piano concertos Concertante Variations on a Theme of Beethoven for Piano (left hand alone) with orchestral accompaniment; comp. 1923, premiered Vienna 1924; Two-handed arrangement by Friedrich Wührer (1952) Piano Concerto in E-flat major (for left hand alone); comp. 1934, premiered Vienna 1935; Two-handed version by Friedrich Wührer (1952) Other orchestral works Carnival music and Intermezzo from the Opera Notre Dame; comp.
1902–03; premiered Vienna 1903 Variations on a Hussar Song for orchestra; comp. 1930–31; premiered Vienna 1931 Chaconne in D minor; transcribed from the Chaconne in C-sharp minor for organ from 1925; completed 1931; Manuscript Chamber music Four Little Fantasy pieces after Hungarian National Melodies, for cello with piano accompaniment; comp. 1892; premiered Vienna 1926 (three pieces) String Quartet in A major; comp. 1925; premiered Vienna 1925 String Quartet in G major; comp. 1929; premiered Vienna 1930 Quintet for piano left hand, two violins, viola and cello in G major; comp. 1926; premiered Stuttgart 1931; two-handed arrangement by Friedrich Wührer (1954) Quintet for clarinet, piano left hand, violin, viola and cello in B-flat major; comp. 1932; premiered Vienna 1933 Quintet for clarinet, piano left hand, violin, viola and cello in
known as Fucking until 2021 Fugging, Lower Austria, a village known as Fucking until 1836 See also Fakkin, abbreviation of Japanese restaurant First Kitchen Fuck
southern Finland. The strike leadership voted by a narrow majority to start a revolution on 16 November, but the uprising had to be called off the same day due to the lack of active revolutionaries to execute it. At the end of November 1917, the moderate socialists among the social democrats won a second vote over the radicals in a debate over revolutionary versus parliamentary means, but when they tried to pass a resolution to completely abandon the idea of a socialist revolution, the party representatives and several influential leaders voted it down. The Finnish labour movement wanted to sustain a military force of its own and to keep the revolutionary road open, too. The wavering Finnish socialists disappointed V. I. Lenin, and in turn he began to encourage the Finnish Bolsheviks in Petrograd. Within the labour movement, a more marked consequence of the events of 1917 was the rise of the Workers' Order Guards. There were 20–60 separate guards between 31 August and 30 September 1917, but on 20 October, after defeat in parliamentary elections, the Finnish labour movement proclaimed the need to establish more worker units. The announcement led to a rush of recruits: the number of guards was 100–150 on 31 October, 342 on 30 November 1917 and 375 on 26 January 1918. Since May 1917, the paramilitary organisations of the left had grown in two phases, the majority of them as Workers' Order Guards. The minority were Red Guards: partly underground groups formed in industrialised towns and industrial centres, such as Helsinki, Kotka and Tampere, based on the original Red Guards that had been formed during 1905–1906 in Finland. The presence of the two opposing armed forces created a state of dual power and divided sovereignty in Finnish society. The decisive rift between the guards broke out during the general strike: the Reds executed several political opponents in southern Finland and the first armed clashes between the Whites and Reds took place.
In total, 34 casualties were reported. Eventually, the political rivalries of 1917 led to an arms race and an escalation towards civil war. Independence of Finland The disintegration of Russia offered Finns an historic opportunity to gain national independence. After the October Revolution, the conservatives were eager for secession from Russia in order to control the left and minimise the influence of the Bolsheviks. The socialists were sceptical about sovereignty under conservative rule, but they feared a loss of support among nationalistic workers, particularly after having promised increased national liberty through the "Law of Supreme Power". Eventually, both political factions supported an independent Finland, despite strong disagreement over the composition of the nation's leadership. Nationalism had become a "civic religion" in Finland by the end of the nineteenth century, but the goal during the general strike of 1905 was a return to the autonomy of 1809–1898, not full independence. In comparison to the unitary Swedish regime, the domestic power of Finns had increased under the less uniform Russian rule. Economically, the Grand Duchy of Finland benefited from having an independent domestic state budget, a central bank with a national currency, the markka (introduced 1860), and its own customs organisation, as well as from the industrial progress of 1860–1916. The economy was dependent on the huge Russian market, and separation would disrupt the profitable Finnish financial zone. The economic collapse of Russia and the power struggle of the Finnish state in 1917 were among the key factors that brought sovereignty to the fore in Finland. Svinhufvud's Senate introduced Finland's Declaration of Independence on 4 December 1917 and Parliament adopted it on 6 December. The social democrats voted against the Senate's proposal, while presenting an alternative declaration of sovereignty. The establishment of an independent state was not a foregone conclusion for the small Finnish nation.
Recognition by Russia and other great powers was essential; Svinhufvud accepted that he had to negotiate with Lenin for the acknowledgement. The socialists, having been reluctant to enter talks with the Russian leadership in July 1917, sent two delegations to Petrograd to request that Lenin approve Finnish sovereignty. In December 1917, Lenin was under intense pressure from the Germans to conclude peace negotiations at Brest-Litovsk, and the Bolsheviks' rule was in crisis, with an inexperienced administration and a demoralised army facing powerful political and military opponents. Lenin calculated that the Bolsheviks could fight for central parts of Russia but had to give up some peripheral territories, including Finland in the geopolitically less important north-western corner. As a result, Svinhufvud's delegation won Lenin's concession of sovereignty on 31 December 1917. By the beginning of the Civil War, Austria-Hungary, Denmark, France, Germany, Greece, Norway, Sweden and Switzerland had recognised Finnish independence. The United Kingdom and the United States did not approve it; they waited and monitored the relations between Finland and Germany (the main enemy of the Allies), hoping to override Lenin's regime and to get Russia back into the war against the German Empire. In turn, the Germans hastened Finland's separation from Russia so as to move the country within their sphere of influence. Warfare Escalation The final escalation towards war began in early January 1918, as each military or political action of the Reds or the Whites resulted in a corresponding counteraction by the other. Both sides justified their activities as defensive measures, particularly to their own supporters. On the left, the vanguard of the movement was the urban Red Guards from Helsinki, Kotka and Turku; they led the rural Reds and convinced the socialist leaders who wavered between peace and war to support the revolution.
On the right, the vanguard was the Jägers, who had transferred to Finland, and the volunteer Civil Guards of southwestern Finland, southern Ostrobothnia and Vyborg province in the southeastern corner of Finland. The first local battles were fought during 9–21 January 1918 in southern and southeastern Finland, mainly to win the arms race and to control Vyborg. On 12 January 1918, Parliament authorised the Svinhufvud Senate to establish internal order and discipline on behalf of the state. On 15 January, Carl Gustaf Emil Mannerheim, a former Finnish general of the Imperial Russian Army, was appointed the commander-in-chief of the Civil Guards. The Senate appointed the Guards, henceforth called the White Guards, as the White Army of Finland. Mannerheim placed his Headquarters of the White Army in the Vaasa–Seinäjoki area. The White Order to engage was issued on 25 January. The Whites gained weaponry by disarming Russian garrisons during 21–28 January, in particular in southern Ostrobothnia. The Red Guards, led by Ali Aaltonen, refused to recognise the Whites' hegemony and established a military authority of their own. Aaltonen installed his headquarters in Helsinki and nicknamed it Smolna, echoing the Smolny Institute, the Bolsheviks' headquarters in Petrograd. The Red Order of Revolution was issued on 26 January, and a red lantern, a symbolic indicator of the uprising, was lit in the tower of the Helsinki Workers' House. A large-scale mobilisation of the Reds began late in the evening of 27 January, with the Helsinki Red Guard and some of the Guards located along the Vyborg–Tampere railway having been activated between 23 and 26 January, in order to safeguard vital positions and escort a heavy railroad shipment of Bolshevik weapons from Petrograd to Finland. White troops tried to capture the shipment: 20–30 Finns, Red and White, died in the Battle of Kämärä at the Karelian Isthmus on 27 January 1918. The Finnish rivalry for power had culminated.
Opposing parties Red Finland and White Finland At the beginning of the war, a discontinuous front line ran through southern Finland from west to east, dividing the country into White Finland and Red Finland. The Red Guards controlled the area to the south, including nearly all the major towns and industrial centres, along with the largest estates and farms with the highest numbers of crofters and tenant farmers. The White Army controlled the area to the north, which was predominantly agrarian and contained small or medium-sized farms and tenant farmers. The number of crofters was lower and they held a better social status than those in the south. Enclaves of the opposing forces existed on both sides of the front line: within the White area lay the industrial towns of Varkaus, Kuopio, Oulu, Raahe, Kemi and Tornio; within the Red area lay Porvoo, Kirkkonummi and Uusikaupunki. The elimination of these strongholds was a priority for both armies in February 1918. Red Finland was led by the Finnish People's Delegation, established on 28 January 1918 in Helsinki, which was supervised by the Central Workers' Council. The delegation sought democratic socialism based on the Finnish Social Democratic Party's ethos; their visions differed from Lenin's dictatorship of the proletariat. Otto Ville Kuusinen formulated a proposal for a new constitution, influenced by those of Switzerland and the United States. With it, political power was to be concentrated in Parliament, with a lesser role for the government. The proposal included a multi-party system; freedom of assembly, speech and press; and the use of referenda in political decision-making. In order to ensure the authority of the labour movement, the common people would have a right to permanent revolution. The socialists planned to transfer a substantial part of property rights to the state and local administrations. In foreign policy, Red Finland leaned on Bolshevist Russia.
A Red-initiated Finno–Russian treaty and peace agreement was signed on 1 March 1918, in which Red Finland was named the Finnish Socialist Workers' Republic. The negotiations for the treaty implied that – as in World War I in general – nationalism was more important for both sides than the principles of international socialism. The Red Finns did not simply accept an alliance with the Bolsheviks, and major disputes appeared, for example, over the demarcation of the border between Red Finland and Soviet Russia. The significance of the Russo–Finnish Treaty evaporated quickly due to the signing of the Treaty of Brest-Litovsk between the Bolsheviks and the German Empire on 3 March 1918. Lenin's policy on the right of nations to self-determination aimed at preventing the disintegration of Russia during the period of military weakness. He assumed that in war-torn, splintering Europe, the proletariat of free nations would carry out socialist revolutions and unite with Soviet Russia later. The majority of the Finnish labour movement supported Finland's independence. The Finnish Bolsheviks, influential though few in number, favoured annexation of Finland by Russia. The government of White Finland, Pehr Evind Svinhufvud's first senate, was called the Vaasa Senate after its relocation to the safer west-coast city of Vaasa, which acted as the capital of the Whites from 29 January to 3 May 1918. In domestic policy, the White Senate's main goal was to return the political right to power in Finland. The conservatives planned a monarchist political system, with a lesser role for Parliament. A section of the conservatives had always supported monarchy and opposed democracy; others had approved of parliamentarianism since the revolutionary reform of 1906, but after the crisis of 1917–1918, concluded that empowering the common people would not work. Social liberals and reformist non-socialists opposed any restriction of parliamentarianism.
They initially resisted German military help, but the prolonged warfare changed their stance. In foreign policy, the Vaasa Senate relied on the German Empire for military and political aid. Their objective was to defeat the Finnish Reds, end the influence of Bolshevist Russia in Finland, and expand Finnish territory to East Karelia, a geopolitically significant home to people speaking Finnic languages. The weakness of Russia inspired an idea of Greater Finland among the expansionist factions of both the right and left: the Reds had claims concerning the same areas. General Mannerheim agreed on the need to take over East Karelia and to request German weapons, but opposed actual German intervention in Finland. Mannerheim recognised the Red Guards' lack of combat skill and trusted in the abilities of the German-trained Finnish Jägers. As a former Russian army officer, Mannerheim was well aware of the demoralisation of the Russian army. He co-operated with White-aligned Russian officers in Finland and Russia. Soldiers and weapons The number of Finnish troops on each side varied from 70,000 to 90,000, and both had around 100,000 rifles, 300–400 machine guns and a few hundred cannons. While the Red Guards consisted mostly of volunteers, with wages paid at the beginning of the war, the White Army consisted predominantly of conscripts with 11,000–15,000 volunteers. The main motives for volunteering were socio-economic factors, such as salary and food, as well as idealism and peer pressure. The Red Guards included 2,600 women, mostly girls recruited from the industrial centres and cities of southern Finland. Urban and agricultural workers constituted the majority of the Red Guards, whereas land-owning farmers and well-educated people formed the backbone of the White Army. Both armies used child soldiers, mainly between 14 and 17 years of age.
The use of juvenile soldiers was not rare in World War I; children of the time were under the absolute authority of adults and were not shielded against exploitation. Rifles and machine guns from Imperial Russia were the main armaments of the Reds and the Whites. The most commonly used rifle was the Russian Mosin–Nagant Model 1891. In total, around ten different rifle models were in service, causing problems for ammunition supply. The Maxim gun was the most-used machine gun, along with the less-used M1895 Colt–Browning, Lewis and Madsen guns. The machine guns caused a substantial part of the casualties in combat. Russian field guns were mostly used with direct fire. The Civil War was fought primarily along railways: vital means for transporting troops and supplies, as well as for using armoured trains, equipped with light cannons and heavy machine guns. The strategically most important railway junction was Haapamäki, approximately northeast of Tampere, connecting eastern and western Finland as well as southern and northern Finland. Other critical junctions included Kouvola, Riihimäki, Tampere, Toijala and Vyborg. The Whites captured Haapamäki at the end of January 1918, leading to the Battle of Vilppula. Red Guards and Soviet troops The Finnish Red Guards seized the early initiative in the war by taking control of Helsinki on 28 January 1918 and by undertaking a general offensive lasting from February until early March 1918. The Reds were relatively well-armed, but a chronic shortage of skilled leaders, both at the command level and in the field, left them unable to capitalise on this momentum, and most of the offensives came to nothing. The military chain of command functioned relatively well at company and platoon level, but leadership and authority remained weak as most of the field commanders were chosen by the vote of the troops.
The common troops were more or less armed civilians, whose military training, discipline and combat morale were inadequate. Ali Aaltonen was replaced on 28 January 1918 by Eero Haapalainen as commander-in-chief. He, in turn, was displaced by the Bolshevik triumvirate of Eino Rahja, Adolf Taimi and Evert Eloranta on 20 March. The last commander-in-chief of the Red Guard was Kullervo Manner, from 10 April until the last period of the war, when the Reds no longer had a named leader. Some talented local commanders, such as Hugo Salmela in the Battle of Tampere, provided successful leadership, but could not change the course of the war. The Reds achieved some local victories as they retreated from southern Finland toward Russia, such as against German troops in the Battle of Syrjäntaka on 28–29 April in Tuulos. Around 50,000 soldiers of the former czar's army were stationed in Finland in January 1918. The soldiers were demoralised and war-weary, and the former serfs were thirsty for farmland set free by the revolutions. The majority of the troops returned to Russia by the end of March 1918. In total, 7,000 to 10,000 Red Russian soldiers supported the Finnish Reds, but only around 3,000, in separate, smaller units of 100–1,000 soldiers, could be persuaded to fight in the front line. The revolutions in Russia divided the Soviet army officers politically, and their attitude towards the Finnish Civil War varied. Mikhail Svechnikov led Finnish Red troops in western Finland in February, and Konstantin Yeremejev led Soviet forces on the Karelian Isthmus, while other officers were mistrustful of their revolutionary peers and instead co-operated with General Mannerheim in disarming Soviet garrisons in Finland. On 30 January 1918, Mannerheim proclaimed to Russian soldiers in Finland that the White Army did not fight against Russia, but that the objective of the White campaign was to beat the Finnish Reds and the Soviet troops supporting them.
The number of Soviet soldiers active in the civil war declined markedly once Germany attacked Russia on 18 February 1918. The German–Soviet Treaty of Brest-Litovsk of 3 March restricted the Bolsheviks' support for the Finnish Reds to weapons and supplies. The Soviets remained active on the south-eastern front, mainly in the Battle of Rautu on the Karelian Isthmus between February and April 1918, where they defended the approaches to Petrograd. White Guards and Sweden's role While the conflict has been called by some "The War of Amateurs", the White Army had two major advantages over the Red Guards: the professional military leadership of Gustaf Mannerheim and his staff, which included 84 Swedish volunteer officers and former Finnish officers of the czar's army; and 1,450 soldiers of the 1,900-strong Jäger battalion. The majority of the unit arrived in Vaasa on 25 February 1918. On the battlefield, the Jägers, battle-hardened on the Eastern Front, provided strong leadership that made disciplined combat by the common White troopers possible. The soldiers were similar to those of the Reds, having brief and inadequate training. At the beginning of the war, the White Guards' top leadership had little authority over volunteer White units, which obeyed only their local leaders. At the end of February, the Jägers began a rapid training of six conscript regiments. The Jäger battalion was politically divided, too. Four hundred and fifty – mostly socialist – Jägers remained stationed in Germany, as it was feared they were likely to side with the Reds. White Guard leaders faced a similar problem when drafting young men to the army in February 1918: 30,000 obvious supporters of the Finnish labour movement never showed up. It was also uncertain whether common troops drafted from the small and poor farms of central and northern Finland had strong enough motivation to fight the Finnish Reds.
The Whites' propaganda promoted the idea that they were fighting a defensive war against Bolshevist Russians, and belittled the role of the Red Finns among their enemies. Social divisions appeared both between southern and northern Finland and within rural Finland. The economy and society of the north had modernised more slowly than those of the south. There was a more pronounced conflict between Christianity and socialism in the north, and the ownership of farmland conferred major social status, motivating the farmers to fight against the Reds. Sweden declared neutrality both during World War I and the Finnish Civil War. General opinion, in particular among the Swedish elite, was divided between supporters of the Allies and the Central Powers, Germanism being somewhat more popular. Three war-time priorities determined the pragmatic policy of the Swedish liberal-social democratic government: sound economics, with export of iron ore and foodstuffs to Germany; sustaining the tranquillity of Swedish society; and geopolitics. The government accepted the participation of Swedish volunteer officers and soldiers in the Finnish White Army in order to block the expansion of revolutionary unrest to Scandinavia. A 1,000-strong paramilitary Swedish Brigade, led by Hjalmar Frisell, took part in the Battle of Tampere and in the fighting south of the town. In February 1918, the Swedish Navy escorted the German naval squadron transporting Finnish Jägers and German weapons, and allowed it to pass through Swedish territorial waters. The Swedish socialists tried to open peace negotiations between the Whites and the Reds. The weakness of Finland offered Sweden a chance to take over the geopolitically vital Finnish Åland Islands, east of Stockholm, but the German army's Finland operation stalled this plan. German intervention In March 1918, the German Empire intervened in the Finnish Civil War on the side of the White Army.
Finnish activists leaning on Germanism had been seeking German aid in freeing Finland from Soviet hegemony since late 1917, but because of the pressure they were facing at the Western Front, the Germans did not want to jeopardise their armistice and peace negotiations with Soviet Russia. The German stance changed after 10 February, when Leon Trotsky, despite the weakness of the Bolsheviks' position, broke off negotiations, hoping revolutions would break out in the German Empire and change everything. On 13 February, the German leadership decided to retaliate and send military detachments to Finland as well. As a pretext for aggression, the Germans invited "requests for help" from the western neighbouring countries of Russia. Representatives of White Finland in Berlin duly requested help on 14 February. The Imperial German Army attacked Russia on 18 February. The offensive led to a rapid collapse of the Soviet forces and to the signing of the first Treaty of Brest-Litovsk by the Bolsheviks on 3 March 1918. Finland, the Baltic countries, Poland and Ukraine were transferred to the German sphere of influence. The Finnish Civil War opened a low-cost access route to Fennoscandia, whose geopolitical status was altered as a Royal Navy squadron occupied the Soviet harbour of Murmansk on the Arctic Ocean on 9 March 1918. The leader of the German war effort, General Erich Ludendorff, wanted to keep Petrograd under threat of attack via the Vyborg–Narva area and to install a German-led monarchy in Finland. On 5 March 1918, a German naval squadron landed on the Åland Islands (in mid-February 1918, the islands had been occupied by a Swedish military expedition, which departed from there in May). On 3 April 1918, the 10,000-strong Baltic Sea Division, led by General Rüdiger von der Goltz, launched the main attack at Hanko, west of Helsinki.
It was followed on 7 April by Colonel Otto von Brandenstein's 3,000-strong Detachment Brandenstein taking the town of Loviisa east of Helsinki. The larger German formations advanced eastwards from Hanko and took Helsinki on 12–13 April, while Detachment Brandenstein overran the town of Lahti on 19 April. The main German detachment proceeded northwards from Helsinki and took Hyvinkää and Riihimäki on 21–22 April, followed by Hämeenlinna on 26 April. The final blow to the cause of the Finnish Reds had been dealt when the Bolsheviks broke off the peace negotiations at Brest-Litovsk, leading to the German eastern offensive in February 1918. Decisive engagements Battle of Tampere In February 1918, General Mannerheim deliberated on where to focus the general offensive of the Whites. There were two strategically vital enemy strongholds: Tampere, Finland's major industrial town in the south-west, and Vyborg, Karelia's main city. Although seizing Vyborg offered many advantages, his army's lack of combat skills and the potential for a major counterattack by the Reds in the area or in the south-west made it too risky. Mannerheim decided to strike first at Tampere, despite the fact that the town, mostly known for its working class, housed nearly 15,000 heavily armed Red Guards. He launched the main assault on 16 March 1918, at Längelmäki north-east of the town, through the right flank of the Reds' defence. At the same time, the Whites attacked through the north-western frontline Vilppula–Kuru–Kyröskoski–Suodenniemi. Although the Whites were unaccustomed to offensive warfare, some Red Guard units collapsed and retreated in panic under the weight of the offensive, while other Red detachments defended their posts to the last and were able to slow the advance of the White troops. Eventually, the Whites laid siege to Tampere. They cut off the Reds' southward connection at Lempäälä on 24 March and the westward ones at Siuro, Nokia, and Ylöjärvi on 25 March.
The Battle for Tampere was fought between 16,000 White and 14,000 Red soldiers. It was Finland's first large-scale urban battle and one of the four most decisive military engagements of the war. The fight for the area of Tampere began on 28 March, on the eve of Easter 1918, in the Kalevankangas Cemetery; the day was later called "Bloody Maundy Thursday". The White Army did not achieve a decisive victory in the fierce combat, suffering more than 50 percent losses in some of its units. The Whites had to re-organise their troops and battle plans, managing to raid the town centre in the early hours of 3 April. After a heavy, concentrated artillery barrage, the White Guards advanced from house to house and street to street, as the Red Guards retreated. In the late evening of 3 April, the Whites reached the eastern banks of the Tammerkoski rapids. The Reds' attempts to break the siege of Tampere from the outside along the Helsinki–Tampere railway failed. The Red Guards lost the western parts of the town between 4 and 5 April. The Tampere City Hall was among the last strongholds of the Reds. The battle ended on 6 April 1918 with the surrender of the Red forces in the Pyynikki and Pispala sections of Tampere. The Reds, now on the defensive, showed increased motivation to fight during the battle. General Mannerheim was compelled to deploy some of the best-trained Jäger detachments, initially meant to be conserved for later use in the Vyborg area. The Battle of Tampere was the bloodiest action of the Civil War. The White Army lost 700–900 men, including 50 Jägers, the highest number of deaths the Jäger battalion suffered in a single battle of the 1918 war. The Red Guards lost 1,000–1,500 soldiers, with a further 11,000–12,000 captured. Seventy-one civilians died, mainly due to artillery fire. The eastern parts of the city, consisting mostly of wooden buildings, were completely destroyed.
Battle of Helsinki

After peace talks between the Germans and the Finnish Reds were broken off on 11 April 1918, the battle for the capital of Finland began. At 05:00 on 12 April, around 2,000–3,000 German Baltic Sea Division soldiers, led by Colonel Hans von Tschirsky und von Bögendorff, attacked the city from the north-west, supported via the Helsinki–Turku railway. The Germans broke through the area between Munkkiniemi and Pasila, and advanced on the central-western parts of the town. The German naval squadron led by Vice Admiral Hugo Meurer blocked the city harbour, bombarded the southern town area, and landed Seebataillon marines at Katajanokka. Around 7,000 Finnish Reds defended Helsinki, but their best troops fought on other fronts of the war. The main strongholds of the Red defence were the Workers' Hall, the Helsinki railway station, the Red Headquarters at Smolna, the Senate Palace–Helsinki University area and the former Russian garrisons. By the late evening of 12 April, most of the southern parts and all of the western area of the city had been occupied by the Germans. Local Helsinki White Guards, having hidden in the city during the war, joined the battle as the Germans advanced through the town. On 13 April, German troops took over the Market Square, the Smolna, the Presidential Palace and the Senate–Ritarihuone area. Toward the end, a German brigade of 2,000–3,000 soldiers, led by Colonel Konrad Wolf, joined the battle. The unit rushed from the north into the eastern parts of Helsinki, pushing into the working-class neighbourhoods of Hermanni, Kallio and Sörnäinen. German artillery bombarded and destroyed the Workers' Hall and put out the red lantern of the Finnish revolution. The eastern parts of the town surrendered around 14:00 on 13 April, when a white flag was raised in the tower of the Kallio Church. Sporadic fighting lasted until the evening. In total, 60 Germans, 300–400 Reds and 23 White Guard troopers were killed in the battle.
Around 7,000 Reds were captured. The German army celebrated the victory with a military parade in the centre of Helsinki on 14 April 1918.

Battle of Hyvinkää

After losing Helsinki, the Red Defence Command moved to Riihimäki, where it was headed by the painter and congressman Efraim Kronqvist. The German troops, led by Major General Konrad Wolf, in turn attacked northwards from Helsinki on 15 April and conquered Klaukkala four days later, continuing from there towards Hämeenlinna. In that connection, the Battle of Hyvinkää took place in the town of Hyvinkää, in which 21 Germans and about 50 Red Guards were killed. After the battle, at least 150 of the Reds were executed by the Whites.

Battle of Lahti

On 19 April 1918, Detachment Brandenstein took over the town of Lahti. The German troops advanced from the east-southeast via Nastola, through the Mustankallio graveyard in Salpausselkä and the Russian garrisons at Hennala. The battle was minor but strategically important as it cut the connection between the western and eastern Red Guards. Local engagements broke out in the town and the surrounding area between 22 April and 1 May 1918 as several thousand western Red Guards and Red civilian refugees tried to push through on their way to Russia. The German troops were able to hold major parts of the town and halt the Red advance. In total, 600 Reds and 80 German soldiers perished, and 30,000 Reds were captured in and around Lahti.

Battle of Vyborg

After the defeat in Tampere, the Red Guards began a slow retreat eastwards. As the German army seized Helsinki, the White Army shifted its military focus to the Vyborg area, where 18,500 Whites advanced against 15,000 defending Reds. General Mannerheim's war plan had been revised as a result of the Battle for Tampere, a civilian, industrial town. He aimed to avoid new, complex city combat in Vyborg, an old military fortress. The Jäger detachments tried to tie down and destroy the Red force outside the town.
The Whites were able to cut the Reds' connection to Petrograd and weaken the troops on the Karelian Isthmus on 20–26 April, but the decisive blow remained to be dealt in Vyborg. The final attack began late on 27 April with a heavy Jäger artillery barrage. The Reds' defence collapsed gradually, and eventually the Whites conquered Patterinmäki, the Reds' symbolic last stand of the 1918 uprising, in the early hours of 29 April 1918. In total, 400 Whites died, while 500–600 Reds perished and 12,000–15,000 were captured.

Red and White terror

Both Whites and Reds carried out political violence through executions, respectively termed White Terror and Red Terror. The threshold of political violence had already been crossed by the Finnish activists during the First Period of Russification. Large-scale terror operations were born and bred in Europe during World War I, the first total war. The February and October Revolutions initiated similar violence in Finland: at first by Russian army troops executing their officers, and later between the Finnish Reds and Whites. The terror consisted on the one hand of a calculated aspect of general warfare and, on the other, of local, personal murders and corresponding acts of revenge. In the former, the commanding staff planned and organised the actions and gave orders to the lower ranks. At least a third of the Red terror and most of the White terror was centrally led. In February 1918, a Desk of Securing Occupied Areas was established by the highest-ranking White staff, and the White troops were given Instructions for Wartime Judicature, later called the Shoot on the Spot Declaration. This order authorised field commanders to execute essentially anyone they saw fit. No order by the less-organised, highest Red Guard leadership authorising Red Terror has been found; any such paper was burned, or the command was given orally.
The main goals of the terror were to destroy the command structure of the enemy; to clear and secure the areas governed and occupied by the armies; and to create shock and fear among the civilian population and the enemy soldiers. Additionally, the common troops' paramilitary nature and their lack of combat skills drove them to use political violence as a military weapon. Most of the executions were carried out by cavalry units called Flying Patrols, consisting of 10 to 80 soldiers aged 15 to 20 and led by an experienced, adult leader with absolute authority. The patrols, specialised in search-and-destroy operations and death-squad tactics, were similar to the German Sturmbataillon and Russian assault units organised during World War I. The terror achieved some of its objectives but also gave additional motivation to fight against an enemy perceived to be inhuman and cruel. Both Red and White propaganda made effective use of their opponents' actions, increasing the spiral of revenge. The Red Guards executed influential Whites, including politicians, major landowners, industrialists, police officers, civil servants and teachers, as well as White Guards. Ten priests of the Evangelical Lutheran Church and 90 moderate socialists were killed. The number of executions varied over the war months, peaking in February as the Reds secured power; March saw low counts because the Reds could not seize new areas outside the original frontlines. The numbers rose again in April as the Reds prepared to leave Finland. The two major centres of Red Terror were Toijala and Kouvola, where 300–350 Whites were executed between February and April 1918. The White Guards executed Red Guard and party leaders, Red troops, socialist members of the Finnish Parliament and local Red administrators, and those active in implementing Red Terror. The numbers varied over the months as the Whites conquered southern Finland.
Comprehensive White Terror started with the Whites' general offensive in March 1918 and increased constantly. It peaked at the end of the war, then declined and ceased after the enemy troops had been transferred to prison camps. During the high point of the executions, between the end of April and the beginning of May, 200 Reds were shot per day. White Terror was decisive against the Russian soldiers who assisted the Finnish Reds, and several Russian non-socialist civilians were killed in the Vyborg massacre, the aftermath of the Battle of Vyborg. In total, 1,650 Whites died as a result of Red Terror, while around 10,000 Reds perished through White Terror, which turned into political cleansing. The White victims were recorded precisely, while the number of Red troops executed immediately after battles remains unclear. Together with the harsh prison-camp treatment of the Reds during 1918, the executions inflicted the deepest mental scars on the Finns, regardless of their political allegiance. Some of those who carried out the killings were traumatised, a phenomenon that was later documented.

End

On 8 April 1918, after the defeat in Tampere and the German army's intervention, the People's Delegation retreated from Helsinki to Vyborg. The loss of Helsinki pushed it onwards to Petrograd on 25 April. The escape of the leadership embittered many Reds, and thousands of them tried to flee to Russia, but most of the refugees were encircled by White and German troops. In the Lahti area they surrendered on 1–2 May. The long Red caravans included women and children, who experienced a desperate, chaotic escape with severe losses due to White attacks. The scene was described as a "road of tears" for the Reds, but for the Whites, the sight of long enemy caravans heading east was a victorious moment. The Red Guards' last strongholds, between the Kouvola and Kotka area, fell by 5 May, after the Battle of Ahvenkoski.
The war of 1918 ended on 15 May 1918, when the Whites took over Fort Ino, a Russian coastal artillery base on the Karelian Isthmus, from the Russian troops. White Finland and General Mannerheim celebrated the victory with a large military parade in Helsinki on 16 May 1918. The Red Guards had been defeated. The Finnish labour movement had lost the Civil War, several of its military leaders committed suicide, and a majority of the Reds were sent to prison camps. The Vaasa Senate returned to Helsinki on 4 May 1918, but the capital was under the control of the German army. White Finland had become a protectorate of the German Empire, and General Rüdiger von der Goltz was called "the true Regent of Finland". No armistice or peace negotiations were carried out between the Whites and Reds, and an official peace treaty to end the Finnish Civil War was never signed.

Aftermath and impact

Casualties

According to a Finnish Government project (2004), the casualties of the Finnish Civil War were: died in battle, 3,414 Whites and 5,199 Reds; missing, 46 Whites and 1,767 Reds; executed, 1,424 Whites and 7,370 Reds; died in prison camps, 4 Whites and 11,652 Reds. Total deaths: 36,640.

Prison camps

The White Army and German troops captured around 80,000 Red prisoners, including 5,000 women, 1,500 children and 8,000 Russians. The largest prison camps were Suomenlinna (an island facing Helsinki), Hämeenlinna, Lahti, Riihimäki, Tammisaari, Tampere and Vyborg. The Senate decided to keep the prisoners detained until each individual's role in the Civil War had been investigated. Legislation making provision for a Treason Court was enacted on 29 May 1918. The judicature of the 145 inferior courts, led by the Supreme Treason Court, did not meet the standards of impartiality, due to the condemnatory atmosphere of White Finland. In total 76,000
The general strike increased support for the social democrats substantially. The party encompassed a higher proportion of the population than any other socialist movement in the world. The Reform of 1906 was a giant leap towards the political and social liberalisation of the common Finnish people, because the Russian House of Romanov had been the most autocratic and conservative ruler in Europe. The Finns adopted a unicameral parliamentary system, the Parliament of Finland, with universal suffrage. The number of voters increased from 126,000 to 1,273,000, including female citizens. The reform led to the social democrats obtaining about fifty percent of the popular vote, but the Czar regained his authority after the crisis of 1905. Subsequently, during the more severe programme of Russification, called "the Second Period of Oppression" by the Finns, the Czar neutralised the power of the Finnish Parliament between 1908 and 1917. He dissolved the assembly, ordered parliamentary elections almost annually, and determined the composition of the Finnish Senate, which did not correspond to the composition of Parliament. The capacity of the Finnish Parliament to solve socio-economic problems was stymied by confrontations between the largely uneducated commoners and the former estates. Another conflict festered as employers denied collective bargaining and the right of the labour unions to represent workers. The parliamentary process disappointed the labour movement, but as dominance in the Parliament and legislation was the workers' most likely way to obtain a more balanced society, they identified themselves with the state. Overall, domestic politics led to a contest for leadership of the Finnish state during the ten years before the collapse of the Russian Empire.

February Revolution

Build-up

The Second Period of Russification was halted on 15 March 1917 by the February Revolution, which removed the Czar, Nicholas II.
The collapse of Russia was caused by military defeats, war-weariness with the duration and hardships of the Great War, and the collision between the most conservative regime in Europe and a Russian people desiring modernisation. The Czar's power was transferred to the State Duma (the Russian Parliament) and the right-wing Provisional Government, but this new authority was challenged by the Petrograd Soviet (city council), leading to dual power in the country. The autonomous status of 1809–1899 was returned to the Finns by the March 1917 manifesto of the Russian Provisional Government. For the first time in history, de facto political power existed in the Parliament of Finland. The political left, consisting mainly of social democrats, covered a wide spectrum from moderate to revolutionary socialists. The political right was even more diverse, ranging from social liberals and moderate conservatives to rightist conservative elements. The four main parties were: the conservative Finnish Party; the Young Finnish Party, which included both liberals and conservatives, with the liberals divided between social liberals and economic liberals; the social-reformist, centrist Agrarian League, which drew its support mainly from peasants with small or mid-sized farms; and the conservative Swedish People's Party, which sought to retain the rights of the former nobility and the Swedish-speaking minority of Finland. During 1917, a power struggle and social disintegration interacted. The collapse of Russia induced a chain reaction of disintegration, starting from the government, military and economy, and spreading to all fields of society, such as local administration, workplaces and individual citizens. The social democrats wanted to retain the civil rights already achieved and to increase the socialists' power over society. The conservatives feared the loss of their long-held socio-economic dominance.
Both factions collaborated with their equivalents in Russia, deepening the split in the nation. The Social Democratic Party had gained an absolute majority in the parliamentary elections of 1916. A new Senate was formed in March 1917 by Oskari Tokoi, but it did not reflect the socialists' large parliamentary majority: it comprised six social democrats and six non-socialists. In theory, the Senate consisted of a broad national coalition, but in practice (with the main political groups unwilling to compromise and top politicians remaining outside of it), it proved unable to solve any major Finnish problem. After the February Revolution, political authority descended to the street level: mass meetings, strike organisations and worker-soldier councils on the left, and active organisations of employers on the right, all serving to undermine the authority of the state. The February Revolution halted the Finnish economic boom caused by the Russian war economy. The collapse in business led to unemployment and high inflation, but the employed workers gained an opportunity to resolve workplace problems. The commoners' call for the eight-hour working day, better working conditions and higher wages led to demonstrations and large-scale strikes in industry and agriculture. While the Finns had specialised in milk and butter production, the bulk of the food supply for the country depended on cereals produced in southern Russia. The cessation of cereal imports from disintegrating Russia led to food shortages in Finland. The Senate responded by introducing rationing and price controls. The farmers resisted the state controls, and thus a black market, accompanied by sharply rising food prices, formed. As a consequence, exports to the free market of the Petrograd area increased. Food supply, prices and, in the end, the fear of starvation became emotional political issues between farmers and urban workers, especially those who were unemployed.
Common people, their fears exploited by politicians and an incendiary, polarised political media, took to the streets. Despite the food shortages, no actual large-scale starvation hit southern Finland before the civil war, and the food market remained a secondary stimulator in the power struggle of the Finnish state.

Contest for leadership

The passing of the Tokoi Senate's bill called the "Law of Supreme Power" (more commonly known as valtalaki) in July 1917 triggered one of the key crises in the power struggle between the social democrats and the conservatives. The fall of the Russian Empire opened the question of who would hold sovereign political authority in the former Grand Duchy. After decades of political disappointment, the February Revolution offered the Finnish social democrats an opportunity to govern; they held the absolute majority in Parliament. The conservatives were alarmed by the continuous increase of the socialists' influence since 1899, which reached a climax in 1917. The "Law of Supreme Power" incorporated a plan by the socialists to substantially increase the authority of Parliament, as a reaction to the non-parliamentary and conservative leadership of the Finnish Senate between 1906 and 1916. The bill furthered Finnish autonomy in domestic affairs: the Russian Provisional Government was only allowed the right to control Finnish foreign and military policies. The Act was adopted with the support of the Social Democratic Party, the Agrarian League, part of the Young Finnish Party and some activists eager for Finnish sovereignty. The conservatives opposed the bill, and some of the most right-wing representatives resigned from Parliament. In Petrograd, the social democrats' plan had the backing of the Bolsheviks. They had been plotting a revolt against the Provisional Government since April 1917, and pro-Soviet demonstrations during the July Days brought matters to a head.
The Helsinki Soviet and the Regional Committee of the Finnish Soviets, led by the Bolshevik Ivar Smilga, both pledged to defend the Finnish Parliament, were it threatened with attack. However, the Provisional Government still had sufficient support in the Russian army to survive, and as the street movement waned, Vladimir Lenin fled to Karelia. In the aftermath of these events, the "Law of Supreme Power" was overruled and the social democrats eventually backed down; more Russian troops were sent to Finland and, with the co-operation and insistence of the Finnish conservatives, Parliament was dissolved and new elections announced. In the October 1917 elections, the social democrats lost their absolute majority, which radicalised the labour movement and decreased support for moderate politics. The crisis of July 1917 did not bring about the Red Revolution of January 1918 on its own, but together with political developments based on the commoners' interpretation of the ideas of Fennomania and socialism, the events favoured a Finnish revolution. In order to win power, the socialists had to overcome Parliament. The February Revolution had resulted in a loss of institutional authority in Finland and the dissolution of the police force, creating fear and uncertainty. In response, both the right and the left assembled their own security groups, which were initially local and largely unarmed. By late 1917, following the dissolution of Parliament, in the absence of a strong government and national armed forces, the security groups began assuming a broader and more paramilitary character. The Civil Guards and the later White Guards were organised by local men of influence: conservative academics, industrialists, major landowners and activists. The Workers' Order Guards and the Red Guards were recruited through the local social democratic party sections and from the labour unions.
October Revolution

The Bolsheviks' and Vladimir Lenin's October Revolution of 7 November 1917 transferred political power in Petrograd to the radical, left-wing socialists. The German government's decision to arrange safe conduct for Lenin and his comrades from exile in Switzerland to Petrograd in April 1917 had been a success. An armistice between Germany and the Bolshevik regime came into force on 6 December, and peace negotiations began on 22 December 1917 at Brest-Litovsk. November 1917 became another watershed in the 1917–1918 rivalry for the leadership of Finland. After the dissolution of the Finnish Parliament, polarisation between the social democrats and the conservatives increased markedly, and the period witnessed the appearance of political violence. An agricultural worker was shot during a local strike on 9 August 1917 at Ypäjä, and a Civil Guard member was killed in a local political crisis at Malmi on 24 September. The October Revolution disrupted the informal truce between the Finnish non-socialists and the Russian Provisional Government. After political wrangling over how to react to the revolt, the majority of the politicians accepted a compromise proposal by Santeri Alkio, the leader of the Agrarian League. Parliament seized the sovereign power in Finland on 15 November 1917, based on the socialists' "Law of Supreme Power", and ratified the socialists' July 1917 proposals for an eight-hour working day and universal suffrage in local elections. The purely non-socialist, conservative-led government of Pehr Evind Svinhufvud was appointed on 27 November. This nomination was both a long-term aim of the conservatives and a response to the challenges of the labour movement during November 1917. Svinhufvud's main aspirations were to separate Finland from Russia, to strengthen the Civil Guards, and to return a part of Parliament's new authority to the Senate.
There were 149 Civil Guards in Finland on 31 August 1917, counting local units and subsidiary White Guards in towns and rural communes; 251 on 30 September; 315 on 31 October; 380 on 30 November; and 408 on 26 January 1918. The first attempt at serious military training among the Guards was the establishment of a 200-strong cavalry school at the Saksanniemi estate in the vicinity of the town of Porvoo, in September 1917. The vanguard of the Finnish Jägers and German weaponry arrived in Finland during October–November 1917 by freighter and German U-boat; around 50 Jägers had returned by the end of 1917. After the political defeats of July and October 1917, the social democrats put forward an uncompromising programme called "We Demand" on 1 November, in order to push for political concessions. They insisted upon a return to the political status quo before the dissolution of Parliament in July 1917, the disbandment of the Civil Guards, and elections to establish a Finnish Constituent Assembly. The programme failed, and the socialists initiated a general strike during 14–19 November to increase political pressure on the conservatives, who had opposed the "Law of Supreme Power" and the parliamentary proclamation of sovereign power on 15 November. Revolution became the goal of the radicalised socialists after the loss of political control, and the events of November 1917 offered momentum for a socialist uprising. In this phase, Lenin and Joseph Stalin, under threat in Petrograd, urged the social democrats to take power in Finland. The majority of Finnish socialists were moderate and preferred parliamentary methods, prompting the Bolsheviks to label them "reluctant revolutionaries". The reluctance diminished as the general strike appeared to offer a major channel of influence for the workers in southern Finland.
The strike leadership voted by a narrow majority to start a revolution on 16 November, but the uprising had to be called off the same day due to the lack of active revolutionaries to execute it. At the end of November 1917, the moderate socialists among the social democrats won a second vote over the radicals in a debate over revolutionary versus parliamentary means, but when they tried to pass a resolution to completely abandon the idea of a socialist revolution, the party representatives and several influential leaders voted it down. The Finnish labour movement wanted to sustain a military force of its own and to keep the revolutionary road open, too. The wavering Finnish socialists disappointed Lenin, and in turn he began to encourage the Finnish Bolsheviks in Petrograd. Among the labour movement, a more marked consequence of the events of 1917 was the rise of the Workers' Order Guards. There were 20–60 separate guards between 31 August and 30 September 1917, but on 20 October, after the defeat in the parliamentary elections, the Finnish labour movement proclaimed the need to establish more worker units. The announcement led to a rush of recruits: on 31 October the number of guards was 100–150; 342 on 30 November 1917; and 375 on 26 January 1918. Since May 1917, the paramilitary organisations of the left had grown in two phases, the majority of them as Workers' Order Guards. The minority were Red Guards: partly underground groups formed in industrialised towns and industrial centres such as Helsinki, Kotka and Tampere, based on the original Red Guards that had been formed in Finland during 1905–1906. The presence of the two opposing armed forces created a state of dual power and divided sovereignty in Finnish society. The decisive rift between the guards broke out during the general strike: the Reds executed several political opponents in southern Finland, and the first armed clashes between the Whites and Reds took place.
In total, 34 casualties were reported. Eventually, the political rivalries of 1917 led to an arms race and an escalation towards civil war.

Independence of Finland

The disintegration of Russia offered the Finns a historic opportunity to gain national independence. After the October Revolution, the conservatives were eager for secession from Russia in order to control the left and minimise the influence of the Bolsheviks. The socialists were sceptical about sovereignty under conservative rule, but they feared a loss of support among nationalistic workers, particularly after having promised increased national liberty through the "Law of Supreme Power". Eventually, both political factions supported an independent Finland, despite strong disagreement over the composition of the nation's leadership. Nationalism had become a "civic religion" in Finland by the end of the nineteenth century, but the goal during the general strike of 1905 was a return to the autonomy of 1809–1898, not full independence. In comparison to the unitary Swedish regime, the domestic power of the Finns had increased under the less uniform Russian rule. Economically, the Grand Duchy of Finland benefited from having an independent domestic state budget, a central bank with a national currency, the markka (introduced in 1860), and a customs organisation, as well as from the industrial progress of 1860–1916. The economy was dependent on the huge Russian market, and separation would disrupt the profitable Finnish financial zone. The economic collapse of Russia and the power struggle of the Finnish state in 1917 were among the key factors that brought sovereignty to the fore in Finland. Svinhufvud's Senate introduced Finland's Declaration of Independence on 4 December 1917, and Parliament adopted it on 6 December. The social democrats voted against the Senate's proposal, while presenting an alternative declaration of sovereignty. The establishment of an independent state was not a guaranteed conclusion for the small Finnish nation.
Recognition by Russia and the other great powers was essential; Svinhufvud accepted that he had to negotiate with Lenin for the acknowledgement. The socialists, having been reluctant to enter talks with the Russian leadership in July 1917, sent two delegations to Petrograd to request that Lenin approve Finnish sovereignty. In December 1917, Lenin was under intense pressure from the Germans to conclude the peace negotiations at Brest-Litovsk, and the Bolsheviks' rule was in crisis, with an inexperienced administration and a demoralised army facing powerful political and military opponents. Lenin calculated that the Bolsheviks could fight for the central parts of Russia but had to give up some peripheral territories, including Finland in the geopolitically less important north-western corner. As a result, Svinhufvud's delegation won Lenin's concession of sovereignty on 31 December 1917. By the beginning of the Civil War, Austria-Hungary, Denmark, France, Germany, Greece, Norway, Sweden and Switzerland had recognised Finnish independence. The United Kingdom and the United States did not approve it; they waited and monitored the relations between Finland and Germany (the main enemy of the Allies), hoping to overthrow Lenin's regime and to get Russia back into the war against the German Empire. In turn, the Germans hastened Finland's separation from Russia so as to move the country into their sphere of influence.

Warfare

Escalation

The final escalation towards war began in early January 1918, as each military or political action of the Reds or the Whites resulted in a corresponding counteraction by the other. Both sides justified their activities as defensive measures, particularly to their own supporters. On the left, the vanguard of the movement was the urban Red Guards from Helsinki, Kotka and Turku; they led the rural Reds and convinced the socialist leaders who wavered between peace and war to support the revolution.
On the right, the vanguard was the Jägers, who had transferred to Finland, and the volunteer Civil Guards of south-western Finland, southern Ostrobothnia and Vyborg province in the south-eastern corner of Finland. The first local battles were fought during 9–21 January 1918 in southern and south-eastern Finland, mainly to win the arms race and to control Vyborg. On 12 January 1918, Parliament authorised the Svinhufvud Senate to establish internal order and discipline on behalf of the state. On 15 January, Carl Gustaf Emil Mannerheim, a former Finnish general of the Imperial Russian Army, was appointed commander-in-chief of the Civil Guards. The Senate appointed the Guards, henceforth called the White Guards, as the White Army of Finland. Mannerheim placed his Headquarters of the White Army in the Vaasa–Seinäjoki area. The White Order to engage was issued on 25 January. The Whites gained weaponry by disarming Russian garrisons during 21–28 January, in particular in southern Ostrobothnia. The Red Guards, led by Ali Aaltonen, refused to recognise the Whites' hegemony and established a military authority of their own. Aaltonen installed his headquarters in Helsinki and nicknamed it Smolna, echoing the Smolny Institute, the Bolsheviks' headquarters in Petrograd. The Red Order of Revolution was issued on 26 January, and a red lantern, a symbolic indicator of the uprising, was lit in the tower of the Helsinki Workers' House. A large-scale mobilisation of the Reds began late in the evening of 27 January, with the Helsinki Red Guard and some of the Guards located along the Vyborg–Tampere railway having been activated between 23 and 26 January, in order to safeguard vital positions and escort a heavy railroad shipment of Bolshevik weapons from Petrograd to Finland. White troops tried to capture the shipment: 20–30 Finns, Red and White, died in the Battle of Kämärä on the Karelian Isthmus on 27 January 1918. The Finnish rivalry for power had culminated.
Opposing parties Red Finland and White Finland At the beginning of the war, a discontinuous front line ran through southern Finland from west to east, dividing the country into White Finland and Red Finland. The Red Guards controlled the area to the south, including nearly all the major towns and industrial centres, along with the largest estates and farms with the highest numbers of crofters and tenant farmers. The White Army controlled the area to the north, which was predominantly agrarian and contained small or medium-sized farms and tenant farmers. The number of crofters was lower and they held a better social status than those in the south. Enclaves of the opposing forces existed on both sides of the front line: within the White area lay the industrial towns of Varkaus, Kuopio, Oulu, Raahe, Kemi and Tornio; within the Red area lay Porvoo, Kirkkonummi and Uusikaupunki. The elimination of these strongholds was a priority for both armies in February 1918. Red Finland was led by the Finnish People's Delegation, established on 28 January 1918 in Helsinki, which was supervised by the Central Workers' Council. The delegation sought democratic socialism based on the Finnish Social Democratic Party's ethos; their visions differed from Lenin's dictatorship of the proletariat. Otto Ville Kuusinen formulated a proposal for a new constitution, influenced by those of Switzerland and the United States. With it, political power was to be concentrated in Parliament, with a lesser role for a government. The proposal included a multi-party system; freedom of assembly, speech and press; and the use of referenda in political decision-making. In order to ensure the authority of the labour movement, the common people would have a right to permanent revolution. The socialists planned to transfer a substantial part of property rights to the state and local administrations. In foreign policy, Red Finland leaned on Bolshevist Russia.
A Red-initiated Finno–Russian treaty and peace agreement was signed on 1 March 1918, in which Red Finland was called the Finnish Socialist Workers' Republic. The negotiations for the treaty implied that, as in World War I in general, nationalism was more important for both sides than the principles of international socialism. The Red Finns did not simply accept an alliance with the Bolsheviks, and major disputes appeared, for example, over the demarcation of the border between Red Finland and Soviet Russia. The significance of the Russo–Finnish Treaty evaporated quickly due to the signing of the Treaty of Brest-Litovsk between the Bolsheviks and the German Empire on 3 March 1918. Lenin's policy on the right of nations to self-determination aimed at preventing the disintegration of Russia during the period of military weakness. He assumed that in war-torn, splintering Europe, the proletariat of free nations would carry out socialist revolutions and unite with Soviet Russia later. The majority of the Finnish labour movement supported Finland's independence. The Finnish Bolsheviks, influential though few in number, favoured annexation of Finland by Russia. The government of White Finland, Pehr Evind Svinhufvud's first senate, was called the Vaasa Senate after its relocation to the safer west-coast city of Vaasa, which acted as the capital of the Whites from 29 January to 3 May 1918. In domestic policy, the White Senate's main goal was to return the political right to power in Finland. The conservatives planned a monarchist political system, with a lesser role for Parliament. A section of the conservatives had always supported monarchy and opposed democracy; others had approved of parliamentarianism since the revolutionary reform of 1906, but after the crisis of 1917–1918, concluded that empowering the common people would not work. Social liberals and reformist non-socialists opposed any restriction of parliamentarianism.
They initially resisted German military help, but the prolonged warfare changed their stance. In foreign policy, the Vaasa Senate relied on the German Empire for military and political aid. Their objective was to defeat the Finnish Reds, end the influence of Bolshevist Russia in Finland and expand Finnish territory to East Karelia, a geopolitically significant region home to people speaking Finnic languages. The weakness of Russia inspired an idea of Greater Finland among the expansionist factions of both the right and left: the Reds had claims concerning the same areas. General Mannerheim agreed on the need to take over East Karelia and to request German weapons, but opposed actual German intervention in Finland. Mannerheim recognised the Red Guards' lack of combat skill and trusted in the abilities of the German-trained Finnish Jägers. As a former Russian army officer, Mannerheim was well aware of the demoralisation of the Russian army. He co-operated with White-aligned Russian officers in Finland and Russia. Soldiers and weapons The number of Finnish troops on each side varied from 70,000 to 90,000 and both had
Intelligence? further expanded on this theory. Environmental changes resulting from modernization—such as more intellectually demanding work, greater use of technology, and smaller families—have meant that a much larger proportion of people are more accustomed to manipulating abstract concepts such as hypotheses and categories than a century ago. Substantial portions of IQ tests deal with these abilities. Flynn gives, as an example, the question 'What do a dog and a rabbit have in common?' A modern respondent might say they are both mammals (an abstract, or a priori answer, which depends only on the meanings of the words dog and rabbit), whereas someone a century ago might have said that humans catch rabbits with dogs (a concrete, or a posteriori answer, which depended on what happened to be the case at that time). Nutrition Improved nutrition is another possible explanation. Today's average adult from an industrialized nation is taller than a comparable adult of a century ago. That increase of stature, likely the result of general improvements in nutrition and health, has been at a rate of more than a centimeter per decade. Available data suggest that these gains have been accompanied by analogous increases in head size, and by an increase in the average size of the brain. This argument had been thought to suffer the difficulty that groups who tend to be of smaller overall body size (e.g. women, or people of Asian ancestry) do not have lower average IQs. A 2005 study presented data supporting the nutrition hypothesis, which predicts that gains will occur predominantly at the low end of the IQ distribution, where nutritional deprivation is probably most severe. An alternative interpretation of skewed IQ gains could be that improved education has been particularly important for this group. 
Richard Lynn makes the case for nutrition, arguing that cultural factors cannot typically explain the Flynn effect because its gains are observed even at infant and preschool levels, with rates of IQ test score increase about equal to those of school students and adults. Lynn states that this "rules out improvements in education, greater test sophistication, etc., and most of the other factors that have been proposed to explain the Flynn effect", and he proposes that the most probable factor has been improvements in pre-natal and early post-natal nutrition. A century ago, nutritional deficiencies may have limited body and organ functionality, including skull volume. The first two years of life are a critical time for nutrition. The consequences of malnutrition can be irreversible and may include poor cognitive development, educability, and future economic productivity. On the other hand, Flynn has pointed to 20-point gains on Dutch military (Raven's type) IQ tests between 1952, 1962, 1972, and 1982. He observes that the Dutch 18-year-olds of 1962 had a major nutritional handicap: they were either in the womb or recently born during the great Dutch famine of 1944, when German troops monopolized food and 18,000 people died of starvation. Yet, concludes Flynn, "they do not show up even as a blip in the pattern of Dutch IQ gains. It is as if the famine had never occurred." It appears that the effects of diet are gradual, taking effect over decades (affecting the mother as well as the child) rather than over a few months. In support of the nutritional hypothesis, it is known that, in the United States, the average height before 1900 was about 10 cm (~4 inches) shorter than it is today. Possibly related to the Flynn effect is a similar change of skull size and shape during the last 150 years. A Norwegian study found that height gains were strongly correlated with intelligence gains until the cessation of height gains in military conscript cohorts towards the end of the 1980s.
Both height and skull size increases probably result from a combination of phenotypic plasticity and genetic selection over this period. With only five or six human generations in 150 years, time for natural selection has been very limited, suggesting that increased skeletal size resulting from changes in population phenotypes is more likely than recent genetic evolution. It is well known that micronutrient deficiencies change the development of intelligence. For instance, one study has found that iodine deficiency causes a fall, on average, of 12 IQ points in China. Scientists James Feyrer, Dimitra Politi, and David N. Weil have found in the U.S. that the proliferation of iodized salt increased IQ by 15 points in some areas. Journalist Max Nisen has stated that, with this type of salt becoming popular, "the aggregate effect has been extremely positive." Daley et al. (2003) found a significant Flynn effect among children in rural Kenya, and concluded that nutrition was one of the hypothesized explanations that best explained their results (the others were parental literacy and family structure). Infectious diseases Eppig, Fincher, and Thornhill (2009) argue that "From an energetics standpoint, a developing human will have difficulty building a brain and fighting off infectious diseases at the same time, as both are very metabolically costly tasks" and that "the Flynn effect may be caused in part by the decrease in the intensity of infectious diseases as nations develop." They suggest that improvements in gross domestic product (GDP), education, literacy, and nutrition may have an effect on IQ mainly through reducing the intensity of infectious diseases. Eppig, Fincher, and Thornhill (2011), in a similar study looking instead at different US states, found that states with a higher prevalence of infectious diseases had lower average IQ. The effect remained after controlling for the effects of wealth and educational variation.
Atheendar Venkataramani (2010) studied the effect of malaria on IQ in a sample of Mexicans. Malaria eradication during the birth year was associated with increases in IQ. It also increased the probability of employment in a skilled occupation. The author suggests that this may be one explanation for the Flynn effect and that this may be an important explanation for the link between national malaria burden and economic development. A literature review of 44 papers states that cognitive abilities and school performance were shown to be impaired in sub-groups of patients (with either cerebral malaria or uncomplicated malaria) when compared with healthy controls. Studies comparing cognitive functions before and after treatment for acute malarial illness continued to show significantly impaired school performance and cognitive abilities even after recovery. Malaria prophylaxis was shown to improve cognitive function and school performance in clinical trials when compared to placebo groups. Heterosis Heterosis, or hybrid vigor associated with historical reductions of the levels of inbreeding, has been proposed by Michael Mingroni as an alternative explanation of the Flynn effect. However, James Flynn has pointed out that even if everyone mated with a sibling in 1900, subsequent increases in heterosis would not be a sufficient explanation of the observed IQ gains. Reduction of lead in gasoline One study found the drop in blood lead levels in the United States from the 1970s to 2007 correlated with a 4-5 point increase in IQ. Possible end of progression Jon Martin Sundet and colleagues (2004) examined scores on intelligence tests given to Norwegian conscripts between the 1950s and 2002. They found that the increase of scores of general intelligence stopped after the mid-1990s and declined in numerical reasoning sub-tests. Teasdale and Owen (2005) examined the results of IQ tests given to Danish male conscripts. Between 1959 and 1979 the gains were 3 points per decade. 
Between 1979 and 1989 the increase approached 2 IQ points. Between 1989 and 1998 the gain was about 1.3 points. Between 1998 and 2004 IQ declined by about the same amount as it gained between 1989 and 1998. They speculate that "a contributing factor in this recent fall could be a simultaneous decline in proportions of students entering 3-year advanced-level school programs for 16–18-year-olds." The same authors in a more comprehensive 2008 study, again on Danish male conscripts, found that there was a 1.5-point increase between 1988 and 1998, but a 1.5-point decrease between 1998 and 2003/2004. A possible contributing factor to the more recent decline may be the changes in the Danish educational system. Another may be the rising proportion of immigrants or their immediate descendants in Denmark. This is supported by data on Danish draftees where first or second-generation immigrants with Danish nationality score below average. In Australia, the IQ of 6–12 year olds as measured by the Colored Progressive Matrices has shown no increase from 1975 to 2003. In the United Kingdom, a study by Flynn (2009) found that tests carried out in 1980 and again in 2008 show that the IQ score of an average 14-year-old dropped by more than two points over the period. For the upper half of the results, the performance was even worse. Average IQ scores declined by six points. However, children aged between five and 10 saw their IQs increase by up to half a point a year over the three decades. Flynn argues that the abnormal drop in British teenage IQ could be due to youth culture having "stagnated" or even dumbed down. He also states that the youth culture is more oriented towards computer games than towards reading and holding conversations. Researcher Richard House, commenting on the study, also mentions the computer culture diminishing reading books as well as a tendency towards teaching to the test. Stefansson et al. 
(2017) argue for a decline in polygenic scores pertaining to educational attainment in Icelandic individuals born from 1910 to 1990. They point out, however, that the observed effect is extremely small, and may only be of concern if the trend is assumed to be larger in genomic effect and to continue across centuries. Bratsberg & Rogeberg (2018) present evidence that the Flynn effect in Norway has reversed, and that both the original rise in mean IQ scores and their subsequent decline were caused by environmental factors. They conclude that environmental factors explain all or almost all of the decline, and that the hypothesized declines in genotypic IQ are negligible, although they "cannot rule out the theoretical possibility of negative selection on a genetic component that is masked when assessed using environmentally influenced measures", and so cannot rule out the decline posited by Stefansson et al. One possible explanation of a worldwide decline in intelligence, suggested by the World Health Organization and the Forum of International Respiratory Societies' Environmental Committee, is an increase in air pollution, which now affects over 90% of the world's population. IQ group differences If the Flynn effect has ended in developed nations but continues in less developed ones, this would tend to diminish national differences in IQ scores. Also, if the Flynn effect has ended for the majority in developed nations, it may still continue for minorities, especially for groups like immigrants where many may have received poor nutrition during early childhood or have had other disadvantages. A study in the Netherlands found that children of non-Western immigrants showed improvements in g, educational achievements, and work proficiency compared to their parents, although differences compared to ethnic Dutch remained.
In the United States, the IQ gap between black and white people was gradually closing over the last decades of the 20th century, as black test-takers increased their average scores relative to white test-takers. For instance, Vincent reported in 1991 that the black–white IQ gap was decreasing among children, but that it was remaining constant among adults. Similarly, a 2006 study by Dickens and Flynn estimated that the difference between mean scores of black people and white people closed by about 5 or 6 IQ points between 1972 and 2002, a reduction of about one-third. In the same period, the educational achievement disparity also diminished. Reviews by Flynn and Dickens, Mackintosh, and Nisbett et al. all concluded that the gradual closing of the gap was a real phenomenon. Flynn has commented that he never claimed that the Flynn effect has the same causes as the black-white gap, but that it shows that environmental factors can create IQ differences of a magnitude similar to the gap. A meta-analysis which examined
reported for semantic and episodic memory. Some research suggests that there may be an ongoing reversed Flynn effect (i.e., a decline in IQ scores) in Norway, Denmark, Australia, Britain, the Netherlands, Sweden, Finland, and German-speaking countries, a development which appears to have started in the 1990s. In certain cases, this apparent reversal may be due to cultural changes which render parts of intelligence tests obsolete. Meta-analyses indicate that, overall, the Flynn effect continues, either at the same rate or at a slower rate in developed countries. Origin of term The Flynn effect is named for James R. Flynn, who did much to document it and promote awareness of its implications. The term itself was coined by Richard Herrnstein and Charles Murray in their 1994 book The Bell Curve. Although the general term for the phenomenon—referring to no researcher in particular—continues to be "secular rise in IQ scores", many textbooks on psychology and IQ testing have now followed the lead of Herrnstein and Murray in calling the phenomenon the Flynn effect. Rise in IQ IQ tests are updated periodically. For example, the Wechsler Intelligence Scale for Children (WISC), originally developed in 1949, was updated in 1974, 1991, 2003, and again in 2014. The revised versions are standardized based on the performance of test-takers in standardization samples. A standard score of IQ 100 is defined as the median performance of the standardization sample. Thus one way to see changes in norms over time is to conduct a study in which the same test-takers take both an old and new version of the same test. Doing so confirms IQ gains over time. Some IQ tests, for example, tests used for military draftees in NATO countries in Europe, report raw scores, and those also confirm a trend of rising scores over time. The average rate of increase seems to be about three IQ points per decade in the United States, as scaled by the Wechsler tests. 
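The standardization procedure described above can be illustrated with a toy calculation. A deviation IQ expresses a raw score's position in the standardization sample, scaled to a mean of 100 and a standard deviation of 15; the raw scores and norm figures below are invented purely for illustration:

```python
# Toy illustration of deviation-IQ norming (all numbers invented).
# A raw score is converted to IQ via its z-score in the
# standardization sample: IQ = 100 + 15 * z.

def iq(raw, norm_mean, norm_sd):
    """Deviation IQ of a raw score under a given norming sample."""
    return 100 + 15 * (raw - norm_mean) / norm_sd

raw_score = 40          # hypothetical raw test score
old_norms = (34, 8)     # mean, SD of an older standardization sample
new_norms = (40, 8)     # a later cohort scores higher on the same items

print(iq(raw_score, *old_norms))  # 111.25 -- above average by old norms
print(iq(raw_score, *new_norms))  # 100.0  -- exactly average by new norms
```

The same raw performance yields a lower IQ against newer norms, which is how successive re-standardizations expose the Flynn effect even though every cohort's mean is defined as 100 at the time of norming.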
The increasing test performance over time appears on every major test, in every age range, at every ability level, and in every modern industrialized country, although not necessarily at the same rate as in the United States. The increase was continuous and roughly linear from the earliest days of testing to the mid-1990s. Though the effect is most associated with IQ increases, a similar effect has been found with increases in attention and in semantic and episodic memory. Ulric Neisser estimated that, using the IQ values of 1997, the average IQ of the United States in 1932, according to the first Stanford–Binet Intelligence Scales standardization sample, was 80. Neisser states that "Hardly any of them would have scored 'very superior', but nearly one-quarter would have appeared to be 'deficient.'" He also wrote that "Test scores are certainly going up all over the world, but whether intelligence itself has risen remains controversial." Trahan et al. (2014) found that the effect was about 2.93 points per decade, based on both Stanford–Binet and Wechsler tests; they also found no evidence the effect was diminishing. In contrast, Pietschnig and Voracek (2015) reported, in their meta-analysis of studies involving nearly 4 million participants, that the Flynn effect had decreased in recent decades. They also reported that the magnitude of the effect was different for different types of intelligence ("0.41, 0.30, 0.28, and 0.21 IQ points annually for fluid, spatial, full-scale, and crystallized IQ test performance, respectively"), and that the effect was stronger for adults than for children. Raven (2000) found that, as Flynn suggested, data interpreted as showing a decrease in many abilities with increasing age must be re-interpreted as showing that there has been a dramatic increase of these abilities with the date of birth. On many tests this occurs at all levels of ability.
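As a back-of-the-envelope check on Neisser's figure, a gain of roughly three points per decade compounds linearly to about 20 points over the 65 years between the 1932 standardization and the 1997 norms. The sketch below is a simple linear extrapolation, not the actual renorming method used in the studies:

```python
# Linear back-projection of a past cohort's mean IQ under a
# constant Flynn effect (a simplifying assumption for illustration).
GAIN_PER_DECADE = 3.0        # approximate US rate on Wechsler tests

def mean_iq_in(year, ref_year=1997, ref_iq=100.0):
    """Mean IQ of the cohort of `year`, expressed on ref_year norms."""
    decades = (ref_year - year) / 10
    return ref_iq - GAIN_PER_DECADE * decades

print(mean_iq_in(1932))   # 80.5 -- close to Neisser's estimate of 80
```

The result of about 80 on modern norms is what the text reports: average 1932 test-takers would appear borderline "deficient" by 1997 standards.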
Some studies have found the gains of the Flynn effect to be particularly concentrated at the lower end of the distribution. Teasdale and Owen (1989), for example, found the effect primarily reduced the number of low-end scores, resulting in an increased number of moderately high scores, with no increase in very high scores. In another study, two large samples of Spanish children were assessed with a 30-year gap. Comparison of the IQ distributions indicated that the mean IQ scores on the test had increased by 9.7 points (the Flynn effect), the gains were concentrated in the lower half of the distribution and negligible in the top half, and the gains gradually decreased as the IQ of the individuals increased. Some studies have found a reverse Flynn effect with declining scores for those with high IQ. In 1987, Flynn took the position that the very large increase indicates that IQ tests do not measure intelligence but only a minor sort of "abstract problem-solving ability" with little practical significance. He argued that if IQ gains do reflect intelligence increases, there would have been consequent changes of our society that have not been observed (a presumed non-occurrence of a "cultural renaissance"). Flynn no longer endorses this view of intelligence and has since elaborated and refined his view of what rising IQ scores mean. Precursors to Flynn's publications Earlier investigators had discovered rises in raw IQ test scores in some study populations, but had not published general investigations of that issue in particular. Historian Daniel C. Calhoun cited earlier psychology literature on IQ score trends in his book The Intelligence of a People (1973). R. L. Thorndike drew attention to rises in Stanford-Binet scores in a 1975 review of the history of intelligence testing. Richard Lynn recorded an increase in Japanese IQ in 1982. 
Intelligence There is debate about whether the rise in IQ scores also corresponds to a rise in general intelligence, or only a rise in special skills related to taking IQ tests. Because children attend school longer now and have become much more familiar with the testing of school-related material, one might expect the greatest gains to occur on such school content-related tests as vocabulary, arithmetic or general information. Just the opposite is the case: abilities such as these have experienced relatively small gains and even occasional decreases over the years. Meta-analytic findings indicate that Flynn effects occur for tests assessing both fluid and crystallized abilities. For example, Dutch conscripts gained 21 points during only 30 years, or 7 points per decade, between 1952 and 1982. But this rise in IQ test scores is not wholly explained by an increase in general intelligence. Studies have shown that while test scores have improved over time, the improvement is not fully correlated with latent factors related to intelligence. Rushton argues that the gains in IQ over time are unrelated to general intelligence. Other researchers argue that the IQ gains described by the Flynn effect are due in part to increasing intelligence, and in part to increases
tip causes a natural magnification — ions are repelled in a direction roughly perpendicular to the surface (a "point projection" effect). A detector is placed so as to collect these repelled ions; the image formed from all the collected ions can be of sufficient resolution to image individual atoms on the tip surface. Unlike conventional microscopes, where the spatial resolution is limited by the wavelength of the particles which are used for imaging, the FIM is a projection type microscope with atomic resolution and an approximate magnification of a few million times. Design, limitations and applications FIM like Field Emission Microscopy (FEM) consists of a sharp sample tip and a fluorescent screen (now replaced by a multichannel plate) as the key elements. However, there are some essential differences as follows: The tip potential is positive. The chamber is filled with an imaging gas (typically, He or Ne at 10−5 to 10−3 Torr). The tip is cooled to low temperatures (~20-80K). Like FEM, the field strength at the tip apex is typically a few V/Å. The experimental set-up and image formation in FIM is illustrated in the accompanying figures. In FIM the presence of a strong field is critical. The imaging gas atoms (He, Ne) near the tip are polarized by the field and since the field is non-uniform the polarized atoms are attracted towards the tip surface. The imaging atoms then lose their kinetic energy performing a series of hops and accommodate to the tip temperature. Eventually, the
imaging atoms are ionized by tunneling electrons into the surface and the resulting positive ions are accelerated along the field lines to the screen to form a highly magnified image of the sample tip. In FIM, the ionization takes place close to the tip, where the field is strongest. The electron that tunnels from the atom is picked up by the tip. There is a critical distance, xc, at which the tunneling probability is a maximum. This distance is typically about 0.4 nm.
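The "few million times" magnification quoted above can be estimated with the standard point-projection relation M ≈ R/(βr), where R is the tip-to-screen distance, r the tip apex radius, and β an image-compression factor of roughly 1.5 that accounts for the non-radial curvature of the ion trajectories. The numerical values below are illustrative assumptions, not figures from the text:

```python
# Rough point-projection magnification of a field ion microscope.
# M ~ R / (beta * r): ions follow near-radial field lines from the
# tip apex (radius r) out to a screen a distance R away; beta (~1.5)
# corrects for compression of the trajectories.

def fim_magnification(screen_distance_m, tip_radius_m, beta=1.5):
    return screen_distance_m / (beta * tip_radius_m)

# Illustrative values: 5 cm tip-screen distance, 25 nm tip radius.
M = fim_magnification(0.05, 25e-9)
print(f"{M:.2e}")   # prints 1.33e+06 -- a few million, as stated above
```

Because the magnification is set by the tip radius rather than by a wavelength, sharpening the tip (smaller r) directly increases M, which is why FIM reaches atomic resolution without any lenses.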
The very high spatial resolution and high contrast for features on the atomic scale arises from the fact that the electric field is enhanced in the vicinity of the surface atoms because of the higher local curvature. The resolution of FIM is limited by the thermal velocity of the imaging ion. Resolution of the order of 1Å (atomic resolution) can be achieved by effective cooling of the tip. Application of FIM, like FEM, is limited by the materials which can be fabricated in the shape of a sharp tip, can be used in an ultra high vacuum (UHV) environment, and can tolerate the high electrostatic fields. For these reasons,
units left fit to fight, the infantry positions would be defeated in detail. The Mersa defence plan also included an armoured reserve but in its absence Ritchie believed he could organise his infantry to cover the minefields between the defended localities to prevent Axis engineers from having undisturbed access. To defend the Matruh line, Ritchie placed 10th Indian Infantry Division (in Matruh itself) and 50th (Northumbrian) Infantry Division (some way down the coast at Gerawla) under X Corps HQ, newly arrived from Syria. Inland from X Corps would be XIII Corps with 5th Indian Infantry Division (with only one infantry brigade, 29th Indian, and two artillery regiments) around Sidi Hamza, inland; the newly arrived 2nd New Zealand Division (short one brigade, the 6th, which had been left out of combat in case the division was captured and it would be needed to serve as the nucleus of a new division) at Minqar Qaim, on the escarpment inland; and 1st Armoured Division in the open desert to the south. The 1st Armoured Division had taken over 4th and 22nd Armoured Brigades from 7th Armoured Division, which by this time had only three tank regiments (battalions) between them. On 25 June, General Claude Auchinleck—Commander-in-Chief (C-in-C) Middle East Command—relieved Ritchie and assumed direct command of the Eighth Army himself. He decided not to seek a decisive confrontation at the Mersa Matruh position. He concluded that his inferiority in armour after the Gazala defeat meant he would be unable to prevent Rommel either breaking through his centre or enveloping his open left flank to the south in the same way he had at Gazala. He decided instead to employ delaying tactics while withdrawing further east to a more defensible position near El Alamein on the Mediterranean coast.
To the south of El Alamein, the steep slopes of the Qattara Depression ruled out the possibility of Axis armour moving around the southern flank of his defences and limited the width of the front he had to defend. Battle of Mersa Matruh While preparing the Alamein positions, Auchinleck fought strong delaying actions, first at Mersa Matruh on 26–27 June and then at Fuka on 28 June. The late change of orders resulted in some confusion in the forward formations (X Corps and XIII Corps) between the desire to inflict damage on the enemy and the intention not to get trapped in the Matruh position but to retreat in good order. The result was poor co-ordination between the two forward corps and the units within them. Late on 26 June, the German 90th Light and 21st Panzer Divisions managed to find their way through the minefields in the centre of the front. Early on 27 June, resuming its advance, the 90th Light was checked by the British 50th Division's artillery. Meanwhile, the 15th and 21st Panzer Divisions advanced east above and below the escarpment. The 15th Panzer was blocked by 4th Armoured and 7th Motor Brigades, but the 21st Panzer was ordered on to attack Minqar Qaim. Rommel ordered 90th Light to resume its advance, requiring it to cut the coast road behind 50th Division by the evening. As the 21st Panzer moved on Minqar Qaim, the 2nd New Zealand Division found itself surrounded but broke out on the night of 27/28 June without serious losses and withdrew east. Auchinleck had planned a second delaying position at Fuka, east of Matruh, and at 21:20 he issued the orders for a withdrawal there. Confusion in communication led to the division withdrawing immediately to the El Alamein position. X Corps, having made an unsuccessful attempt to secure a position on the escarpment, was out of touch with Eighth Army from 19:30 until 04:30 the next morning. Only then did it discover that the withdrawal order had been given.
The withdrawal of XIII Corps had left the southern flank of X Corps on the coast at Matruh exposed and their line of retreat compromised by the cutting of the coastal road east of Matruh. They were ordered to break out southwards into the desert and then make their way east. Auchinleck ordered XIII Corps to provide support but they were in no position to do so. At 21:00 on 28 June, X Corps—organised into brigade groups—headed south. In the darkness, there was considerable confusion as they came across enemy units laagered for the night. In the process, 5th Indian Division in particular sustained heavy casualties, including the destruction of the 29th Indian Infantry Brigade at Fuka. Axis forces captured more than 6,000 prisoners, in addition to 40 tanks and an enormous quantity of supplies. Prelude Defences at El Alamein Alamein itself was an inconsequential railway station on the coast. Some to the south lay the Ruweisat Ridge, a low stony prominence that gave excellent observation for many miles over the surrounding desert; to the south was the Qattara Depression. The line the British chose to defend stretched between the sea and the Depression, which meant that Rommel could outflank it only by taking a significant detour to the south and crossing the Sahara Desert. The British Army in Egypt recognised this before the war and had the Eighth Army begin construction of several "boxes" (localities with dug-outs, surrounded by minefields and barbed wire), the most developed being around the railway station at Alamein. Most of the "line" was open, empty desert. Lieutenant-General William Norrie (General Officer Commanding [GOC] XXX Corps) organised the position and started to construct three defended "boxes". The first and strongest, at El Alamein on the coast, had been partly wired and mined by 1st South African Division. 
The Bab el Qattara box—some from the coast and south-west of the Ruweisat Ridge—had been dug but had not been wired or mined, while at the Naq Abu Dweis box (on the edge of the Qattara Depression), from the coast, very little work had been done. The British position in Egypt was desperate; the rout from Mersa Matruh had created a panic in the British headquarters at Cairo, something later called "the Flap". On what came to be referred to as "Ash Wednesday", at British headquarters, rear echelon units and the British Embassy, papers were hurriedly burned in anticipation of the fall of the city. Auchinleck—although believing he could stop Rommel at Alamein—felt he could not ignore the possibility that he might once more be outmanoeuvred or outfought. To maintain his army, plans had to be made for the possibility of a further retreat whilst maintaining morale and retaining the support and co-operation of the Egyptians. Defensive positions were constructed west of Alexandria and on the approaches to Cairo while considerable areas in the Nile delta were flooded. The Axis, too, believed that the capture of Egypt was imminent; Italian leader Benito Mussolini—sensing a historic moment—flew to Libya to prepare for his triumphal entry into Cairo. The scattering of X Corps at Mersa Matruh disrupted Auchinleck's plan for occupying the Alamein defences. On 29 June, he ordered XXX Corps—the 1st South African, 5th and 10th Indian divisions—to take the coastal sector on the right of the front and XIII Corps—the 2nd New Zealand and 4th Indian divisions—to be on the left. The remains of the 1st Armoured Division and the 7th Armoured Division were to be held as a mobile army reserve. His intention was for the fixed defensive positions to channel and disorganise the enemy's advance while mobile units would attack their flanks and rear. On 30 June, Rommel's Panzerarmee Afrika approached the Alamein position. The Axis forces were exhausted and understrength. 
Rommel had driven them forward ruthlessly, confident that, provided he struck quickly before Eighth Army had time to settle, his momentum would take him through the Alamein position and he could then advance to the Nile with little further opposition. Supplies remained a problem because the Axis staff had originally expected a pause of six weeks after the capture of Tobruk. German air units were also exhausted and providing little help against the RAF's all-out attack on the Axis supply lines which, with the arrival of United States Army Air Forces (USAAF) heavy bombers, could reach as far as Benghazi. Although captured supplies proved useful, water and ammunition were constantly in short supply, while a shortage of transport impeded the distribution of the supplies that the Axis forces did have. Axis plan of attack Rommel's plan was for the 90th Light Division and the 15th and 21st Panzer divisions of the Afrika Korps to penetrate the Eighth Army lines between the Alamein box and Deir el Abyad (which he believed was defended). The 90th Light Division was then to veer north to cut the coastal road and trap the defenders of the Alamein box (which Rommel thought was occupied by the remains of the 50th Infantry Division) while the Afrika Korps would veer right to attack the rear of XIII Corps. An Italian division was to attack the Alamein box from the west and another was to follow the 90th Light Division. The Italian XX Corps was to follow the Afrika Korps and deal with the Qattara box while the 133rd Armoured Division "Littorio" and German reconnaissance units would protect the right flank. Rommel had planned to attack on 30 June but supply and transport difficulties had resulted in a day's delay, which proved vital to the defending forces reorganising on the Alamein line. 
On 30 June, the 90th Light Division was still short of its start line, 21st Panzer Division was immobilised through lack of fuel and the promised air support had yet to move into its advanced airfields. Battle Panzer Army Africa attacks At 03:00 on 1 July, 90th Light Infantry Division advanced east but strayed too far north, ran into the 1st South African Division's defences and became pinned down. The 15th and 21st Panzer Divisions of the Afrika Korps were delayed by a sandstorm and then a heavy air attack. It was broad daylight by the time they circled round the back of Deir el Abyad, where they found the feature to its east, Deir el Shein, occupied by 18th Indian Infantry Brigade which, after a hasty journey from Iraq, had taken up the exposed position just west of Ruweisat Ridge late on 28 June to create one of Norrie's additional defensive boxes. At about 10:00 on 1 July, 21st Panzer Division attacked Deir el Shein. 18th Indian Infantry Brigade—supported by 23 25-pounder gun-howitzers, 16 of the new 6-pounder anti-tank guns and nine Matilda tanks—held out the whole day in desperate fighting but by evening the Germans succeeded in overrunning them. The time they bought allowed Auchinleck to organise the defence of the western end of Ruweisat Ridge. The 1st Armoured Division had been sent to intervene at Deir el Shein; it ran into 15th Panzer Division just south of Deir el Shein and drove it west. By the end of the day's fighting, the Afrika Korps had 37 tanks left out of its initial complement of 55. During the early afternoon, 90th Light had extricated itself from the El Alamein box defences and resumed its move eastward. It came under artillery fire from the three South African brigade groups and was forced to dig in. On 2 July, Rommel ordered the resumption of the offensive. 
Once again, 90th Light failed to make progress, so Rommel ordered the Afrika Korps to abandon its planned sweep southward and instead join the effort to break through to the coast road by attacking east toward Ruweisat Ridge. The British defence of Ruweisat Ridge relied on an improvised formation called "Robcol", comprising a regiment each of field artillery and light anti-aircraft artillery and a company of infantry. Robcol—in line with normal British Army practice for ad hoc formations—was named after its commander, Brigadier Robert Waller, the Commander Royal Artillery of the 10th Indian Infantry Division. Robcol was able to buy time, and by late afternoon the two British armoured brigades had joined the battle, 4th Armoured Brigade engaging 15th Panzer and 22nd Armoured Brigade engaging 21st Panzer. They drove back repeated attacks by the Axis armour, which then withdrew before dusk. The British reinforced Ruweisat on the night of 2 July, and the now enlarged Robcol became "Walgroup". Meanwhile, the Royal Air Force (RAF) made heavy air attacks on the Axis units. The next day, 3 July, Rommel ordered the Afrika Korps to resume its attack on the Ruweisat ridge with the Italian XX Motorised Corps on its southern flank. Italian X Corps, meanwhile, was to hold El Mreir. By this stage the Afrika Korps had only 26 operational tanks. There was a sharp armoured exchange south of Ruweisat ridge during the morning and the main Axis advance was held. On 3 July, the RAF flew 780 sorties. To relieve the pressure on the right and centre of the Eighth Army line, XIII Corps on the left advanced from the Qattara box (known to the New Zealanders as the Kaponga box). The plan was that the New Zealand 2nd Division—with the remains of Indian 5th Division and 7th Motor Brigade under its command—would swing north to threaten the Axis flank and rear. 
This force encountered the 132nd Armoured Division "Ariete"'s artillery, which was driving on the southern flank of the division as it attacked Ruweisat. The Italian commander ordered his battalions to fight their way out independently but the Ariete lost 531 men (about 350 of them prisoners), 36 pieces of artillery, six or eight tanks, and 55 trucks. By the end of the day, the Ariete Division had only five tanks. The day ended once again with the Afrika Korps and Ariete coming off second best against the superior numbers of the British 22nd Armoured and 4th Armoured Brigades, frustrating Rommel's attempts to resume his advance. The RAF once again played its part, flying 900 sorties during the day. To the south, on 5 July the New Zealand group resumed its advance northwards towards El Mreir, intending to cut the rear of the Ariete Division. Heavy fire from the Italian 27th Infantry Division "Brescia" at El Mreir, north of the Qattara box, however, checked their progress and led XIII Corps to call off its attack. Rommel digs in At this point, Rommel decided his exhausted forces could make no further headway without resting and regrouping. He reported to the German High Command that his three German divisions numbered just 1,200–1,500 men each and that resupply was proving highly problematic because of enemy interference from the air. He expected to have to remain on the defensive for at least two weeks. Rommel was by this time suffering from the extended length of his supply lines. The Allied Desert Air Force (DAF) was concentrating fiercely on his fragile and elongated supply routes while British mobile columns moving west and striking from the south were causing havoc in the Axis rear echelons. Rommel could afford these losses even less since shipments from Italy had been substantially reduced (in June, he received of supplies compared with in May, and 400 vehicles compared with 2,000 in May). 
Meanwhile, the Eighth Army was reorganising and rebuilding, benefiting from its short lines of communication. By 4 July, the Australian 9th Division had entered the line in the north, and on 9 July the Indian 5th Infantry Brigade also returned, taking over the Ruweisat position. At the same time, the fresh Indian 161st Infantry Brigade reinforced the depleted Indian 5th Infantry Division. Tel el Eisa On 8 July, Auchinleck ordered the new XXX Corps commander—Lieutenant-General William Ramsden—to capture the low ridges at Tel el Eisa and Tel el Makh Khad and then to push mobile battle groups south toward Deir el Shein and raiding parties west toward the airfields at El Daba. Meanwhile, XIII Corps would prevent the Axis from moving troops north to reinforce the coastal sector. Ramsden tasked the Australian 9th Division, with 44th Royal Tank Regiment under command, with the Tel el Eisa objective, and the South African 1st Division, with eight supporting tanks, with Tel el Makh Khad. The raiding parties were to be provided by 1st Armoured Division. Following a bombardment which started at 03:30 on 10 July, the Australian 26th Brigade launched an attack against the ridge north of Tel el Eisa station along the coast (Trig 33). The bombardment was the heaviest barrage yet experienced in North Africa and created panic among the inexperienced soldiers of the Italian 60th Infantry Division "Sabratha", who had only just occupied sketchy defences in the sector. The Australian attack took more than 1,500 prisoners, routed an Italian division and overran the German Signals Intercept Company 621. Meanwhile, the South Africans had by late morning taken Tel el Makh Khad and were in covering positions. Elements of the German 164th Light Division and Italian 101st Motorised Division "Trieste" arrived to plug the gap torn in the Axis defences. 
That afternoon and evening, tanks from the German 15th Panzer and Italian Trieste Divisions launched counter-attacks against the Australian positions, which failed in the face of overwhelming Allied artillery and the Australian anti-tank guns. At first light on 11 July, the Australian 2/24th Battalion, supported by tanks from 44th Royal Tank Regiment, attacked the western end of Tel el Eisa hill (Point 24). By early afternoon, the feature was captured and was then held against a series of Axis counter-attacks throughout the day. A small column of armour, motorised infantry, and guns then set off to raid Deir el Abyad and caused a battalion
try to mediate, largely out of the fear that the Italians might actually lose. The British consul in Zanzibar, Gerald Portal, was sent in 1887 to mediate between the Ethiopians and Italians before war broke out. Upon meeting the Emperor Yohannes on 4 December 1887, he presented him with gifts and a letter from Queen Victoria urging him to settle with the Italians. Portal reported: "What might have been possible in August or September was impossible in December, when the whole of the immense available forces in the country were already under arms; and that there now remains no hope of a satisfactory adjustment of the difficulties between Italy and Abyssinia [Ethiopia] until the question of the relative supremacy of these two nations has been decided by an appeal to the fortunes of war... No one who has once seen the nature of the gorges, ravines and mountain passes near the Abyssinian frontier can doubt for a moment that any advance by a civilised army in the face of the hostile Abyssinian hordes would be accomplished at the price of a fearful loss of life on both sides. ... The Abyssinians are savage and untrustworthy, but they are also redeemed by the possession of an unbounded courage, by a disregard of death, and by a national pride which leads them to look down on every human being who has not had the good fortune to be born an Abyssinian". Portal ended by writing that the Italians were making a mistake in preparing to go to war against Ethiopia: "It is the old, old story, contempt of a gallant enemy because his skin happens to be chocolate or brown or black, and because his men have not gone through orthodox courses of field-firing, battalion drill, or 'autumn maneuvers'". The defeat at Dogali made the Italians cautious for a moment, but on 10 March 1889, Emperor Yohannes died after being wounded in battle against the Ansar and on his deathbed admitted that Ras Mengesha, the supposed son of his brother, was actually his own son and asked that he succeed him. 
The revelation that the emperor had slept with his brother's wife scandalised the intensely Orthodox Ethiopians, and instead the Negus Menelik was proclaimed emperor on 26 March 1889. Ras Mengesha, one of the most powerful Ethiopian noblemen, was unhappy about being by-passed in the succession and for a time allied himself with the Italians against the Emperor Menelik. Under the feudal Ethiopian system, there was no standing army; instead, the nobility raised armies on behalf of the Emperor. In December 1889, the Italians advanced inland again and took the cities of Asmara and Keren, and in January 1890 took Adowa. Treaty of Wuchale On 25 March 1889, the Shewa ruler Menelik II, having conquered Tigray and Amhara, declared himself Emperor of Ethiopia (or "Abyssinia", as it was commonly called in Europe at the time). Barely a month later, on 2 May, he signed the Treaty of Wuchale with the Italians, which apparently gave them control over Eritrea, the Red Sea coast to the northeast of Ethiopia, in return for recognition of Menelik's rule. Menelik II continued the policy of Tewodros II of integrating Ethiopia. However, the bilingual treaty did not say the same thing in Italian and Amharic; the Italian version did not give the Ethiopians the "significant autonomy" written into the Amharic translation. The Italian text stated that Ethiopia must conduct its foreign affairs through Italy (making it an Italian protectorate), but the Amharic version merely stated that Ethiopia could contact foreign powers and conduct foreign affairs using the embassy of Italy. Italian diplomats, however, claimed that the original Amharic text included the clause and that Menelik had knowingly signed a modified copy of the Treaty. In October 1889, the Italians informed all of the other European governments that, because of the Treaty of Wuchale, Ethiopia was now an Italian protectorate and that the other European nations therefore could not conduct diplomatic relations with Ethiopia. 
With the exceptions of the Ottoman Empire, which still maintained its claim to Eritrea, and Russia, which disliked the idea of an Orthodox nation being subjugated to a Roman Catholic nation, all of the European powers accepted the Italian claim to a protectorate. The Italian claim that Menelik was aware of Article XVII turning his nation into an Italian protectorate seems unlikely given that the Emperor Menelik sent letters to Queen Victoria and Emperor Wilhelm II in late 1889 and was informed in the replies in early 1890 that neither Britain nor Germany could have diplomatic relations with Ethiopia on account of Article XVII of the Treaty of Wuchale, a revelation that came as a great shock to the Emperor. Victoria's letter was polite whereas Wilhelm's was rather rude, saying that King Umberto I was a great friend of Germany and Menelik's violation of the supposed Italian protectorate was a grave insult to Umberto, adding that he never wanted to hear from Menelik again. Moreover, Menelik did not know Italian and only signed the Amharic text of the treaty, having been assured that there were no differences between the Italian and Amharic texts before he signed. The differences between the Italian and Amharic texts were due to the Italian minister in Addis Ababa, Count Pietro Antonelli, who had been instructed by his government to gain as much territory as possible in negotiating with the Emperor Menelik. However, knowing that Menelik was now enthroned as the King of Kings and held a strong position, Antonelli was in the unenviable situation of negotiating a treaty that his own government might disallow. Therefore, he inserted the statement surrendering Ethiopia's right to conduct its foreign affairs to Italy as a way of pleasing his superiors, who might otherwise have dismissed him for making only small territorial gains. 
Antonelli was fluent in Amharic and, given that Menelik only signed the Amharic text, he could not have been unaware that the Amharic version of Article XVII merely stated that the King of Italy placed the services of his diplomats at the disposal of the Emperor of Ethiopia to represent him abroad if he so wished. When his subterfuge was exposed in 1890, with Menelik indignantly saying he would never sign away his country's independence to anybody, Antonelli, who left Addis Ababa in mid-1890, resorted to racism, telling his superiors in Rome that as Menelik was a black man he was intrinsically dishonest, and that it was only natural that the Emperor would lie about the protectorate into which he had supposedly willingly turned his nation. Francesco Crispi, the Italian Prime Minister, was an ultra-imperialist who believed the newly unified Italian state required "the grandeur of a second Roman empire". Crispi believed that the Horn of Africa was the best place for the Italians to start building the new Roman empire. The American journalist James Perry wrote that "Crispi was a fool, a bigot and a very dangerous man". Because of the Ethiopian refusal to abide by the Italian version of the treaty, and despite economic handicaps at home, the Italian government decided on a military solution to force Ethiopia to comply. In doing so, they believed that they could exploit divisions within Ethiopia and rely on tactical and technological superiority to offset any inferiority in numbers. The efforts of Emperor Menelik, viewed as pro-French by London, to unify Ethiopia and thus bring the source of the Blue Nile under his control were perceived in Whitehall as a threat to British influence in Egypt. As Menelik became increasingly successful in unifying Ethiopia, the British government courted the Italians to counter Ethiopian expansion. 
There was a broader, European background as well: the Triple Alliance of Germany, Austria-Hungary, and Italy was under some stress, with Italy being courted by the British government. Two secret Anglo-Italian protocols were signed in 1891, leaving most of Ethiopia in Italy's sphere of influence. France, one of the members of the opposing
Eritrea to replace the Egyptians, London decided to have the Italians move into Eritrea. In his history of Ethiopia, British historian Augustus Wylde wrote: "England made use of King John [Emperor Yohannes] as long as he was of any service and then threw him over to the tender mercies of Italy...It is one of our worst bits of business out of the many we have been guilty of in Africa...one of the vilest bits of treachery". After the French had unexpectedly made Tunis into their protectorate in 1881, outraging opinion in Italy over the so-called "Schiaffo di Tunisi" (the "slap of Tunis"), Italian foreign policy had been extremely anti-French, and from the British viewpoint the best way of ensuring that the Eritrean ports on the Red Sea stayed out of French hands was to allow the staunchly anti-French Italians to move in. In 1882, Italy had joined the Triple Alliance, allying herself with Austria and Germany against France. On 5 February 1885 Italian troops landed at Massawa to replace the Egyptians. The Italian government for its part was more than happy to embark upon an imperialist policy to distract its people from the failings of post-Risorgimento Italy. In 1861, the unification of Italy was supposed to mark the beginning of a glorious new era in Italian life, and many Italians were gravely disappointed to find that not much had changed in the new Kingdom of Italy, with the vast majority of Italians still living in abject poverty. To compensate, a chauvinist mood was rampant among the upper classes, with the newspaper Il Diritto writing in an editorial: "Italy must be ready. The year 1885 will decide her fate as a great power. It is necessary to feel the responsibility of the new era; to become again strong men afraid of nothing, with the sacred love of the fatherland, of all Italy, in our hearts". 
On the Ethiopian side, the wars that Emperor Yohannes had waged first against the invading Egyptians in the 1870s and then more so against the Sudanese Mahdiyya state in the 1880s had been presented by him to his subjects as holy wars in defense of Orthodox Christianity against Islam, reinforcing the Ethiopian belief that their country was an especially virtuous and holy land. The struggle against the Ansar from Sudan complicated Yohannes's relations with the Italians, whom he sometimes asked for guns to fight the Ansar, while at other times he resisted the Italians and proposed a truce with the Ansar. On 18 January 1887, at a village named Saati, an advancing Italian Army detachment defeated the Ethiopians in a skirmish, but it ended with the numerically superior Ethiopians surrounding the Italians in Saati after the Italians retreated in the face of the enemy's numbers. Some 500 Italian soldiers under Colonel de Christoforis together with 50 Eritrean auxiliaries were sent to support the besieged garrison at Saati. At Dogali on his way to Saati, de Christoforis was ambushed by an Ethiopian force under Ras Alula, whose men armed with spears skillfully encircled the Italians, who retreated to one hill and then to another higher hill. After the Italians ran out of ammunition, Ras Alula ordered his men to charge and the Ethiopians swiftly overwhelmed the Italians in an action that featured bayonets against spears. The Battle of Dogali ended with the Italians losing 23 officers and 407 other ranks killed. As a result of the defeat at Dogali, the Italians abandoned Saati and retreated back to the Red Sea coast. Italian newspapers called the battle a "massacre" and excoriated the Regio Esercito for not issuing de Christoforis enough ammunition. 
Having, at first, encouraged Emperor Yohannes to move into Eritrea, and then having encouraged the Italians to also do so, London realised a war was brewing and decided to try to mediate, largely out of the fear that the Italians might actually lose. The British consul in Zanzibar, Gerald Portal, was sent in 1887 to mediate between the Ethiopians and Italians before war broke out. Upon meeting the Emperor Yohannes on 4 December 1887, he presented him with gifts and a letter from Queen Victoria urging him to settle with the Italians. Portal reported: "What might have been possible in August or September was impossible in December, when the whole of the immense available forces in the country were already under arms; and that there now remains no hope of a satisfactory adjustment of the difficulties between Italy and Abyssinia [Ethiopia] until the question of the relative supremacy of these two nations has been decided by an appeal to the fortunes of war... No one who has once seen the nature of the gorges, ravines and mountain passes near the Abyssinian frontier can doubt for a moment that any advance by a civilised army in the face of the hostile Abyssinian hordes would be accomplished at the price of a fearful loss of life on both sides. ... The Abyssinians are savage and untrustworthy, but they are also redeemed by the possession of an unbounded courage, by a disregard of death, and by a national pride which leads them to look down on every human being who has not had the good fortune to be born an Abyssinian". Portal ended by writing that the Italians were making a mistake in preparing to go to war against Ethiopia: "It is the old, old story, contempt of a gallant enemy because his skin happens to be chocolate or brown or black, and because his men have not gone through orthodox courses of field-firing, battalion drill, or 'autumn maneuvers'". 
The defeat at Dogali made the Italians cautious for a moment, but on 10 March 1889, Emperor Yohannes died after being wounded in battle against the Ansar; on his deathbed he admitted that Ras Mengesha, the supposed son of his brother, was actually his own son and asked that he succeed him. The revelation that the emperor had slept with his brother's wife scandalised the intensely Orthodox Ethiopians, and instead the Negus Menelik was proclaimed emperor on 26 March 1889. Ras Mengesha, one of the most powerful Ethiopian noblemen, was unhappy about being bypassed in the succession and for a time allied himself with the Italians against the Emperor Menelik. Under the feudal Ethiopian system, there was no standing army; instead, the nobility raised armies on behalf of the Emperor. In December 1889, the Italians advanced inland again and took the cities of Asmara and Keren, and in January 1890 took Adowa. Treaty of Wuchale On 25 March 1889, the Shewa ruler Menelik II, having conquered Tigray and Amhara, declared himself Emperor of Ethiopia (or "Abyssinia", as it was commonly called in Europe at the time). Barely a month later, on 2 May, he signed the Treaty of Wuchale with the Italians, which apparently gave them control over Eritrea, the Red Sea coast to the northeast of Ethiopia, in return for recognition of Menelik's rule. Menelik II continued the policy of Tewodros II of integrating Ethiopia. However, the bilingual treaty did not say the same thing in Italian and Amharic; the Italian version did not give the Ethiopians the "significant autonomy" written into the Amharic translation. The Italian text stated that Ethiopia must conduct its foreign affairs through Italy (making it an Italian protectorate), but the Amharic version merely stated that Ethiopia could contact foreign powers and conduct foreign affairs using the embassy of Italy. 
Italian diplomats, however, claimed that the original Amharic text included the clause and that Menelik knowingly signed a modified copy of the Treaty. In October 1889, the Italians informed all of the other European governments that, under the Treaty of Wuchale, Ethiopia was now an Italian protectorate and that they therefore could not conduct diplomatic relations with Ethiopia. With the exceptions of the Ottoman Empire, which still maintained its claim to Eritrea, and Russia, which disliked the idea of an Orthodox nation being subjugated to a Roman Catholic nation, all of the European powers accepted the Italian claim to a protectorate. The Italian claim that Menelik was aware of Article XVII turning his nation into an Italian protectorate seems unlikely given that the Emperor Menelik sent letters to Queen Victoria and Emperor Wilhelm II in late 1889 and was informed in the replies in early 1890 that neither Britain nor Germany could have diplomatic relations with Ethiopia on account of Article XVII of the Treaty of Wuchale, a revelation that came as a great shock to the Emperor. Victoria's letter was polite, whereas Wilhelm's letter was somewhat ruder, saying that King Umberto I was a great friend of Germany and Menelik's violation of the supposed Italian protectorate was a grave insult to Umberto, adding that he never wanted to hear from Menelik again. Moreover, Menelik did not know Italian and only signed the Amharic text of the treaty, being assured that there were no differences between the Italian and Amharic texts before he signed. The differences between the Italian and Amharic texts were due to the Italian minister in Addis Ababa, Count Pietro Antonelli, who had been instructed by his government to gain as much territory as possible in negotiating with the Emperor Menelik. 
However, knowing Menelik was now enthroned as the King of Kings and had a strong position, Antonelli was in the unenviable situation of negotiating a treaty that his own government might disallow. Therefore, he inserted the statement making Ethiopia give up its right to conduct its foreign affairs to Italy as a way of pleasing his superiors, who might otherwise have fired him for making only small territorial gains. Antonelli was fluent in Amharic, and given that Menelik only signed the Amharic text, he could not have been unaware that the Amharic version of Article XVII only stated that the King of Italy placed the services of his diplomats at the disposal of the Emperor of Ethiopia to represent him abroad if he so wished. When his subterfuge was exposed in 1890, with Menelik indignantly saying he would never sign away his country's independence to anybody, Antonelli, who left Addis Ababa in mid-1890, resorted to racism, telling his superiors in Rome that as Menelik was a black man, he was intrinsically dishonest and it was only natural that the Emperor would lie about the protectorate he had supposedly willingly turned his nation into. Francesco Crispi, the Italian Prime Minister, was an ultra-imperialist who believed the newly unified Italian state required "the grandeur of a second Roman empire". Crispi believed that the Horn of Africa was the best place for the Italians to start building the new Roman empire. The American journalist James Perry wrote that "Crispi was a fool, a bigot and a very dangerous man". Because of the Ethiopian refusal to abide by the Italian version of the treaty, and despite economic handicaps at home, the Italian government decided on a military solution to force Ethiopia to comply. In doing so, they believed that they could exploit divisions within Ethiopia and rely on tactical and technological superiority to offset any inferiority in numbers. 
The efforts of Emperor Menelik, viewed as pro-French by London, to unify Ethiopia and thus bring the source of the Blue Nile under his control were perceived in Whitehall as a threat to British influence in Egypt. As Menelik became increasingly successful in unifying Ethiopia, the British government courted the Italians to counter Ethiopian expansion. There was a broader, European background as well: the Triple Alliance of Germany, Austria-Hungary, and Italy was under some stress, with Italy being courted by the British government. Two secret Anglo-Italian protocols were signed in 1891, leaving most of Ethiopia in Italy's sphere of influence. France, one of the members of the opposing Franco-Russian Alliance, had its own claims on Eritrea and was bargaining with Italy over giving up those claims in exchange for a more secure position in Tunisia. Meanwhile, Russia was supplying weapons and other aid to Ethiopia. It had been trying to gain a foothold in Ethiopia, and in 1894, after denouncing the Treaty of Wuchale in July, it received an Ethiopian mission in St. Petersburg and sent arms and ammunition to Ethiopia. This support continued after the war ended. The Russian travel writer Alexander Bulatovich, who went to Ethiopia to serve as a Red Cross volunteer with the Emperor Menelik, made a point of emphasizing in his books that the Ethiopians converted to Christianity before any of the Europeans did, described the Ethiopians as a deeply religious people like the Russians, and argued that the Ethiopians did not have the "low cultural level" of the other African peoples, making them equal to the Europeans. Germany and Austria supported their Triple Alliance ally Italy, while France and Russia supported Ethiopia. 
Prelude & Beginning of Conflict In 1893, judging that his power over Ethiopia was secure, Menelik repudiated the treaty; in response the Italians ramped up the pressure on his domain in a variety of ways, including the annexation of small territories bordering their original claim under the Treaty
and deficits as macroeconomic policy tools that could counter cyclical trends, and establish bureaus of economic statistics (including a consumer price index) in order to facilitate this effort" – are now conventional practice, his critique of fractional-reserve banking still "remains outside the bounds of conventional wisdom", although a recent paper by the IMF reinvigorated his proposals. Soddy wrote that financial debts grew exponentially at compound interest, but the real economy was based on exhaustible stocks of fossil fuels; energy obtained from the fossil fuels could not be used again. This criticism of economic growth is echoed by his intellectual heirs in the now emergent field of ecological economics. The New Palgrave Dictionary of Economics, an influential reference text in economics, recognized Soddy as a "reformer" for his works on monetary reform. Political views In Wealth, Virtual Wealth and Debt, Soddy cited the Protocols of the Learned Elders of Zion as evidence for the belief, relatively widespread at the time, in a "financial conspiracy to enslave the world". The Protocols was widely disseminated by Henry Ford in the United States. He claimed that "A corrupt monetary system strikes at the very life of the nation." Later in life he published a pamphlet, Abolish Private Money, or Drown in Debt (1939). The influence of his writing can be gauged, for example, in this quote from Ezra Pound: "Professor Frederick Soddy states that the Gold Standard monetary system has wrecked a scientific age! ... The world's bankers ... have not been content to take their share of modern wealth production – great as it has been – but they have refused to allow the masses of mankind to receive theirs." Though some activists have accused Soddy of anti-Semitism, most of his biographers regard the charge as insubstantial, noting that among Soddy's friends and students were Jews who held positive views of him. 
These friends included Kazimierz Fajans, a Polish-Jewish physicist who worked with both Ernest Rutherford and Soddy. Descartes' theorem He rediscovered Descartes' theorem in 1936 and published it as a poem, "The Kiss Precise", quoted at Problem of Apollonius. The kissing circles in this problem are sometimes known as Soddy circles. Honours and awards He received the Nobel Prize in Chemistry in 1921, and the same year he was elected a member of the International Atomic Weights Committee. A small crater on the far side of the Moon as well as the radioactive uranium mineral soddyite are named after him. Personal life In 1908, Soddy married Winifred Moller Beilby (1885–1936), the daughter of industrial chemist Sir George Beilby and Lady Emma Beilby, a philanthropist to women's causes. The couple worked together and co-published a paper in 1910 on the absorption of gamma rays from radium. He died in Brighton, England in 1956, twenty days after his 79th birthday. Bibliography Radioactivity (1904) The Interpretation of Radium (1909) Matter and Energy (1911), second edition (2015) The Chemistry of the Radio-elements (1915) Science and life: Aberdeen addresses (1920) Cartesian Economics: The Bearing of Physical Science upon State Stewardship (1921) Science and Life Wealth, Virtual Wealth, and Debt Money versus Man etc (1921) Nobel Lecture – The origins of the conception of isotopes (1922) Wealth, Virtual Wealth and Debt. The solution of the economic paradox (George Allen & Unwin, 1926) The wrecking of
a scientific age (1927) The Interpretation of the Atom
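Descartes' circle theorem, which Soddy rediscovered and versified as "The Kiss Precise", states that if four mutually tangent circles have curvatures ("bends") k1, k2, k3, k4 (the reciprocal of the radius, taken negative for a circle that encloses the others), then (k1 + k2 + k3 + k4)² = 2(k1² + k2² + k3² + k4²). Solving for the fourth bend gives the two Soddy circles tangent to any three. A minimal Python sketch (the helper name `soddy_bends` is illustrative, not from any library):

```python
import math

def soddy_bends(k1, k2, k3):
    """Return the two possible curvatures of a fourth circle tangent to
    three mutually tangent circles with curvatures k1, k2, k3, using
    Descartes' circle theorem:
        (k1 + k2 + k3 + k4)^2 = 2*(k1^2 + k2^2 + k3^2 + k4^2)
    which solves to
        k4 = k1 + k2 + k3 +/- 2*sqrt(k1*k2 + k2*k3 + k3*k1).
    """
    s = k1 + k2 + k3
    root = 2 * math.sqrt(k1 * k2 + k2 * k3 + k3 * k1)
    return s + root, s - root

# Three circles of curvature 2, 2, 3 packed together:
inner, outer = soddy_bends(2, 2, 3)
print(inner)   # 15.0 -> the small circle nestled in the gap
print(outer)   # -1.0 -> negative bend: the enclosing circle, radius 1
```

The positive root is the small circle in the gap between the three; the negative root is the large circle enclosing them, the sign convention Soddy's poem describes as "a bend... reckoned as minus".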
head, sharp, well-developed canines, sharp eyesight, and keen hearing. They are extremely sexually dimorphic mammals, with the males often two to five times the size of the females, with proportionally larger heads, necks, and chests. Size ranges from about 1.5 m, 64 kg in the male Galapagos fur seal (also the smallest pinniped) to 2.5 m, 180 kg in the adult male New Zealand fur seal. Most fur seal pups are born with a black-brown coat that molts at 2–3 months, revealing a brown coat that typically gets darker with age. Some males and females within the same species have significant differences in appearance, further contributing to the sexual dimorphism. Females and juveniles often have a lighter colored coat overall or only on the chest, as seen in South American fur seals. In a northern fur seal population, the females are typically silvery-gray on the dorsal side and reddish-brown on their ventral side with a light gray patch on their chest. This makes them easily distinguished from the males with their brownish-gray to reddish-brown or black coats. Habitat Of the fur seal family, eight species are considered southern fur seals, and only one is found in the Northern Hemisphere. The southern group includes Antarctic, Galapagos, Guadalupe, Juan Fernandez, New Zealand, brown, South American, and subantarctic fur seals. They typically spend about 70% of their lives in subpolar, temperate, and equatorial waters. Colonies of fur seals can be seen throughout the Pacific and Southern Oceans from south Australia, Africa, and New Zealand, to the coast of Peru and north to California. They are typically nonmigrating mammals, with the exception of the northern fur seal, which has been known to travel distances up to 10,000 km. Fur seals are often found near isolated islands or peninsulas, and can be seen hauling out onto the mainland during winter. 
Although they are not migratory, they have been observed wandering hundreds of miles from their breeding grounds in times of scarce resources. For example, the subantarctic fur seal typically resides near temperate islands in the South Atlantic and Indian Oceans north of the Antarctic Polar Front, but juvenile males have been seen wandering as far north as Brazil and South Africa. Behavior and ecology Typically, fur seals gather during the summer in large rookeries at specific beaches or rocky outcrops to give birth and breed. All species are polygynous, meaning dominant males reproduce with more than one female. For most species, total gestation lasts about 11.5 months, including a several-month period of delayed implantation of the embryo. Northern fur seal males aggressively select and defend the specific females in their harems. Females typically reach sexual maturity around 3–4 years. The males reach sexual maturity around the same time, but do not become territorial or mate until 6–10 years. The breeding season typically begins in November and lasts 2–3 months. The northern fur seals begin their breeding season as early as June due to their region, climate, and resources. In all cases, the males arrive a few weeks early to fight for their territory and for groups of females with which to mate. They congregate at rocky, isolated breeding grounds and defend their territory through fighting and vocalization. Males typically do not leave their territory for the entirety of the breeding season, fasting and competing until all energy sources are depleted. The Juan Fernandez fur seals deviate from this typical behavior, using aquatic breeding territories not seen in other fur seals. They use rocky sites for breeding, but males fight for territory on land, on the shoreline, and in the water. Upon arriving at the breeding grounds, females give birth to their pups from the previous season. 
About a week later, the females mate again and shortly after begin their feeding cycle, which typically consists of foraging and feeding at sea for about 5 days, then returning to the breeding grounds to nurse the pups
for about 2 days. 
Mothers and pups locate each other using call recognition during nursing period. The Juan Fernandez fur seal has a particularly long feeding cycle, with about 12 days of foraging and feeding and 5 days of nursing. Most fur seals continue this cycle for about 9 months until they wean their pup. The exception to this is the Antarctic fur seal, which has a feeding cycle that lasts only 4 months. During foraging trips, most female fur seals travel around 200 km from the breeding site, and can dive around 200 m depending on food availability. The remainder of the year, fur seals lead a largely pelagic existence in the open sea, pursuing their prey wherever it is abundant. They
group of West Germanic languages, including: Old Frisian, spoken in Frisia from the 8th to 16th Century Middle Frisian, spoken in Frisia from the 16th to 19th Century North Frisian language, spoken in Schleswig-Holstein, Germany Saterland Frisian language, spoken in Lower Saxony, Germany West Frisian language, spoken in Friesland, Netherlands Frisian or Friesian may also refer to: Animal breeds Friesian (chicken), a Dutch breed of chicken East Friesian (sheep),
a breed of sheep notable for its high production of milk Friesian cross, a cross of the Friesian horse with any other breed Friesian horse, a horse breed from Friesland Friesian Sporthorse, a type of
city of Bloemfontein Fauna (album), a 2008 album by Oh Land Fauna (film), a 2020 Mexican/Canadian drama film Fauna, a fictional character from Disney's Sleeping Beauty, see Flora, Fauna, and Merryweather Fauna, a 2009 Spiel des Jahres-nominated board game Fauna, a female
and stories by Wanda, a shantytown prostitute Fellini met on the set of Il Bidone. Pier Paolo Pasolini was hired to translate Flaiano and Pinelli's dialogue into Roman dialect and to supervise research in the vice-afflicted suburbs of Rome. The movie won the Academy Award for Best Foreign Language Film at the 30th Academy Awards and brought Masina the Best Actress Award at Cannes for her performance. With Pinelli, he developed Journey with Anita for Sophia Loren and Gregory Peck. An "invention born out of intimate truth", the script was based on Fellini's return to Rimini with a mistress to attend his father's funeral. Due to Loren's unavailability, the project was shelved and resurrected twenty-five years later as Lovers and Liars (1981), a comedy directed by Mario Monicelli with Goldie Hawn and Giancarlo Giannini. For Eduardo De Filippo, he co-wrote the script of Fortunella, tailoring the lead role to accommodate Masina's particular sensibility. The Hollywood on the Tiber phenomenon of 1958, in which American studios profited from the cheap studio labour available in Rome, provided the backdrop for photojournalists to steal shots of celebrities on the Via Veneto. The scandal provoked by Turkish dancer Haish Nana's improvised striptease at a nightclub captured Fellini's imagination: he decided to end his latest script-in-progress, Moraldo in the City, with an all-night "orgy" at a seaside villa. Pierluigi Praturlon's photos of Anita Ekberg wading fully dressed in the Trevi Fountain provided further inspiration for Fellini and his scriptwriters. Changing the title of the screenplay to La Dolce Vita, Fellini soon clashed with his producer on casting: the director insisted on the relatively unknown Mastroianni, while De Laurentiis wanted Paul Newman as a hedge on his investment. Reaching an impasse, De Laurentiis sold the rights to publishing mogul Angelo Rizzoli. 
Shooting began on 16 March 1959 with Anita Ekberg climbing the stairs to the cupola of Saint Peter's in a mammoth décor constructed at Cinecittà. The statue of Christ flown by helicopter over Rome to St. Peter's Square was inspired by an actual media event on 1 May 1956, which Fellini had witnessed. The film wrapped 15 August on a deserted beach at Passo Oscuro with a bloated mutant fish designed by Piero Gherardi. La Dolce Vita broke all box office records. Despite scalpers selling tickets at 1000 lire, crowds queued in line for hours to see an "immoral movie" before the censors banned it. At an exclusive Milan screening on 5 February 1960, one outraged patron spat on Fellini while others hurled insults. Denounced in parliament by right-wing conservatives, undersecretary Domenico Magrì of the Christian Democrats demanded tolerance for the film's controversial themes. The Vatican's official press organ, l'Osservatore Romano, lobbied for censorship while the Board of Roman Parish Priests and the Genealogical Board of Italian Nobility attacked the film. In one documented instance involving favourable reviews written by the Jesuits of San Fedele, defending La Dolce Vita had severe consequences. In competition at Cannes alongside Antonioni's L'Avventura, the film won the Palme d'Or awarded by presiding juror Georges Simenon. The Belgian writer was promptly "hissed at" by the disapproving festival crowd. Art films and dreams (1961–1969) A major discovery for Fellini after his Italian neorealism period (1950–1959) was the work of Carl Jung. After meeting Jungian psychoanalyst Dr. Ernst Bernhard in early 1960, he read Jung's autobiography, Memories, Dreams, Reflections (1963) and experimented with LSD. Bernhard also recommended that Fellini consult the I Ching and keep a record of his dreams. What Fellini formerly accepted as "his extrasensory perceptions" were now interpreted as psychic manifestations of the unconscious. 
Bernhard's focus on Jungian depth psychology proved to be the single greatest influence on Fellini's mature style and marked the turning point in his work from neorealism to filmmaking that was "primarily oneiric". As a consequence, Jung's seminal ideas on the anima and the animus, the role of archetypes and the collective unconscious directly influenced such films as 8½ (1963), Juliet of the Spirits (1965), Fellini Satyricon (1969), Casanova (1976), and City of Women (1980). Other key influences on his work include Luis Buñuel, Charlie Chaplin, Sergei Eisenstein, Buster Keaton, Laurel and Hardy, the Marx Brothers, and Roberto Rossellini. Exploiting La Dolce Vita's success, financier Angelo Rizzoli set up Federiz in 1960, an independent film company, for Fellini and production manager Clemente Fracassi to discover and produce new talent. Despite the best intentions, their overcautious editorial and business skills forced the company to close down soon after cancelling Pasolini's project, Accattone (1961). Condemned as a "public sinner" for La Dolce Vita, Fellini responded with The Temptations of Doctor Antonio, a segment in the omnibus Boccaccio '70. His first colour film, it was the sole project green-lighted at Federiz. Infused with the surrealistic satire that characterized the young Fellini's work at Marc'Aurelio, the film ridiculed a crusader against vice, interpreted by Peppino De Filippo, who goes insane trying to censor a billboard of Anita Ekberg espousing the virtues of milk. In an October 1960 letter to his colleague Brunello Rondi, Fellini first outlined his film ideas about a man suffering creative block: "Well then – a guy (a writer? any kind of professional man? a theatrical producer?) has to interrupt the usual rhythm of his life for two weeks because of a not-too-serious disease. It's a warning bell: something is blocking up his system."
Unclear about the script, its title, and his protagonist's profession, he scouted locations throughout Italy "looking for the film", in the hope of resolving his confusion. Flaiano suggested La bella confusione (literally The Beautiful Confusion) as the movie's title. Under pressure from his producers, Fellini finally settled on 8½, a self-referential title referring principally (but not exclusively) to the number of films he had directed up to that time. Giving the order to start production in spring 1962, Fellini signed deals with his producer Rizzoli, fixed dates, had sets constructed, cast Mastroianni, Anouk Aimée, and Sandra Milo in lead roles, and did screen tests at the Scalera Studios in Rome. He hired cinematographer Gianni Di Venanzo, among key personnel. But apart from naming his hero Guido Anselmi, he still couldn't decide what his character did for a living. The crisis came to a head in April when, sitting in his Cinecittà office, he began a letter to Rizzoli confessing he had "lost his film" and had to abandon the project. Interrupted by the chief machinist requesting he celebrate the launch of 8½, Fellini put aside the letter and went on the set. Raising a toast to the crew, he "felt overwhelmed by shame… I was in a no exit situation. I was a director who wanted to make a film he no longer remembers. And lo and behold, at that very moment everything fell into place. I got straight to the heart of the film. I would narrate everything that had been happening to me. I would make a film telling the story of a director who no longer knows what film he wanted to make". The self-mirroring structure makes the entire film inseparable from its reflecting construction. Shooting began on 9 May 1962. Perplexed by the seemingly chaotic, incessant improvisation on the set, Deena Boyer, the director's American press officer at the time, asked for a rationale.
Fellini told her that he hoped to convey the three levels "on which our minds live: the past, the present, and the conditional — the realm of fantasy". After shooting wrapped on 14 October, Nino Rota composed various circus marches and fanfares that would later become signature tunes of the maestro's cinema. Nominated for four Oscars, 8½ won awards for best foreign language film and best costume design in black-and-white. In California for the ceremony, Fellini toured Disneyland with Walt Disney the day after. Increasingly attracted to parapsychology, Fellini met the Turin antiquarian Gustavo Rol in 1963. Rol, a former banker, introduced him to the world of Spiritism and séances. In 1964, Fellini took LSD under the supervision of Emilio Servadio, his psychoanalyst during the 1954 production of La Strada. For years reserved about what actually occurred that Sunday afternoon, he admitted in 1992 that "... objects and their functions no longer had any significance. All I perceived was perception itself, the hell of forms and figures devoid of human emotion and detached from the reality of my unreal environment. I was an instrument in a virtual world that constantly renewed its own meaningless image in a living world that was itself perceived outside of nature. And since the appearance of things was no longer definitive but limitless, this paradisiacal awareness freed me from the reality external to my self. The fire and the rose, as it were, became one." Fellini's hallucinatory insights were given full flower in his first colour feature Juliet of the Spirits (1965), depicting Giulietta Masina as Juliet, a housewife who rightly suspects her husband's infidelity and succumbs to the voices of spirits summoned during a séance at her home. Her sexually voracious next door neighbor Suzy (Sandra Milo) introduces Juliet to a world of uninhibited sensuality, but Juliet is haunted by childhood memories of her Catholic guilt and a teenaged friend who committed suicide.
Complex and filled with psychological symbolism, the film is set to a jaunty score by Nino Rota. Nostalgia, sexuality, and politics (1970–1980) To help promote Satyricon in the
United States, Fellini flew to Los Angeles in January 1970 for interviews with Dick Cavett and David Frost. He also met with film director Paul Mazursky, who wanted to cast him alongside Donald Sutherland in his new film, Alex in Wonderland.
In February, Fellini scouted locations in Paris for The Clowns, a docufiction both for cinema and television, based on his childhood memories of the circus and a "coherent theory of clowning." As he saw it, the clown "was always the caricature of a well-established, ordered, peaceful society. But today all is temporary, disordered, grotesque. Who can still laugh at clowns?... All the world plays a clown now." In March 1971, Fellini began production on Roma, a seemingly random collection of episodes informed by the director's memories and impressions of Rome. The "diverse sequences," writes Fellini scholar Peter Bondanella, "are held together only by the fact that they all ultimately originate from the director's fertile imagination." The film's opening scene anticipates Amarcord while its most surreal sequence involves an ecclesiastical fashion show in which nuns and priests roller skate past shipwrecks of cobwebbed skeletons. Over a period of six months between January and June 1973, Fellini shot the Oscar-winning Amarcord. Loosely based on the director's 1968 autobiographical essay My Rimini, the film depicts the adolescent Titta and his friends working out their sexual frustrations against the religious and Fascist backdrop of a provincial town in Italy during the 1930s. Produced by Franco Cristaldi, the seriocomic movie became Fellini's second biggest commercial success after La Dolce Vita. Circular in form, Amarcord avoids plot and linear narrative in a way similar to The Clowns and Roma. The director's overriding concern with developing a poetic form of cinema was first outlined in a 1965 interview he gave to The New Yorker journalist Lillian Ross: "I am trying to free my work from certain constrictions – a story with a beginning, a development, an ending. It should be more like a poem with metre and cadence." 
Late films and projects (1981–1990) Organized by his publisher Diogenes Verlag in 1982, the first major exhibition of 63 drawings by Fellini was held in Paris, Brussels, and the Pierre Matisse Gallery in New York. A gifted caricaturist, he found much of the inspiration for his sketches from his own dreams while the films-in-progress both originated from and stimulated drawings for characters, decor, costumes and set designs. Under the title, I disegni di Fellini (Fellini's Designs), he published 350 drawings executed in pencil, watercolours, and felt pens. On 6 September 1985 Fellini was awarded the Golden Lion for lifetime achievement at the 42nd Venice Film Festival. That same year, he became the first non-American to receive the Film Society of Lincoln Center's annual award for cinematic achievement. Long fascinated by Carlos Castaneda's The Teachings of Don Juan: A Yaqui Way of Knowledge, Fellini accompanied the Peruvian author on a journey to the Yucatán to assess the feasibility of a film. After first meeting Castaneda in Rome in October 1984, Fellini drafted a treatment with Pinelli titled Viaggio a Tulun. Producer Alberto Grimaldi, prepared to buy film rights to all of Castaneda's work, then paid for pre-production research taking Fellini and his entourage from Rome to Los Angeles and the jungles of Mexico in October 1985. When Castaneda inexplicably disappeared and the project fell through, Fellini's mystico-shamanic adventures were scripted with Pinelli and serialized in Corriere della Sera in May 1986. A barely veiled satirical interpretation of Castaneda's work, Viaggio a Tulun was published in 1989 as a graphic novel with artwork by Milo Manara and as Trip to Tulum in America in 1990. For Intervista, produced by Ibrahim Moussa and RAI Television, Fellini intercut memories of the first time he visited Cinecittà in 1939 with present-day footage of himself at work on a screen adaptation of Franz Kafka's Amerika. 
A meditation on the nature of memory and film production, it won the special 40th Anniversary Prize at Cannes and the 15th Moscow International Film Festival Golden Prize. In Brussels later that year, a panel of thirty professionals from eighteen European countries named Fellini the world's best director and 8½ the best European film of all time. In early 1989 Fellini began production on The Voice of the Moon, based on Ermanno Cavazzoni's novel, Il poema dei lunatici (The Lunatics' Poem). A small town was built at Empire Studios on the via Pontina outside Rome. Starring Roberto Benigni as Ivo Salvini, a madcap poetic figure newly released from a mental institution, the character is a combination of La Strada's Gelsomina, Pinocchio, and Italian poet Giacomo Leopardi. Fellini improvised as he filmed, using as a guide a rough treatment written with Pinelli. Despite its modest critical and commercial success in Italy, and its warm reception by French critics, it failed to interest North American distributors. Fellini won the Praemium Imperiale, an international prize in the visual arts given by the Japan Art Association in 1990. Final years (1991–1993) In July 1991 and April 1992, Fellini worked in close collaboration with Canadian filmmaker Damian Pettigrew to establish "the longest and most detailed conversations ever recorded on film". Described as the "Maestro's spiritual testament" by his biographer Tullio Kezich, excerpts culled from the conversations later served as the basis of their feature documentary, Fellini: I'm a Born Liar (2002) and the book, I'm a Born Liar: A Fellini Lexicon. Finding it increasingly difficult to secure financing for feature films, Fellini developed a suite of television projects whose titles reflect their subjects: Attore, Napoli, L'Inferno, L'opera lirica, and L'America.
In April 1993 Fellini received his fifth Oscar, for lifetime achievement, "in recognition of his cinematic accomplishments that have thrilled and entertained audiences worldwide". On 16 June, he entered the Cantonal Hospital in Zürich for an angioplasty on his femoral artery but suffered a stroke at the Grand Hotel in Rimini two months later. Partially paralyzed, he was first transferred to Ferrara for rehabilitation and then to the Policlinico Umberto I in Rome to be near his wife, also hospitalized. He suffered a second stroke and fell into an irreversible coma. Death Fellini died in Rome on 31 October 1993 at the age of 73, a day after his 50th wedding anniversary, following a heart attack he had suffered a few weeks earlier. The memorial service, in Studio 5 at Cinecittà, was attended by an estimated 70,000 people. At Giulietta Masina's request, trumpeter Mauro Maur played Nino Rota's "Improvviso dell'Angelo" during the ceremony. Five months later, on 23 March 1994, Masina died of lung cancer. Fellini, Masina and their son, Pierfederico, are buried in a bronze sepulchre sculpted by Arnaldo Pomodoro. Designed as a ship's prow, the tomb is at the main entrance to the cemetery of Rimini. The Federico Fellini Airport in Rimini is named in his honour. Religious views Fellini was raised in a Roman Catholic family and considered himself a Catholic, but avoided formal activity in the Catholic Church. Fellini's films include Catholic themes; some celebrate Catholic teachings, while others criticize or ridicule church dogma. Political views While Fellini was for the most part indifferent to politics, he had a general dislike of authoritarian institutions, and is interpreted by Bondanella as believing in
were also released as singles, with less success. 1987–1995: Departure of Buckingham and Nicks With a ten-week tour scheduled, Buckingham held back at the last minute, saying he felt his creativity was being stifled. A group meeting at Christine McVie's house on 7 August 1987 resulted in turmoil. Tensions were coming to a head. Fleetwood said in his autobiography that there was a physical altercation between Buckingham and Nicks. Buckingham left the band the following day. After Buckingham's departure Fleetwood Mac added two new guitarists to the band, Billy Burnette and Rick Vito, again without auditions. Burnette was the son of Dorsey Burnette and nephew of Johnny Burnette, both of The Rock and Roll Trio. He had already worked with Fleetwood in Zoo, with Christine McVie as part of her solo band, had done some session work with Nicks, and backed Buckingham on Saturday Night Live. Fleetwood and Christine McVie had played on his Try Me album in 1985. Vito, a Peter Green admirer, had played with many artists from Bonnie Raitt to John Mayall, to Roger McGuinn in Thunderbyrd and worked with John McVie on two Mayall albums. The 1987–88 "Shake the Cage" tour was the first outing for this line-up. It was successful enough to warrant the release of a concert video, entitled "Tango in the Night", which was filmed at San Francisco's Cow Palace arena in December 1987. Capitalising on the success of Tango in the Night, the band released a Greatest Hits album in 1988. It featured singles from the 1975–1988 era and included two new compositions, "No Questions Asked" written by Nicks and "As Long as You Follow", written by Christine McVie and Quintela. "As Long as You Follow" was released as a single in 1988 but only made No. 43 in the US and No. 66 in the UK, although it reached No. 1 on the US Adult Contemporary charts. The Greatest Hits album, which peaked at No. 3 in the UK and No.
14 in the US (though it has since sold over 8 million copies there) was dedicated by the band to Buckingham, with whom they were now reconciled. In 1990, Fleetwood Mac released their fifteenth studio album, Behind the Mask. With this album the band veered away from the stylised sound that Buckingham had evolved during his tenure in the band (which was also evident in his solo work) and developed a more adult contemporary style with producer Greg Ladanyi. The album yielded only one Top 40 hit, Christine McVie's "Save Me". Behind the Mask only achieved Gold album status in the US, peaking at No. 18 on the Billboard album chart, though it entered the UK Albums Chart at No. 1. It received mixed reviews and was seen by some music critics as a low point for the band in the absence of Buckingham (who had actually made a guest appearance playing on the title track). But Rolling Stone magazine said that Vito and Burnette were "the best thing to ever happen to Fleetwood Mac". The subsequent "Behind the Mask" tour saw the band play sold-out shows at London's Wembley Stadium. In the final show in Los Angeles, Buckingham joined the band on stage. The two women of the band, McVie and Nicks, had decided that the tour would be their last (McVie's father had died during the tour), although both stated that they would still record with the band. In 1991, however, Nicks and Rick Vito left Fleetwood Mac altogether. In 1992, Fleetwood arranged a 4-disc box set, spanning highlights from the band's 25-year history, entitled 25 Years – The Chain (an edited 2-disc set was also available). A notable inclusion in the box set was "Silver Springs", a Nicks composition that was recorded during the Rumours sessions but was omitted from the album and used as the B-side of "Go Your Own Way". Nicks had requested use of this track for her 1991 best-of compilation TimeSpace, but Fleetwood had refused as he had planned to include it in this collection as a rarity. 
The disagreement between Nicks and Fleetwood garnered press coverage and was believed to have been the main reason for Nicks leaving the band in 1991. The box set also included a new Nicks/Rick Vito composition, "Paper Doll", which was released in the US as a single and produced by Buckingham and Richard Dashut. There were also two new Christine McVie compositions, "Heart of Stone" and "Love Shines". "Love Shines" was released as a single in the UK and elsewhere. Buckingham also contributed a new song, "Make Me a Mask". Fleetwood also released a deluxe hardcover companion book to coincide with the release of the box set, titled My 25 Years in Fleetwood Mac. The volume featured notes written by Fleetwood detailing the band's 25-year history and many rare photographs. The Buckingham/Nicks/McVie/McVie/Fleetwood line-up reunited in 1993 at the request of US President Bill Clinton for his first Inaugural Ball. Clinton had made Fleetwood Mac's "Don't Stop" his campaign theme song. His request for it to be performed at the Inauguration Ball was met with enthusiasm by the band, although this line-up had no intention of reuniting again. Inspired by the new interest in the band, Mick Fleetwood, John McVie, and Christine McVie recorded another album as Fleetwood Mac, with Billy Burnette taking lead guitar duties. Burnette left in March 1993 to record a country album and pursue an acting career and Bekka Bramlett, who had worked a year earlier with Fleetwood's Zoo, was recruited to take his place. Solo singer-songwriter/guitarist and Traffic member Dave Mason, who had worked with Bekka's parents Delaney & Bonnie twenty-five years earlier, was subsequently added. In March 1994 Billy Burnette, a good friend and co-songwriter with Delaney Bramlett, returned to the band with Fleetwood's blessing. The band, minus Christine McVie, toured in 1994, opening for Crosby, Stills, & Nash and in 1995 as part of a package with REO Speedwagon and Pat Benatar. 
This tour saw the band perform classic Fleetwood Mac songs from their 1967–1974 era. In 1995, at a concert in Tokyo, the band was greeted by former member Jeremy Spencer, who performed a few songs with them. On 10 October 1995, Fleetwood Mac released their sixteenth studio album, Time, which was not a success. Although it hit the UK Top 60 for one week, the album made no impact in the US, failing to reach the Billboard Top 200 albums chart, a reversal for a band that had been a mainstay on that chart for most of the previous two decades. Shortly after the album's release, Christine McVie informed the band that the album would be her last. Bramlett and Burnette subsequently formed a country music duo, Bekka & Billy. 1995–2007: Re-formation, reunion and Christine McVie's departure Just weeks after disbanding Fleetwood Mac, Mick Fleetwood started working with Lindsey Buckingham again. John McVie was added to the sessions, and later Christine McVie. Stevie Nicks also enlisted Buckingham to produce a song for a soundtrack. In May 1996 Fleetwood, John McVie, Christine McVie, and Nicks performed together at a private party in Louisville, Kentucky, prior to the Kentucky Derby, with Steve Winwood filling in for Buckingham. A week later the Twister film soundtrack was released, which featured the Nicks-Buckingham duet "Twisted", with Fleetwood on drums. This eventually led to a full reunion of the Rumours line-up, which officially reformed in March 1997. The regrouped Fleetwood Mac performed a live concert on a soundstage at Warner Bros. in Burbank, California, on 22 May 1997. The concert was recorded, and from this performance came the 1997 live album The Dance, which brought the band back to the top of the US album charts for the first time in 10 years. The Dance returned Fleetwood Mac to a superstar status they had not enjoyed since Tango in the Night. The album was certified 5 million units by the RIAA.
An arena tour followed the MTV premiere of The Dance and kept the reunited Fleetwood Mac on the road throughout much of 1997, the 20th anniversary of Rumours. With additional musicians Neale Heywood on guitar, Brett Tuggle on keyboards, Lenny Castro on percussion and Sharon Celani (who had toured with the band in the late 1980s) and Mindy Stein on backing vocals, this would be the final appearance of the classic line-up including Christine McVie for 16 years. Neale Heywood and Sharon Celani remain touring members to this day. In 1998 Fleetwood Mac were inducted into the Rock and Roll Hall of Fame. Members inducted included the original band, Mick Fleetwood, John McVie, Peter Green, Jeremy Spencer and Danny Kirwan, and Rumours-era members Christine McVie, Stevie Nicks and Lindsey Buckingham. Bob Welch was not included, despite his key role in keeping the band alive during the early 1970s. The Rumours-era version of the band performed both at the induction ceremony and at the Grammy Awards programme that year. Peter Green attended the induction ceremony but did not perform with his former bandmates, opting instead to perform his composition "Black Magic Woman" with Santana, who were inducted the same night. Neither Jeremy Spencer nor Danny Kirwan attended. Fleetwood Mac also received the "Outstanding Contribution to Music" award at the Brit Awards (British Phonographic Industry Awards) the same year. In 1998 Christine McVie left the band. Her departure left Buckingham and Nicks to sing all the lead vocals for the band's seventeenth album, Say You Will, released in 2003, although Christine contributed some backing vocals and keyboards. The album debuted at No. 3 on the Billboard 200 chart (No. 6 in the UK) and yielded chart hits with "Peacekeeper" and the title track, and a successful world arena tour which lasted through 2004. The tour grossed $27,711,129 and was ranked No. 21 in the top 25 grossing tours of 2004.
Around 2004–05 there were rumours of a reunion of the early line-up of Fleetwood Mac involving Peter Green and Jeremy Spencer. While these two apparently remained unconvinced, in April 2006 bassist John McVie, during a question-and-answer session on the Penguin Fleetwood Mac fan website, said of the reunion idea: In interviews given in November 2006 to support his solo album Under the Skin, Buckingham stated that plans for the band to reunite once more for a 2008 tour were still on the cards. Recording plans had been put on hold for the foreseeable future. In an interview Nicks gave to the UK newspaper The Daily Telegraph in September 2007, she stated that she was unwilling to carry on with the band unless Christine McVie returned. 2008–2013: Unleashed tour and Extended Play In March 2008, it was mooted that Sheryl Crow might work with Fleetwood Mac in 2009. Crow and Stevie Nicks had collaborated in the past and Crow had stated that Nicks had been a great teacher and inspiration to her. Later, Buckingham said that the potential collaboration with Crow had "lost its momentum", and the idea was abandoned. In March 2009, Fleetwood Mac started their "Unleashed" tour, again without Christine McVie. It was a greatest hits show, although album tracks such as "Storms" and "I Know I'm Not Wrong" were also played. During their show on 20 June 2009 in New Orleans, Louisiana, Stevie Nicks premiered part of a new song that she had written about Hurricane Katrina. The song was later released as "New Orleans" on Nicks's 2011 album In Your Dreams with Mick Fleetwood on drums. In October and November 2009 the band toured Europe, followed by Australia and New Zealand in December. In October, The Very Best of Fleetwood Mac was re-released in an extended two-disc format (this format having been released in the US in 2002), entering at number six on the UK Albums Chart.
On 1 November 2009 a one-hour documentary, Fleetwood Mac: Don't Stop, was broadcast in the UK on BBC One, featuring recent interviews with all four current band members. During the documentary Nicks gave a candid summary of the current state of her relationship with Buckingham, saying "Maybe when we're 75 and Fleetwood Mac is a distant memory, we might be friends." On 6 November 2009, Fleetwood Mac played the last show of the European leg of their Unleashed tour at London's Wembley Arena. Christine McVie was present in the audience. Nicks paid tribute to her from the stage to a standing ovation from the audience, saying that she thought about her former bandmate "every day", and dedicated that night's performance of "Landslide" to her. On 19 December 2009 Fleetwood Mac played the second-to-last show of their Unleashed tour to a sell-out crowd in New Zealand, at what was originally intended to be a one-off event at the TSB Bowl of Brooklands in New Plymouth. Tickets, after pre-sales, sold out within twelve minutes of public release. Another date, Sunday 20 December, was added and also sold out. The tour grossed $84,900,000 and was ranked No. 13 in the highest grossing worldwide tours of 2009. On 19 October 2010, Fleetwood Mac played a private show at the Phoenician Hotel in Scottsdale, Arizona for TPG (Texas Pacific Group). On 3 May 2011, the Fox Network broadcast an episode of Glee entitled "Rumours" that featured six songs from the band's 1977 album. The show sparked renewed interest in the band and its commercially most successful album, and Rumours re-entered the Billboard 200 chart at No. 11 in the same week that Nicks's new solo album In Your Dreams debuted at No. 6. (She was quoted by Billboard as saying that her new album was "my own little Rumours.") The two recordings sold about 30,000 and 52,000 units respectively. Music downloads accounted for 91 per cent of the Rumours sales. The spike in sales for Rumours represented an increase of 1,951%.
It was the highest chart entry by a previously issued album since The Rolling Stones' reissue of Exile On Main St. re-entered the chart at No. 2 on 5 June 2010. In an interview in July 2012 Nicks confirmed that the band would reunite for a tour in 2013. Original Fleetwood Mac bassist Bob Brunning died on 18 October 2011 at the age of 68. Former guitarist and singer Bob Weston was found dead on 3 January 2012 at the age of 64. Former singer and guitarist Bob Welch was found dead from a self-inflicted gunshot wound on 7 June 2012 at the age of 66. Don Aaron, a spokesman at the scene, stated, "He died from an apparent self-inflicted gunshot wound to the chest." A suicide note was found. Welch had been struggling with health issues and was dealing with depression. His wife discovered his body. The band's 2013 tour, which took place in 34 cities, started on 4 April in Columbus, Ohio. The band performed two new songs ("Sad Angel" and "Without You"), which Buckingham described as some of the most "Fleetwood Mac-ey" sounding songs since Mirage. "Without You" was a re-recording of a song from the Buckingham-Nicks era. The band released their first new studio material in ten years, Extended Play, on 30 April 2013. The EP debuted and peaked at No. 48 in the US and produced one single, "Sad Angel". On 25 and 27 September 2013, the second and third nights of the band's London O2 shows, Christine McVie joined them on stage for "Don't Stop". On 27 October 2013, the band cancelled their New Zealand and Australian performances after John McVie had been diagnosed with cancer, so that he could undergo treatment. They said: "We are sorry not to be able to play these Australian and New Zealand dates. We hope our Australian and New Zealand fans as well as Fleetwood Mac fans everywhere will join us in wishing John and his family all the best." Also in October 2013, Stevie Nicks appeared in American Horror Story: Coven with Fleetwood Mac's song "Seven Wonders" playing in the background.
In November 2013, Christine McVie expressed interest in a return to Fleetwood Mac, and also affirmed that John McVie's prognosis was "really good". 2014–present: Return of McVie and departure of Buckingham On 11 January 2014, Mick Fleetwood confirmed that Christine McVie would be rejoining Fleetwood Mac. On with the Show, a 33-city North American tour, opened in Minneapolis, Minnesota, on 30 September 2014. A series of May–June 2015 arena dates in the United Kingdom went on sale on 14 November, selling out in minutes. Due to high demand, additional dates were added to the tour, including an Australian leg. In January 2015, Buckingham suggested that the new album and tour might be Fleetwood Mac's last, and that the band would cease operations in 2015 or soon afterwards. He concluded: "We're going to continue working on the new album and the solo stuff will take a back seat for a year or two. A beautiful way to wrap up this last act." But Mick Fleetwood stated that the new album might take a few years to complete and that they were waiting for contributions from Nicks, who had been ambivalent about committing to a new record. In August 2016, Fleetwood revealed that while the band had "a huge amount of recorded music", virtually none of it featured Nicks. Buckingham and Christine McVie, however, had contributed multiple songs to the new project. Fleetwood told Ultimate Classic Rock: "She [McVie] ... wrote up a storm ... She and Lindsey could probably have a mighty strong duet album if they want. In truth, I hope it will come to more than that. There really are dozens of songs. And they’re really good. So we’ll see." Nicks explained her reluctance to record another album with Fleetwood Mac. "Is it possible that Fleetwood Mac might do another record? I can never tell you yes or no, because I don't know. I honestly don't know... 
It's like, do you want to take a chance of going in and setting up in a room for like a year [to record an album] and having a bunch of arguing people? And then not wanting to go on tour because you just spent a year arguing?". She also emphasised that people do not buy as many records as they used to. On 9 June 2017, Buckingham and Christine McVie released a new album, titled Lindsey Buckingham/Christine McVie, which included contributions from Mick Fleetwood and John McVie. The album was preceded by the single "In My World". A 38-date tour began on 21 June and concluded 16 November. Fleetwood Mac also planned to embark on another tour in 2018. The band headlined the second night of the Classic West concert (on 16 July 2017 at Dodger Stadium in Los Angeles) and the second night of the Classic East concert (at New York City's Citi Field on 30 July 2017). The band received the MusiCares Person of the Year award in 2018 and reunited to perform several songs at the Grammy-hosted gala honouring them. Artists including Lorde, Harry Styles, Little Big Town and Miley Cyrus also performed. In April 2018, the song "Dreams" re-entered the Hot Rock Songs chart at No. 16 after a viral meme had featured the song. This chart re-entry came 40 years after the song had topped the Hot 100. The song's streaming totals also translated into 7,000 "equivalent album units", a jump of 12 per cent, which helped Rumours to go from No. 21 to No. 13 on the Top Rock Albums chart. That month Buckingham departed from the group a second time, having reportedly been dismissed. The reason was said to have been a disagreement about the nature of the tour, and in particular the question of whether newer or less well-known material would be included, as Buckingham wanted. 
Mick Fleetwood and the band appeared on CBS This Morning on 25 April 2018 and said that Buckingham would not sign off on a tour that the group had been planning for a year and a half; they had reached a "huge impasse" and "hit a brick wall". When asked if Buckingham had been fired, Fleetwood said, "Well, we don't use that word because I think it's ugly." He also said that "Lindsey has huge amounts of respect and kudos to what he's done within the ranks of Fleetwood Mac and always will." In October 2018, Buckingham filed a lawsuit against Fleetwood Mac for breach of fiduciary duty, breach of oral contract and intentional interference with prospective economic advantage, among other charges. He stated that they eventually came to a settlement, the terms of which he would not share, but claimed he was "happy enough with it". Buckingham also told his version of what had led to his departure from the band. Two days after their performance at the MusiCares event he got a phone call from the band's manager Irving Azoff, who had a list of things that, as Buckingham puts it, "Stevie took issue with" that evening, including the guitarist's outburst just before the band's set over the intro music for their acceptance speech being the studio recording of Nicks's "Rhiannon", and the way he "smirked" during Nicks's thank-you speech. Buckingham concedes the first point. "It wasn't about it being 'Rhiannon'," he says. "It just undermined the impact of our entrance. That's me being very specific about the right and wrong way to do something." As for smirking, "The irony is that we have this standing joke that Stevie, when she talks, goes on a long time," Buckingham says. "I may or may not have smirked. But I look over and Christine and Mick
allegedly fuelled by high consumption of drugs and alcohol. The band's eleventh studio album, Rumours (the band's first release on the main Warner label after Reprise was retired and all of its acts were reassigned to the parent label), was released in the spring of 1977. On this album, the band members laid bare the emotional turmoil they were experiencing at the time. Rumours was critically acclaimed and won the Grammy Award for Album of the Year for 1977. The album generated four Top Ten singles: Buckingham's "Go Your Own Way", Nicks's US No. 1 "Dreams" and Christine McVie's "Don't Stop" and "You Make Loving Fun". Buckingham's "Second Hand News", Nicks's "Gold Dust Woman" and "The Chain" (the only song written by all five band members) also received significant radio airplay. By 2003 Rumours had sold over 19 million copies in the US alone (certified as a diamond album by the RIAA) and a total of 40 million copies worldwide, placing it eighth on the list of best-selling albums. Fleetwood Mac supported the album with a lucrative tour. On 10 October 1979, Fleetwood Mac were honoured with a star on the Hollywood Walk of Fame, at 6608 Hollywood Boulevard, for their contributions to the music industry. Buckingham convinced Fleetwood to let his work on their next album be more experimental, and to be allowed to work on tracks at home before bringing them to the rest of the band in the studio. The result, the band's twelfth studio album Tusk, was a 20-track double album released in 1979. It produced three hit singles: Buckingham's "Tusk" (US No. 8), which featured the USC Trojan Marching Band, Christine McVie's "Think About Me" (US No. 20), and Nicks's six-and-a-half-minute opus "Sara" (US No. 7).
"Sara" was cut to four-and-a-half minutes for both the hit single and the first CD release of the album, but the unedited version has since been restored on the 1988 greatest hits compilation, the 2004 reissue of Tusk and Fleetwood Mac's 2002 release of The Very Best of Fleetwood Mac. Original guitarist Peter Green also took part in the sessions for Tusk, although his playing, on the Christine McVie track "Brown Eyes", is not credited on the album. In an interview in 2019 Fleetwood described Tusk as his "personal favourite" and said, "Kudos to Lindsey ... for us not doing a replica of Rumours." Tusk sold four million copies worldwide. Fleetwood blamed the album's relative lack of commercial success on the RKO radio chain having played the album in its entirety prior to release, thereby allowing mass home taping. The band embarked on an 11-month tour to support and promote Tusk. They travelled around the world, including the US, Australia, New Zealand, Japan, France, Belgium, Germany, the Netherlands and the United Kingdom. In Germany, they shared the bill with reggae superstar Bob Marley. On this world tour, the band recorded music for their first live album, which was released at the end of 1980. The band's thirteenth studio album, Mirage, was released in 1982. Following 1981 solo albums by Nicks (Bella Donna), Fleetwood (The Visitor) and Buckingham (Law and Order), it marked a return to a more conventional approach. Buckingham had been chided by critics, fellow band members and music business managers for the lesser commercial success of Tusk. Recorded at Château d'Hérouville in France and produced by Richard Dashut, Mirage was an attempt to recapture the huge success of Rumours. Its hits included Christine McVie's "Hold Me" and "Love in Store" (co-written by Robbie Patton and Jim Recor, respectively), Nicks's "Gypsy", and Buckingham's "Oh Diane", which made the Top 10 in the UK. Buckingham's "Eyes of the World" and "Can't Go Back" were also minor hits.
In contrast to the Tusk tour, the band embarked on only a short tour of 18 American cities, the Los Angeles show being recorded and released on video. They also headlined the first US Festival, on 5 September 1982, for which the band was paid $500,000. Mirage was certified double platinum in the US. Following Mirage the band went on hiatus, which allowed members to pursue solo careers. Nicks released two more solo albums (1983's The Wild Heart and 1985's Rock a Little). Buckingham issued Go Insane in 1984, the same year that Christine McVie made an eponymous album (yielding the Top 10 hit "Got a Hold on Me" and the Top 40 hit "Love Will Show Us How"). All three met with success, with Nicks's being the most popular. During this period Fleetwood had filed for bankruptcy, Nicks was admitted to the Betty Ford Clinic for addiction problems and John McVie had suffered an addiction-related seizure, all of which were attributed to the lifestyle of excess afforded to them by their worldwide success. It was rumoured that Fleetwood Mac had disbanded, but Buckingham commented that he was unwilling to allow Mirage to remain as the band's last effort. The Rumours line-up of Fleetwood Mac recorded one more album, their fourteenth studio album, Tango in the Night, in 1987. As with various other Fleetwood Mac albums, the material started off as a Buckingham solo album before becoming a group project. The album went on to become their best-selling release since Rumours, especially in the UK, where it hit No. 1 three times in the following year. The album sold three million copies in the US and contained four hits: Christine McVie's "Little Lies" and "Everywhere" ("Little Lies" being co-written with McVie's new husband Eddy Quintela), Sandy Stewart and Nicks's "Seven Wonders", and Buckingham's "Big Love". "Family Man" (Buckingham and Richard Dashut) and "Isn't It Midnight" (Christine McVie) were also released as singles, with less success.
1987–1995: Departure of Buckingham and Nicks With a ten-week tour scheduled, Buckingham held back at the last minute, saying he felt his creativity was being stifled. A group meeting at Christine McVie's house on 7 August 1987 resulted in turmoil. Tensions were coming to a head. Fleetwood said in his autobiography that there was a physical altercation between Buckingham and Nicks. Buckingham left the band the following day. After Buckingham's departure, Fleetwood Mac added two new guitarists, Billy Burnette and Rick Vito, again without auditions. Burnette was the son of Dorsey Burnette and nephew of Johnny Burnette, both of The Rock and Roll Trio. He had already worked with Fleetwood in Zoo and with Christine McVie as part of her solo band, had done some session work with Nicks, and had backed Buckingham on Saturday Night Live. Fleetwood and Christine McVie had played on his Try Me album in 1985. Vito, a Peter Green admirer, had played with many artists, from Bonnie Raitt to John Mayall to Roger McGuinn in Thunderbyrd, and had worked with John McVie on two Mayall albums. The 1987–88 "Shake the Cage" tour was the first outing for this line-up. It was successful enough to warrant the release of a concert video, entitled "Tango in the Night", which was filmed at San Francisco's Cow Palace arena in December 1987. Capitalising on the success of Tango in the Night, the band released a Greatest Hits album in 1988. It featured singles from the 1975–1988 era and included two new compositions: "No Questions Asked", written by Nicks, and "As Long as You Follow", written by Christine McVie and Quintela. "As Long as You Follow" was released as a single in 1988 but only made No. 43 in the US and No. 66 in the UK, although it reached No. 1 on the US Adult Contemporary chart. The Greatest Hits album, which peaked at No. 3 in the UK and No. 14 in the US (though it has since sold over 8 million copies there), was dedicated by the band to Buckingham, with whom they were now reconciled.
In 1990, Fleetwood Mac released their fifteenth studio album, Behind the Mask. With this album the band veered away from the stylised sound that Buckingham had evolved during his tenure in the band (which was also evident in his solo work) and developed a more adult contemporary style with producer Greg Ladanyi. The album yielded only one Top 40 hit, Christine McVie's "Save Me". Behind the Mask only achieved Gold album status in the US, peaking at No. 18 on the Billboard album chart, though it entered the UK Albums Chart at No. 1. It received mixed reviews and was seen by some music critics as a low point for the band in the absence of Buckingham (who had actually made a guest appearance playing on the title track). But Rolling Stone magazine said that Vito and Burnette were "the best thing to ever happen to Fleetwood Mac". The subsequent "Behind the Mask" tour saw the band play sold-out shows at London's Wembley Stadium. In the final show in Los Angeles, Buckingham joined the band on stage. The two women of the band, McVie and Nicks, had decided that the tour would be their last (McVie's father had died during the tour), although both stated that they would still record with the band. In 1991, however, Nicks and Rick Vito left Fleetwood Mac altogether. In 1992, Fleetwood arranged a 4-disc box set, spanning highlights from the band's 25-year history, entitled 25 Years – The Chain (an edited 2-disc set was also available). A notable inclusion in the box set was "Silver Springs", a Nicks composition that was recorded during the Rumours sessions but was omitted from the album and used as the B-side of "Go Your Own Way". Nicks had requested use of this track for her 1991 best-of compilation TimeSpace, but Fleetwood had refused as he had planned to include it in this collection as a rarity. The disagreement between Nicks and Fleetwood garnered press coverage and was believed to have been the main reason for Nicks leaving the band in 1991. 
The box set also included a new Nicks/Rick Vito composition, "Paper Doll", which was released in the US as a single and produced by Buckingham and Richard Dashut. There were also two new Christine McVie compositions, "Heart of Stone" and "Love Shines". "Love Shines" was released as a single in the UK and elsewhere. Buckingham also contributed a new song, "Make Me a Mask". Fleetwood also released a deluxe hardcover companion book to coincide with the release of the box set, titled My 25 Years in Fleetwood Mac. The volume featured notes written by Fleetwood detailing the band's 25-year history and many rare photographs. The Buckingham/Nicks/McVie/McVie/Fleetwood line-up reunited in 1993 at the request of US President Bill Clinton for his first Inaugural Ball. Clinton had made Fleetwood Mac's "Don't Stop" his campaign theme song. His request for it to be performed at the Inauguration Ball was met with enthusiasm by the band, although this line-up had no intention of reuniting again. Inspired by the new interest in the band, Mick Fleetwood, John McVie, and Christine McVie recorded another album as Fleetwood Mac, with Billy Burnette taking lead guitar duties. Burnette left in March 1993 to record a country album and pursue an acting career and Bekka Bramlett, who had worked a year earlier with Fleetwood's Zoo, was recruited to take his place. Solo singer-songwriter/guitarist and Traffic member Dave Mason, who had worked with Bekka's parents Delaney & Bonnie twenty-five years earlier, was subsequently added. In March 1994 Billy Burnette, a good friend and co-songwriter with Delaney Bramlett, returned to the band with Fleetwood's blessing. The band, minus Christine McVie, toured in 1994, opening for Crosby, Stills, & Nash and in 1995 as part of a package with REO Speedwagon and Pat Benatar. This tour saw the band perform classic Fleetwood Mac songs from their 1967–1974 era. 
In 1995, at a concert in Tokyo, the band was greeted by former member Jeremy Spencer, who performed a few songs with them. On 10 October 1995, Fleetwood Mac released their sixteenth studio album, Time, which was not a success. Although it reached the UK Top 60 for one week, the album made no impact in the US, failing even to reach the Billboard 200 albums chart, a reversal for a band that had been a mainstay of that chart for most of the previous two decades. Shortly after the album's release, Christine McVie informed the band that it would be her last. Bramlett and Burnette subsequently formed a country music duo, Bekka & Billy. 1995–2007: Re-formation, reunion and Christine McVie's departure Just weeks after disbanding Fleetwood Mac, Mick Fleetwood started working with Lindsey Buckingham again. John McVie was added to the sessions, and later Christine McVie. Stevie Nicks also enlisted Buckingham to produce a song for a soundtrack. In May 1996 Fleetwood, John McVie, Christine McVie and Nicks performed together at a private party in Louisville, Kentucky, prior to the Kentucky Derby, with Steve Winwood filling in for Buckingham. A week later the Twister film soundtrack was released, featuring the Nicks-Buckingham duet "Twisted", with Fleetwood on drums. This eventually led to a full reunion of the Rumours line-up, which officially re-formed in March 1997. The regrouped Fleetwood Mac performed a live concert on a soundstage at Warner Bros. studios in Burbank, California, on 22 May 1997. The concert was recorded, and from this performance came the 1997 live album The Dance, which brought the band back to the top of the US album charts for the first time in 10 years. The Dance returned Fleetwood Mac to a superstar status they had not enjoyed since Tango in the Night. The album was certified for sales of 5 million units by the RIAA.
An arena tour followed the MTV premiere of The Dance and kept the reunited Fleetwood Mac on the road throughout much of 1997, the 20th anniversary of Rumours. With additional musicians Neale Heywood on guitar, Brett Tuggle on keyboards, Lenny Castro on percussion and Sharon Celani (who had toured with the band in the late 1980s) and Mindy Stein on backing vocals, this would be the final appearance of the classic line-up including Christine McVie for 16 years. Neale Heywood and Sharon Celani remain touring members to this day. In 1998 Fleetwood Mac were inducted into the Rock and Roll Hall of Fame. Members inducted included the original band, Mick Fleetwood, John McVie, Peter Green, Jeremy Spencer and Danny Kirwan, and Rumours-era members Christine McVie, Stevie Nicks and Lindsey Buckingham. Bob Welch was not included, despite his key role in keeping the band alive during the early 1970s. The Rumours-era version of the band performed both at the induction ceremony and at the Grammy Awards programme that year. Peter Green attended the induction ceremony but did not perform with his former bandmates, opting instead to perform his composition "Black Magic Woman" with Santana, who were inducted the same night. Neither Jeremy Spencer nor Danny Kirwan attended. Fleetwood Mac also received the "Outstanding Contribution to Music" award at the Brit Awards (British Phonographic Industry Awards) the same year. In 1998 Christine McVie left the band. Her departure left Buckingham and Nicks to sing all the lead vocals for the band's seventeenth album, Say You Will, released in 2003, although Christine contributed some backing vocals and keyboards. The album debuted at No.3 on the Billboard 200 chart (No. 6 in the UK) and yielded chart hits with "Peacekeeper" and the title track, and a successful world arena tour which lasted through 2004. The tour grossed $27,711,129 and was ranked No. 21 in the top 25 grossing tours of 2004. 
Around 2004–05 there were rumours of a reunion of the early line-up of Fleetwood Mac involving Peter Green and Jeremy Spencer. While those two apparently remained unconvinced, in April 2006 bassist John McVie commented on the reunion idea during a question-and-answer session on the Penguin Fleetwood Mac fan website. In interviews given in November 2006 to support his solo album Under the Skin, Buckingham stated that plans for the band to reunite once more for a 2008 tour were still on the cards, although recording plans had been put on hold for the foreseeable future. In an interview Nicks gave to the UK newspaper The Daily Telegraph in September 2007, she stated that she was unwilling to carry on with the band unless Christine McVie returned. 2008–2013: Unleashed tour and Extended Play In March 2008, it was mooted that Sheryl Crow might work with Fleetwood Mac in 2009. Crow and Stevie Nicks had collaborated in the past, and Crow had stated that Nicks had been a great teacher and inspiration to her. Later, Buckingham said that the potential collaboration with Crow had "lost its momentum" and the idea was abandoned. In March 2009, Fleetwood Mac started their "Unleashed" tour, again without Christine McVie. It was a greatest hits show, although album tracks such as "Storms" and "I Know I'm Not Wrong" were also played. During their show on 20 June 2009 in New Orleans, Louisiana, Stevie Nicks premiered part of a new song that she had written about Hurricane Katrina. The song was later released as "New Orleans" on Nicks's 2011 album In Your Dreams, with Mick Fleetwood on drums. In October and November 2009 the band toured Europe, followed by Australia and New Zealand in December. In October, The Very Best of Fleetwood Mac was re-released in an extended two-disc format (this format having been released in the US in 2002), entering at number six on the UK Albums Chart.
of King Casimir IV of Poland by his wife Elisabeth of Austria, and sister of King Sigismund I of Poland. They had seventeen children:
Casimir, Margrave of Brandenburg-Kulmbach (27 September 1481, Ansbach – 21 September 1527, Buda).
Elisabeth, died young.
Margarete of Brandenburg-Ansbach-Kulmbach (10 January 1483, Ansbach – 10 July 1532).
George, Margrave of Brandenburg-Ansbach (4 March 1484, Ansbach – 27 December 1543, Ansbach).
Sophie of Brandenburg-Ansbach-Kulmbach (10 March 1485, Ansbach – 24 May 1537, Liegnitz), married on 14 November 1518 to Duke Frederick II of Legnica.
Anna of Brandenburg-Ansbach-Kulmbach (5 May 1487, Ansbach – 7 February 1539), married on 1 December 1518 to Duke Wenceslaus II of Cieszyn.
Barbara, died young.
Albert, 1st Duke of Prussia (17 May 1490, Ansbach – 20 March 1568, Castle Tapiau), Grand Master of the Teutonic Order from 1511 to 1525, and first Duke of Prussia from 1525.
Frederick of Brandenburg-Ansbach-Kulmbach (13 June 1491, Ansbach – ca. 1497).
Johann, Viceroy of Valencia (9 January 1493, Plassenburg – 5 July 1525, Valencia).
Elisabeth of Brandenburg-Ansbach-Kulmbach (25 March 1494, Ansbach – 31 May 1518, Pforzheim), married in Pforzheim on 29 September 1510 to Margrave Ernest of Baden-Durlach.
Barbara of Brandenburg-Ansbach-Kulmbach (24 September 1495, Ansbach – 23 September 1552), married in Plassenburg on 26 July 1528 to Landgrave George III of Leuchtenberg.
Frederick of Brandenburg-Ansbach-Kulmbach (17 January 1497, Ansbach – 20 August 1536, Genoa), a canon in Würzburg and Salzburg.
Wilhelm, Archbishop of Riga (30 June 1498, Ansbach – 4 February 1563, Riga).
John Albert, Archbishop of Magdeburg (20 September 1499, Ansbach – 17 May 1550, Halle).
Frederick Albert, died young.
Gumprecht of Brandenburg-Ansbach-Kulmbach (16 July 1503, Ansbach – 25 June 1528, Naples), a canon in Bamberg.
marks will be shaded green to indicate that it can be used. A boost will dramatically increase a player's speed but will decrease their ability to turn. A boost used before a jump will make the player jump farther, which could allow the player to use a shortcut with the right vehicle. Boost time and speed vary according to the machine, and are usually tuned for proper balance. For example, one machine boasts a boost time of twelve seconds, yet has the slowest boost speed of the entire game. Players can also take advantage of the varying deceleration of each vehicle. Some vehicles, such as the Jet Vermilion, take longer than others to decelerate from top boost speed to normal speed once the boost has been used up. Players can also take advantage of this effect on boost pads. The Grand Prix is the main single-player component of Maximum Velocity. It consists of four series named after chess pieces: "Pawn", "Knight", "Bishop" and "Queen". The last of these can be unlocked by winning the others on "Expert" mode. Each series has five races and four difficulty settings; the "Master" setting is unlocked by winning Expert mode in each series, and the player unlocks a new machine after completing it. The player needs to finish in the top three at the end of the last lap in order to continue to the next race. If the player is unable to continue, the player loses a machine and can try the race again. If the player runs out of machines, the game ends and the player has to start the series from the beginning. Championship is another single-player component. It is essentially the same as the "Time Attack" mode, except that the player can only race on one special course: the Synobazz Championship Circuit. This course is not selectable in any other mode. Multiplayer Maximum Velocity can be played in two multiplayer modes using the Game Boy Advance link cable, with one cartridge or one cartridge per player. Two to four players can play in both modes.
In single cart, only one player needs to have a cartridge. The other players boot off the link cable network from the player with the cart, using the GBA's netboot capability. All players drive a generic craft, and the game can only be played on one level, Silence. Silence and Fire Field are the only areas to return from previous games. Aptly, Silence in Maximum Velocity has no background music, unlike in most other F-Zero
games. In multi cart, each player needs to have a cartridge to play. This has many advantages over single cart: All players can use any machine in this game that has been unlocked by another player.
Players can select any course in this game. After the race is finished, all of the players' ranking data are mixed and shared ("Mixed ranking" stored in each cart). Development F-Zero: Maximum Velocity is one of the first titles to have been developed by NDcube.
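The Grand Prix continuation rules described above can be summarized in a short simulation. This is an illustrative sketch only, not the game's actual logic; the function name and structure are assumptions, while the five-races-per-series count, the top-three rule, and the machine-loss rule come from the description above.

```python
# Sketch (not the game's real code) of the Grand Prix rules: finishing in
# the top three advances the player; otherwise a spare machine is lost and
# the race is retried; running out of machines ends the game.

def run_series(finish_positions, spare_machines=3):
    """Simulate one series; finish_positions yields the player's
    placement on each successive attempt at the current race."""
    races_completed = 0
    positions = iter(finish_positions)
    while races_completed < 5:          # five races per series
        place = next(positions)
        if place <= 3:
            races_completed += 1        # top three: continue to next race
        elif spare_machines > 0:
            spare_machines -= 1         # lose a machine, retry the race
        else:
            return "game over"          # out of machines: restart series
    return "series complete"

print(run_series([1, 2, 3, 1, 2]))     # clean run -> series complete
print(run_series([5, 5, 5, 5, 1]))     # four failures exhaust machines -> game over
```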
music, books and French culture, which were forbidden by his father as decadent and unmanly. As Fritz's defiance of his father's rules increased, Frederick William would frequently beat or humiliate him (he preferred his younger brother Augustus William). Fritz was beaten for being thrown off a bolting horse and for wearing gloves in cold weather. After the prince attempted to flee to England with his tutor, Hans Hermann von Katte, the enraged King had Katte beheaded before the eyes of the prince, who was himself court-martialled. The court declared itself not competent in this case. Whether it was the king's intention to have his son executed as well (as Voltaire claims) is not clear. However, the Holy Roman Emperor Charles VI intervened, claiming that a prince could only be tried by the Imperial Diet of the Holy Roman Empire itself. Frederick was imprisoned in the Fortress of Küstrin from 2 September to 19 November 1731 and exiled from court until February 1732, during which time he was rigorously schooled in matters of state. After a measure of reconciliation was achieved, Frederick William had his son married to Princess Elizabeth of Brunswick-Wolfenbüttel, whom Frederick despised, but then grudgingly allowed him to indulge in his musical and literary interests again. He also gave him a stud farm in East Prussia and Rheinsberg Palace. By the time of Frederick William's death in 1740, he and Frederick were on at least reasonable terms with each other. Although the relationship between Frederick William and Frederick was clearly hostile, Frederick himself later wrote that his father "penetrated and understood great objectives, and knew the best interests of his country better than any minister or general." Marriage and family Frederick William married his first cousin Sophia Dorothea of Hanover, George II's younger sister (daughter of his uncle, King George I of Great Britain, and Sophia Dorothea of Celle) on 28 November 1706.
Frederick William was faithful and loving to his wife but they did not have a happy relationship: Sophia Dorothea feared his unpredictable temper and resented him, both for allowing her no influence or independence at court, and for refusing to marry her children to their English cousins. She also abhorred his cruelty towards their son and heir Frederick (with whom she was close), although rather than trying to mend the relationship between father and son she frequently spurred Frederick on in his defiance. They had fourteen children, including: He was the godfather of the Prussian envoy Friedrich Wilhelm von Thulemeyer and of his grand-nephew, Prince Edward Augustus of Great Britain. Ancestry See also Prussian virtues References Further reading Dorwart, Reinhold A. The administrative reforms
could find his duties precisely set out: a minister or councillor failing to attend a committee meeting, for example, would lose six months' pay; if he absented himself a second time, he would be discharged from the royal service. In short, Frederick William I concerned himself with every aspect of his relatively small country, ruling an absolute monarchy with great energy and skill. In 1732, the king invited the Salzburg Protestants to settle in East Prussia, which had been depopulated by plague in 1709. Under the terms of the Peace of Augsburg, the prince-archbishop of Salzburg could require his subjects to practice the Catholic faith, but Protestants had the right to emigrate to a Protestant state. Prussian commissioners accompanied 20,000 Protestants to their new homes on the other side of Germany. Frederick William I personally welcomed the first group of migrants and sang Protestant hymns with them. Frederick William intervened briefly in the Great Northern War, allied with Peter the Great of Russia, in order to gain a small portion of Swedish Pomerania; this gave Prussia new ports on the Baltic Sea coast. More significantly, aided by his close friend Prince Leopold of Anhalt-Dessau, the "Soldier-King" made considerable reforms to the Prussian army's training, tactics and conscription program—introducing the canton system, and greatly increasing the Prussian infantry's rate of fire through the introduction of the iron ramrod. Frederick William's reforms left his son Frederick with the most formidable army in Europe, which Frederick used to increase Prussia's power. The observation that "the pen is mightier than the sword" has sometimes been attributed to him. (See as well: "Prussian virtues".) Although a highly effective ruler, Frederick William had a perpetually short temper which sometimes drove him to physically attack servants (or even his own children) with a cane at the slightest perceived provocation. 
His violent, harsh nature was further exacerbated by his inherited porphyria, which gave him gout, obesity and frequent crippling stomach pains. He also had a notable contempt for France, and would sometimes fly into a rage at the mere mention of that country, although this did not stop him from encouraging the immigration of French Huguenot refugees to Prussia. Burial and reburials Frederick William died in 1740 at age 51 and was interred at the Garrison Church in Potsdam. During World War II, in order to protect them from advancing Allied forces, Hitler ordered the king's coffin, as well as those of Frederick the Great and Paul von Hindenburg, into hiding, first to Berlin and later to a salt mine outside of Bernterode. The coffins were later discovered by occupying American forces, who re-interred the bodies in St. Elisabeth's Church in Marburg in 1946. In 1953 the coffin was moved to Burg Hohenzollern, where it remained until 1991, when it was finally laid to rest on the steps of the altar in the Kaiser Friedrich Mausoleum in the Church of Peace on the palace grounds of Sanssouci. The original black marble sarcophagus collapsed at Burg Hohenzollern—the current one is a copper copy. Relationship with Frederick II His
refer to very fine-grained or aphanitic, light-colored volcanic rocks which might be later reclassified after a more detailed microscopic or chemical analysis. In some cases, felsic volcanic rocks may contain phenocrysts of mafic minerals, usually hornblende, pyroxene or a feldspar mineral, and may need to be named after their phenocryst mineral, such as 'hornblende-bearing felsite'. The chemical name of a felsic rock is given according to the TAS classification of Le Maitre (1975). However, this only applies to volcanic rocks. If the rock is analyzed and found to be felsic but is metamorphic and has no definite volcanic protolith, it may be sufficient to simply call it a 'felsic schist'. There are examples known of highly sheared granites which can be mistaken for rhyolites. For phaneritic felsic rocks, the QAPF diagram should be used, and a name given according to the granite nomenclature. Often the species of mafic minerals is included in the name, for instance, hornblende-bearing granite, pyroxene tonalite or augite megacrystic monzonite,
is higher in viscosity than mafic magma/lava. Felsic rocks are usually light in color and have specific gravities less than 3. The most common felsic rock is granite. Common felsic minerals include quartz, muscovite, orthoclase, and the sodium-rich plagioclase feldspars (albite-rich). Terminology In modern usage, the term acid rock, although sometimes used as a synonym, normally now refers specifically to a high-silica-content (greater than 63% SiO2 by weight) volcanic rock, such as rhyolite. Older, broader usage is now considered archaic. That usage, with the contrasting term "basic rock" (MgO, FeO, mafic), was based on an ancient concept, dating from the 19th century, that "silicic acid" (H4SiO4 or Si(OH)4) was the chief form of silicon occurring in siliceous rocks. Although this intuition makes sense from an acid-base perspective in aquatic chemistry, considering water-rock interactions and silica dissolution, siliceous rocks are not formed by this protonated monomeric species but by a three-dimensional network of SiO4 tetrahedra connected to each other. Once released in water and hydrolyzed, these silica entities can indeed form silicic acid in aqueous solution. The term "felsic" combines the words "feldspar" and "silica". The similarity of the resulting term felsic to the German felsig, "rocky" (from Fels, "rock"), is purely accidental. "Feldspar" is itself a borrowing of the German Feldspat, and is therefore linked to German Feld, meaning "field". Classification of felsic rocks In order for a rock to be classified as felsic, it
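The silica cut-off quoted above lends itself to a small classification helper. In this minimal Python sketch, only the 63% threshold comes from this text; the 52% and 45% boundaries are the conventional intermediate/mafic/ultramafic cut-offs, assumed here for illustration.

```python
# Sketch of the chemical screening described above: volcanic rocks with more
# than 63% SiO2 by weight are "acid" (felsic). The 52% and 45% thresholds are
# conventional values assumed for illustration, not taken from this article.

def classify_by_silica(sio2_wt_percent: float) -> str:
    """Return a broad chemical class from SiO2 weight percent."""
    if sio2_wt_percent > 63.0:
        return "felsic (acid)"
    elif sio2_wt_percent > 52.0:
        return "intermediate"
    elif sio2_wt_percent > 45.0:
        return "mafic (basic)"
    else:
        return "ultramafic"

print(classify_by_silica(72.0))  # typical rhyolite/granite -> felsic (acid)
print(classify_by_silica(49.0))  # typical basalt -> mafic (basic)
```

Note that, as the text says, this chemical naming applies only to volcanic rocks; phaneritic rocks are named from the QAPF diagram instead.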
Groningen and, in Germany, East Frisia and North Frisia (which was a part of Denmark until 1864). The Frisian languages are spoken by more than 500,000 people; West Frisian is officially recognised in the Netherlands (in Friesland), and North Frisian and Saterland Frisian are recognised as regional languages in Germany. History The ancient Frisii enter recorded history in the Roman account of Drusus's 12 BC war against the Rhine Germans and the Chauci. They occasionally appear in the accounts of Roman wars against the Germanic tribes of the region, up to and including the Revolt of the Batavi around 70 AD. Frisian mercenaries were hired to assist the Roman invasion of Britain in the capacity of cavalry. They are not mentioned again until 296, when they were deported into Roman territory as laeti (i.e., Roman-era serfs; see Binchester Roman Fort and Cuneus Frisionum). The discovery of a type of earthenware unique to fourth century Frisia, called terp Tritzum, shows that an unknown number of them were resettled in Flanders and Kent, probably as laeti under Roman coercion. From the third through the fifth centuries Frisia suffered marine transgressions that made most of the land uninhabitable, aggravated by a change to a cooler and wetter climate. Whatever population may have remained dropped dramatically, and the coastal lands remained largely unpopulated for the next two centuries. When conditions improved, Frisia received an influx of new settlers, mostly Angles and Saxons. These people would eventually be referred to as 'Frisians', though they were not necessarily descended from the ancient Frisii. It is these 'new Frisians' who are largely the ancestors of the medieval and modern Frisians. By the end of the sixth century, Frisian territory had expanded westward to the North Sea coast and, in the seventh century, southward down to Dorestad. This farthest extent of Frisian territory is sometimes referred to as Frisia Magna. 
Early Frisia was ruled by a High King, with the earliest reference to a 'Frisian King' dated 678. In the early eighth century the Frisian nobles came into increasing conflict with the Franks to their south, resulting in a series of wars in which the Frankish Empire eventually subjugated Frisia in 734. These wars benefited attempts by Anglo-Irish missionaries (which had begun with Saint Boniface) to convert the Frisian populace to Christianity, in which Saint Willibrord largely succeeded. Some time after the death of Charlemagne, the Frisian territories were in theory under the control of the Count of Holland, but in practice the Hollandic counts, starting with Count Arnulf in 993, were unable to assert themselves as the sovereign lords of Frisia. The resulting stalemate gave rise to a period called the 'Frisian freedom', in which feudalism and serfdom (as well as central or judicial administration) did not exist, and in which the Frisian lands owed their allegiance only to the Holy Roman Emperor. During the 13th century, however, the counts of Holland became increasingly powerful and, starting in 1272, sought to reassert themselves as rightful lords of the Frisian lands in a series of wars, which (with a series of lengthy interruptions) ended in 1422 with the Hollandic conquest of Western Frisia and with the establishment of a more powerful noble class in Central and Eastern Frisia. In 1524, Frisia became
part of the Seventeen Provinces and in 1568 joined the Dutch revolt against Philip II, king of Spain, heir of the Burgundian territories; Central Frisia has remained a part of the Netherlands ever since. The eastern periphery of Frisia would become part of various German states (later Germany) and Denmark. An old tradition existed in the region of exploitation of peatlands. Migration to
and literature genre
Afrofuturism, an African-American and African diaspora subculture
Cubo-Futurism, the main school of painting and sculpture practiced by the Russian Futurists
Ego-Futurism, a Russian literary movement of the 1910s
Indigenous Futurism, a movement consisting of art, literature, comics, games
Neo-futurism, a contemporary art and architecture movement
Retrofuturism, a modern art movement
Russian Futurism, a movement of Russian poets and artists
Religion
Futurism (Christianity), an interpretation of the Bible in Christian eschatology
Futurism (Judaism), used in three different contexts: religious, artistic and futures studies
Music
Futurism (music), a movement in music
Albums
Musica Futurista, an album of Futurist music
Futurist (Alec Empire album), 2005
Futurist (Keeno album), 2016
The Futurist (Robert Downey Jr. album), 2004
The Futurist (Shellac album), 1997
Futurism, an album by Danny Tenaglia
Songs
"Futurism", a bonus track from the Muse album Origin of Symmetry
"Futurism", from the Deerhunter album Why Hasn't Everything Already Disappeared?
Other uses
Futurism.com, a science and tech website formerly owned by Singularity University
Futurist (comics), a Marvel Comics character
Retro Futurism, a Korean play
collected and published in The Battle of Tripoli. He then covered the First Balkan War of 1912–13, witnessing the surprise success of Bulgarian troops against the Ottoman Empire in the Siege of Adrianople. In this period he also made a number of visits to London, which he considered 'the Futurist city par excellence', and where a number of exhibitions, lectures and demonstrations of Futurist music were staged. However, although a number of artists, including Wyndham Lewis, were interested in the new movement, only one British convert was made, the young artist C.R.W. Nevinson. Nevertheless, Futurism was an important influence upon Lewis's Vorticist philosophy. About the same time Marinetti worked on a very anti-Roman Catholic and anti-Austrian verse-novel, Le monoplan du Pape (The Pope's Aeroplane, 1912) and edited an anthology of futurist poets. But his attempts to renew the style of poetry did not satisfy him. So much so that, in his foreword to the anthology, he declared a new revolution: it was time to be done with traditional syntax and to use "words in freedom" (parole in libertà). His sound-poem Zang Tumb Tumb, an account of the Battle of Adrianople, exemplifies words in freedom. Recordings can be heard of Marinetti reading some of his sound poems: Battaglia, Peso + Odore (1912); Dune, parole in libertà (1914); La Battaglia di Adrianopoli (1926) (recorded 1935). Wartime Marinetti agitated for Italian involvement in World War I, and once Italy was engaged, promptly volunteered for service. In the fall of 1915 he and several other Futurists who were members of the Lombard Volunteer Cyclists were stationed at Lake Garda, in Trentino province, high in the mountains along the Italo-Austrian border. They endured several weeks of fighting in harsh conditions before the cyclists units, deemed inappropriate for mountain warfare, were disbanded. 
Marinetti spent most of 1916 supporting Italy's war effort with speeches, journalism, and theatrical work, then returned to military service as a regular army officer in 1917. In May of that year he was seriously wounded while serving with an artillery battalion on the Isonzo front; he returned to service after a long recovery, and participated in the decisive Italian victory at Vittorio Veneto in October 1918. Marriage After an extended courtship, in 1923 Marinetti married Benedetta Cappa (1897–1977), a writer and painter and a pupil of Giacomo Balla. Born in Rome, she had joined the Futurists in 1917. They had met in 1918, moved in together in Rome, and chose to marry only to avoid legal complications on a lecture tour of Brazil. They had three daughters: Vittoria, Ala, and Luce. In the mid-1920s Cappa and Marinetti collaborated on a genre of mixed-media assemblages they called tattilismo ("Tactilism"), and she was a strong proponent and practitioner of the aeropittura movement after its inception in 1929. She also produced three experimental novels. Cappa's major public work is likely a series of five murals at the Palermo Post Office (1926–1935) for the Fascist public-works architect Angiolo Mazzoni. Marinetti and Fascism In early 1918 he founded the Partito Politico Futurista or Futurist Political Party, which only a year later merged with Benito Mussolini's Fasci Italiani di Combattimento. Marinetti was one of the first affiliates of the Italian Fascist Party. In 1919 he co-wrote with Alceste De Ambris the Fascist Manifesto, the original manifesto of Italian Fascism. He opposed Fascism's later exaltation of existing institutions, terming them "reactionary," and, after walking out of the 1920 Fascist party congress in disgust, withdrew from politics for three years. However, he remained a notable force in developing the party philosophy throughout the regime's existence.
For example, at the end of the Congress of Fascist Culture that was held in Bologna on 30 March 1925, Giovanni Gentile addressed Sergio Panunzio on the need to define Fascism more purposefully by way of Marinetti's opinion, stating, "Great spiritual movements make recourse to precision when their primitive inspirations—what F. T. Marinetti identified this morning as artistic, that is to say, the creative and truly innovative ideas, from which the movement derived its first and most potent impulse—have lost their force. We today find ourselves at the very beginning of a new life and we experience with joy this obscure need that fills our hearts—this need that is our inspiration, the genius that governs us and carries us with it." As part of his campaign to overturn tradition, Marinetti also attacked traditional Italian food. His Manifesto of Futurist Cooking was published in the Turin Gazzetta del Popolo on 28 December 1930. Arguing that "People think, dress and act in accordance with what they drink and eat", Marinetti proposed wide-ranging changes to diet. He condemned pasta, blaming it for lassitude, pessimism and lack of virility, and promoted the eating of Italian-grown rice. In this, as in other ways, his proposed Futurist cooking was nationalistic, rejecting foreign foods and food names. It was also militaristic, seeking to stimulate men to be fighters. Marinetti also sought to increase creativity. His attraction to whatever was new made scientific discoveries appealing to him, but his views on diet were not scientifically based. He was fascinated with the idea of processed food, predicting that someday pills would replace food as a source of energy, and calling for the creation of "plastic complexes" to replace natural foods. Food, in turn, would become a matter of artistic expression. Many of the meals Marinetti described and ate resemble performance art, such as the "Tactile Dinner", recreated in 2014 for an exhibit at the Guggenheim Museum. 
Participants wore pajamas decorated with sponge, sandpaper, and aluminum, and ate salads without using cutlery. During the Fascist regime Marinetti sought to make Futurism the official state art of Italy but failed to do so. Mussolini was personally uninterested in art and chose to give patronage to numerous styles to keep artists loyal to the regime. Opening the exhibition of art by the Novecento Italiano group in 1923, he said: "I declare that it is far from my idea to encourage anything like a state art. Art belongs to the domain of the individual. The state has only one duty: not to undermine art, to provide humane conditions for artists, to encourage them from the artistic and national point of view." Mussolini's mistress, Margherita Sarfatti, successfully promoted the rival Novecento Group, and even persuaded Marinetti to be part of its board. In Fascist Italy, modern art was tolerated and even approved by the Fascist hierarchy. Towards the end of the 1930s, some Fascist ideologues (for example, the ex-Futurist Ardengo Soffici) wished to import the concept of "degenerate art" from Germany to Italy and condemned modernism, although their demands were ignored by the regime. In 1938, hearing that Adolf Hitler wanted to include Futurism in a traveling exhibition of degenerate art, Marinetti persuaded Mussolini to refuse to let it enter Italy. On 17 November 1938, Italy passed The Racial Laws, discriminating against Italian Jews,
much as the discrimination pronounced in the Nuremberg Laws. The antisemitic trend in Italy resulted in attacks against modern art, judged too foreign, too radical and anti-nationalist. In the 11 January 1939 issue of the Futurist journal Artecrazia, Marinetti expressed his condemnation of such attacks on modern art, noting that Futurism was both Italian and nationalist, not foreign, and stating that there were no Jews in Futurism. Furthermore, he claimed Jews were not active in the development of modern art. Regardless, the Italian state shut down Artecrazia. Marinetti made numerous attempts to ingratiate himself with the regime, becoming less radical and avant-garde with each attempt. He relocated from Milan to Rome. He became an academician despite his condemnation of academies, saying, "It is important that Futurism be represented in the Academy." He was an atheist, but by the mid-1930s he had come to accept the influence of the Catholic Church on Italian society. In the Gazzetta del Popolo of 21 June 1931, Marinetti proclaimed that "Only Futurist artists...are able to express clearly...the simultaneous dogmas of the Catholic faith, such as the Holy Trinity, the Immaculate Conception and Christ's Calvary." In his last works, written just before his death in 1944, L'aeropoema di Gesù ("The Aeropoem of Jesus") and Quarto d'ora di poesia per la X Mas ("A Fifteen Minutes' Poem for the X Mas"), Marinetti sought to reconcile his newfound love for God and his passion for the action that accompanied him throughout his life.
There were other contradictions in his character: despite his nationalism, he was cosmopolitan, educated in Egypt and France, writing his first poems in French, publishing the Futurist Manifesto in a French newspaper and traveling to promote his ideas. Marinetti volunteered for active service in the Second Italo-Abyssinian War and the Second World War, serving on the Eastern Front for a few weeks in the summer and autumn of 1942 at the age of 65. He died of cardiac arrest in Bellagio on 2 December 1944 while working on a collection of poems praising the wartime achievements of the Decima
example, involved causing a fit of madness. The advantage of magnetism involved accelerating such crises without danger. Procedure Mesmer treated patients both individually and in groups. With individuals he would sit in front of his patient with his knees touching the patient's knees, pressing the patient's thumbs in his hands, looking fixedly into the patient's eyes. Mesmer made "passes", moving his hands from patients' shoulders down along their arms. He then pressed his fingers on the patient's hypochondrium region (the area below the diaphragm), sometimes holding his hands there for hours. Many patients felt peculiar sensations or had convulsions that were regarded as crises and supposed to bring about the cure. Mesmer would often conclude his treatments by playing some music on a glass armonica. By 1780 Mesmer had more patients than he could treat individually and he established a collective treatment known as the "baquet". An English doctor who observed Mesmer described the treatment as follows: In the middle of the room is placed a vessel of about a foot and a half high which is called here a "baquet". It is so large that twenty people can easily sit round it; near the edge of the lid which covers it, there are holes pierced corresponding to the number of persons who are to surround it; into these holes are introduced iron rods, bent at right angles outwards, and of different heights, so as to answer to the part of the body to which they are to be applied. Besides these rods, there is a rope which communicates between the baquet and one of the patients, and from him is carried to another, and so on the whole round. The most sensible effects are produced on the approach of Mesmer, who is said to convey the fluid by certain motions of his hands or eyes, without touching the person. I have talked with several who have witnessed these effects, who have convulsions occasioned and removed by a movement of the hand...
Investigation In 1784, without Mesmer requesting it, King Louis XVI appointed four members of the Faculty of Medicine as commissioners to investigate animal magnetism as practiced by d'Eslon. At the request of these commissioners, the king appointed five additional commissioners from the Royal Academy of Sciences. These included the chemist Antoine Lavoisier, the doctor Joseph-Ignace Guillotin, the astronomer Jean Sylvain Bailly, and the American ambassador Benjamin Franklin. The commission conducted a series of experiments aimed not at determining whether Mesmer's treatment worked, but at whether he had discovered a new physical fluid. It concluded that there was no evidence for such a fluid; whatever benefit the treatment produced was attributed to "imagination". One of the commissioners, the botanist Antoine Laurent de Jussieu, took exception to the official reports. He wrote a dissenting opinion that declared Mesmer's theory credible and worthy of further investigation. The commission did not examine Mesmer himself, but investigated the practice of d'Eslon. Using blind trials in its investigation, the commission learned that mesmerism seemed to work only when the subject was aware of it. The commission attributed this to "imagination", but its findings are considered the first observation of the placebo effect. Mesmer was driven into exile soon after the investigations on animal magnetism, although his influential student, Armand-Marie-Jacques de Chastenet, Marquis de Puységur (1751–1825), continued to have many followers until his death. Mesmer continued to practice in Frauenfeld, Switzerland, for a number of years and died in 1815 in Meersburg. Abbé Faria, an Indo-Portuguese monk in Paris and a contemporary of Mesmer, claimed that "nothing comes from the magnetizer; everything comes from the subject and takes place in his imagination, i.e. autosuggestion generated from within the mind."
Works
De planetarum influxu in corpus humanum (Über den Einfluss der Gestirne auf den menschlichen Körper) [The Influence of the Planets on the Human Body] (1766).
Mémoire sur la découverte du magnétisme animal, Didot, Geneva and Paris (1779). View at Gallica, from the Bibliothèque nationale de France (BnF).
Sendschreiben an einen auswärtigen Arzt über die Magnetkur [Circular letter to an outside physician about the magnetic cure] (1775).
Théorie du monde et des êtres organisés suivant les principes de M…., Paris (1784). View at Gallica, BnF.
Mémoire de F. A. Mesmer,...sur ses découvertes (1798–1799). View at Gallica, BnF.
Mesmerismus oder System der Wechselwirkungen. Theorie und Anwendung des thierischen Magnetismus als die allgemeine Heilkunde zur Erhaltung des Menschen [Mesmerism, or the System of Interactions: Theory and Application of Animal Magnetism as the General Art of Healing for the Preservation of Man]. Edited by . Nikolai, Berlin (1814). View at Munich Digitization Center, from the Bavarian State Library.

See also
Animal magnetism
Royal Commission on Animal Magnetism
Foix–Alajouanine syndrome, also called subacute ascending necrotizing myelitis, is a disease caused by an arteriovenous malformation of the spinal cord. In particular, most cases involve dural arteriovenous malformations that present in the lower thoracic or lumbar spinal cord. Patients can present with symptoms indicating spinal cord involvement, such as paralysis of the arms and legs, numbness, loss of sensation and sphincter dysfunction, and pathological examination reveals disseminated nerve cell death in the spinal cord. The condition is named after Charles Foix and Théophile Alajouanine, who first described it in 1926. Diagnosis Clinically, the patient may present with neurological symptoms such as numbness, weakness, loss of reflexes, or even sudden or progressive paralysis. The affected portion of the body will correlate with where the lesion lies within the spinal cord. The disease typically has an insidious onset, but symptoms may manifest suddenly. A thorough physical exam may lead a physician toward targeted imaging, with MRI being the most appropriate modality for initial diagnosis. Spinal MRA is a superior technique for visualizing the extent of the arteriovenous malformation within the cord and may be especially useful if surgical treatment is attempted. Treatment Surgical treatment may be attempted with endovascular
the more fundamental property of the electron that it has quantum mechanical spin. Due to its quantum nature, the spin of the electron can be in one of only two states, with the magnetic field pointing either "up" or "down" (for any choice of up and down). The spin of the electrons in atoms is the main source of ferromagnetism, although there is also a contribution from the orbital angular momentum of the electron about the nucleus. When these magnetic dipoles in a piece of matter are aligned (pointing in the same direction), their individually tiny magnetic fields add together to create a much larger macroscopic field. However, materials made of atoms with filled electron shells have a total dipole moment of zero: because the electrons all exist in pairs with opposite spin, every electron's magnetic moment is cancelled by the opposite moment of the second electron in the pair. Only atoms with partially filled shells (i.e., unpaired spins) can have a net magnetic moment, so ferromagnetism occurs only in materials with partially filled shells. Because of Hund's rules, the first few electrons in a shell tend to have the same spin, thereby increasing the total dipole moment. These unpaired dipoles (often called simply "spins", even though they also generally include orbital angular momentum) tend to align in parallel to an external magnetic field, an effect called paramagnetism. Ferromagnetism involves an additional phenomenon, however: in a few substances the dipoles tend to align spontaneously, giving rise to a spontaneous magnetization, even when there is no applied field. Exchange interaction When two nearby atoms have unpaired electrons, whether the electron spins are parallel or antiparallel affects whether the electrons can share the same orbit as a result of the quantum mechanical effect called the exchange interaction. This in turn affects the electron location and the Coulomb (electrostatic) interaction and thus the energy difference between these states.
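The shell-filling argument above can be made concrete with a toy calculation (a simplified sketch that applies Hund's first rule to a single subshell and ignores orbital contributions): electrons first occupy all orbitals singly with parallel spins, and only then pair up, so the number of unpaired spins follows directly from the electron count.

```python
def unpaired_electrons(n_electrons, n_orbitals):
    """Unpaired electrons in one subshell under Hund's first rule.

    Electrons singly occupy all n_orbitals with parallel spins before
    any pairing begins, so the unpaired count rises to n_orbitals and
    then falls back to zero as the shell fills.
    """
    capacity = 2 * n_orbitals
    if not 0 <= n_electrons <= capacity:
        raise ValueError("electron count exceeds subshell capacity")
    return min(n_electrons, capacity - n_electrons)

# The 3d subshell has 5 orbitals; the spin-only moment in Bohr magnetons
# is roughly the number of unpaired spins (taking g ~ 2, no orbital part).
for name, n in [("Fe 3d6", 6), ("Ni 3d8", 8), ("Zn 3d10", 10)]:
    print(name, "->", unpaired_electrons(n, 5), "unpaired spins")
```

A filled shell (Zn 3d10) gives zero unpaired spins and no net moment, matching the cancellation argument in the text; partially filled shells (Fe, Ni) retain a net moment.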
The exchange interaction is related to the Pauli exclusion principle, which says that two electrons with the same spin cannot also be in the same spatial state (orbital). This is a consequence of the spin-statistics theorem and that electrons are fermions. Therefore, under certain conditions, when the orbitals of the unpaired outer valence electrons from adjacent atoms overlap, the distributions of their electric charge in space are farther apart when the electrons have parallel spins than when they have opposite spins. This reduces the electrostatic energy of the electrons when their spins are parallel compared to their energy when the spins are anti-parallel, so the parallel-spin state is more stable. This difference in energy is called the exchange energy. In simple terms, the outer electrons of adjacent atoms, which repel each other, can move further apart by aligning their spins in parallel, so the spins of these electrons tend to line up. This energy difference can be orders of magnitude larger than the energy differences associated with the magnetic dipole-dipole interaction due to dipole orientation, which tends to align the dipoles antiparallel. In certain doped semiconductor oxides RKKY interactions have been shown to bring about periodic longer-range magnetic interactions, a phenomenon of significance in the study of spintronic materials. The materials in which the exchange interaction is much stronger than the competing dipole-dipole interaction are frequently called magnetic materials. For instance, in iron (Fe) the exchange force is about 1000 times stronger than the dipole interaction. Therefore, below the Curie temperature virtually all of the dipoles in a ferromagnetic material will be aligned. In addition to ferromagnetism, the exchange interaction is also responsible for the other types of spontaneous ordering of atomic magnetic moments occurring in magnetic solids, antiferromagnetism and ferrimagnetism. 
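The competition between exchange energy and thermal agitation can be illustrated numerically. The following is a minimal sketch of a 2D Ising model (a hypothetical toy system, not a quantitative model of iron) with exchange coupling J = 1, where flipping a spin costs the exchange energy of its four neighbors: well below the critical temperature an ordered lattice keeps virtually all dipoles aligned, as the text states, while well above it the net magnetization is destroyed.

```python
import numpy as np

def metropolis_ising(T, steps=200, L=16, seed=0):
    """Metropolis sweeps of a 2D Ising model with exchange coupling J = 1.

    Each spin is +1/-1; flipping a spin costs exchange energy
    2 * J * s * (sum of its four neighbors). Starts from a fully
    magnetized lattice and returns the final magnetization per spin.
    """
    rng = np.random.default_rng(seed)
    spins = np.ones((L, L), dtype=int)          # fully aligned start
    for _ in range(steps):
        for _ in range(L * L):                  # one sweep = L*L attempted flips
            i, j = rng.integers(0, L, size=2)
            nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j] +
                  spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
            dE = 2.0 * spins[i, j] * nb         # energy cost of flipping (J = 1)
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                spins[i, j] = -spins[i, j]
    return spins.mean()

m_cold = metropolis_ising(T=1.0)   # well below the 2D critical point T_c ~ 2.27 J
m_hot = metropolis_ising(T=5.0)    # well above T_c
print(m_cold, m_hot)
```

The cold run stays almost fully magnetized while the hot run disorders, mirroring the statement that below the Curie temperature virtually all dipoles are aligned.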
There are different exchange interaction mechanisms which create the magnetism in different ferromagnetic, ferrimagnetic, and antiferromagnetic substances. These mechanisms include direct exchange, RKKY exchange, double exchange, and superexchange. Magnetic anisotropy Although the exchange interaction keeps spins aligned, it does not align them in a particular direction. Without magnetic anisotropy, the spins in a magnet randomly change direction in response to thermal fluctuations and the magnet is superparamagnetic. There are several kinds of magnetic anisotropy, the most common of which is magnetocrystalline anisotropy. This is a dependence of the energy on the direction of magnetization relative to the crystallographic lattice. Another common source of anisotropy, inverse magnetostriction, is induced by internal strains. Single-domain magnets also can have a shape anisotropy due to the magnetostatic effects of the particle shape. As the temperature of a magnet increases, the anisotropy tends to decrease, and there is often a blocking temperature at which a transition to superparamagnetism occurs. Magnetic domains The above would seem to suggest that every piece of ferromagnetic material should have a strong magnetic field, since all the spins are aligned, yet iron and other ferromagnets are often found in an "unmagnetized" state. The reason for this is that a bulk piece of ferromagnetic material is divided into tiny regions called magnetic domains (also known as Weiss domains). Within each domain, the spins are aligned, but (if the bulk material is in its lowest energy configuration; i.e. unmagnetized), the spins of separate domains point in different directions and their magnetic fields cancel out, so the object has no net large scale magnetic field. 
Ferromagnetic materials spontaneously divide into magnetic domains because the exchange interaction is a short-range force, so over long distances of many atoms the tendency of the magnetic dipoles to reduce their energy by orienting in opposite directions wins out. If all the dipoles in a piece of ferromagnetic material are aligned parallel, it creates a large magnetic field extending into the space around it. This contains a lot of magnetostatic energy. The material can reduce this energy by splitting into many domains pointing in different directions, so the magnetic field is confined to small local fields in the material, reducing the volume of the field. The domains are separated by thin domain walls a number of molecules thick, in which the direction of magnetization of the dipoles rotates smoothly from one domain's direction to the other. Magnetized materials Thus, a piece of iron in its lowest energy state ("unmagnetized") generally has little or no net magnetic field. However, the magnetic domains in a material are not fixed in place; they are simply regions where the spins of the electrons have aligned spontaneously due to their magnetic fields, and thus can be altered by an external magnetic field.
If a strong enough external magnetic field is applied to the material, the domain walls will move by the process of the spins of the electrons in atoms near the wall in one domain turning under the influence of the external field to face in the same direction as the electrons in the other domain, thus reorienting the domains so more of the dipoles are aligned with the external field. The domains will remain aligned when the external field is removed, creating a magnetic field of their own extending into the space around the material, thus creating a "permanent" magnet.
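The persistence of magnetization after the field is removed can be caricatured with a mean-field model (an illustrative sketch only: it reproduces remanence and the two branches of a hysteresis loop, but not the domain-wall mechanism itself). Below the Curie temperature the self-consistency equation m = tanh((q*m + h)/T) has two stable solutions, and a slowly swept field keeps the system on its current branch until that branch disappears.

```python
import numpy as np

def mean_field_m(h, T, m0, q=4.0, iters=500):
    """Self-consistent mean-field magnetization m = tanh((q*m + h)/T).

    q is the coordination number, h the external field, T the temperature
    in units of the exchange coupling. Iterating from the previous
    solution m0 follows one branch of the hysteresis loop.
    """
    m = m0
    for _ in range(iters):
        m = np.tanh((q * m + h) / T)
    return m

T = 2.0                            # below the mean-field Curie point T_c = q = 4
fields = np.linspace(3.0, -3.0, 121)

# Sweep the field down from +3 to -3, then back up, each time starting
# the iteration from the previous magnetization (history dependence).
m, down, up = 1.0, [], []
for h in fields:
    m = mean_field_m(h, T, m)
    down.append(m)
for h in fields[::-1]:
    m = mean_field_m(h, T, m)
    up.append(m)

i0 = int(np.argmin(np.abs(fields)))      # index where h ~ 0
remanence_down = down[i0]                # still magnetized "up" at zero field
remanence_up = up[len(fields) - 1 - i0]  # still magnetized "down" at zero field
print(remanence_down, remanence_up)
```

At zero field the two sweep directions give large magnetizations of opposite sign: the system "remembers" the field it last saw, which is the essence of the hysteresis curve discussed here.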
The domains do not go back to their original minimum energy configuration when the field is removed because the domain walls tend to become 'pinned' or 'snagged' on defects in the crystal lattice, preserving their parallel orientation. This is shown by the Barkhausen effect: as the magnetizing field is changed, the magnetization changes in thousands of tiny discontinuous jumps as the domain walls suddenly "snap" past defects. This magnetization as a function of the external field is described by a hysteresis curve. Although this state of aligned domains found in a piece of magnetized ferromagnetic material is not a minimal-energy configuration, it is metastable, and can persist for long periods, as shown by samples of magnetite from the sea floor which have maintained their magnetization for millions of years. Heating
in the basement of the station). Nevertheless, soon the evidence gathered on site of the explosion made it clear that the attack constituted an act of terrorism. L'Unità, the newspaper of the Communist Party on 3 August already attributed responsibility for the attack to neo-fascists. Later, in a special session to the Senate, Cossiga supported the theory that neofascists were behind the attack, "unlike leftist terrorism, which strikes at the heart of the state through its representatives, black terrorism prefers the massacre because it promotes panic and impulsive reactions." Later, according to media reports in 2004, taken up again in 2007, Cossiga, in a letter addressed to Enzo Fragala, leader of the National Alliance section in the Mitrokhin Committee, suggested Palestinian involvement of George Habash's Popular Front for the Liberation of Palestine and the Separat group of Ilich Ramirez Sanchez, known as "Carlos the Jackal". In addition, in 2008 Cossiga gave an interview to BBC in which it reaffirmed his belief that the massacre would not be attributable to black terrorism, but to an "incident" of Palestinian resistance groups operating in Italy. He declared also being convinced of the innocence of Francesca Mambro and Giuseppe Valerio Fioravanti, the two neo-fascist terrorists accused of the massacre. The PFLP has always denied responsibility. Resignation In October 1980, Cossiga resigned as Prime Minister after the rejection of the annual budget bill by the Italian Parliament. Following the 1983 general election, Cossiga became a member of the Italian Senate; on 12 July, he was elected President of the Senate. President of Italy In the 1985 presidential election, Cossiga was elected as President of Italy with 752 votes out of 977. His candidacy was endorsed by the Christian Democracy, but supported also by communists, socialists, social democrats, liberals and republicans. 
This was the first time an Italian presidential candidate had won the election on the first ballot, where a two-thirds majority is necessary. He took office on 29 June 1985 on an interim basis after the resignation of Outgoing President Sandro Pertini, but was not sworn in until a few days later, on 3 July. The Cossiga presidency was essentially divided into two phases related to the attitudes of the head of state. In the first five years, Cossiga played its role in a traditional way, caring for the role of the republican institutions under the Constitution, which makes the President of the Republic a kind of arbitrator in relations between the powers of the state. "Pickaxe-wielder" president It was in his last two years as president that Cossiga began to express some unusual opinions regarding the Italian political system. He opined that the Italian parties, especially the Christian Democrats and the Communists had to take into account the deep changes brought about by the fall of the Berlin Wall and the end of the Cold War. According to him, DC and PCI would therefore have been seriously affected by this change, but Cossiga believed that political parties and the same institutions refused to recognize it. Thus, a period of conflict and political controversy began, often provocative and deliberately excessive, and with very strong media exposure. These statements, soon dubbed "esternazioni", or "mattock blows" (picconate), were considered by many to be inappropriate for a President, and often beyond his constitutional powers; also, his mental health was doubted and Cossiga had to declare "I am the fake madman who speaks the truth." Cossiga suffered from bipolar disorder and depression in the last years of his life. 
Among the statements of the President there were also allegations of excessive politicization of the judiciary system, and the stigmatization of the fact that young magistrates, who just came into service, were immediately destined for the Sicilian prosecutor to carry out mafia proceedings. For his changed attitude, Cossiga received various criticisms by almost every party, with the exception of the Italian Social Movement, which stood beside him in defense of the "picconate". He will, amongst other things, be considered one of the first "cleansers" of MSI, who recognized it as a constitutional and democratic force. Revelation of Gladio and resignation Tension developed between Cossiga and Prime Minister Giulio Andreotti. This tension emerged when Andreotti revealed the existence of Gladio, a stay-behind organization with the official aim of countering a possible Soviet invasion through sabotage and guerrilla warfare behind enemy lines. Cossiga acknowledged his involvement in the establishment of the organization. The Democratic Party of the Left (successor to the Communist Party) started the procedure of impeachment (Presidents of Italy can be impeached only for high treason against the state or for an attempt to overthrow the Constitution). Although he threatened to prevent the impeachment procedure by dissolving Parliament, the impeachment request was ultimately dismissed. Cossiga resigned two months before the end of his term, on 25 April 1992. In his last speech as president he stated "To young people I want to say to love the fatherland, to honor the nation, to serve the Republic, to believe in freedom and to believe in our country". After the presidency According to the Italian Constitution, after his resignation from the office of President, Cossiga became Lifetime Senator, joining his predecessors in the upper house of Parliament, with whom he also shared the title of President Emeritus of the Italian Republic. 
On 12 January 1997, Cossiga survived unscathed a railway accident (:it:Incidente ferroviario di Piacenza), while traveling on a high-speed train from Milan to Rome that derailed near Piacenza. In February 1998, Cossiga created the Democratic Union for the Republic
a downtown bar, L'angelo azzurro (The Blue Angel), frequented by young right-wing activists. They threw two Molotov cocktails, and Roberto Crescenzio, a totally apolitical student, died of burns. The perpetrators of the murder were never identified. Lotta Continua leader Silvio Viale called it a "tragic accident". Another innocent victim of the riots of that year was Giorgiana Masi, who was killed in Rome by a gunshot during an event organized by the Radical Party to celebrate the third anniversary of the victory in the referendum on divorce. As the perpetrators of the murder remained unknown, the movement attributed the responsibility of the crime to police officers in plain clothes, which were immortalized at that time dressed in clothing of the style of young people of the movement. Kidnapping of Aldo Moro Cossiga was in office at the time of the kidnapping and murder of the Christian Democratic leader Aldo Moro by the Marxist-Leninist extreme-left terrorist group Red Brigades. On the morning of 16 March 1978, the day on which the new cabinet led by Giulio Andreotti was supposed to have undergone a confidence vote in the Italian Parliament, the car of Moro, former prime minister and then president of DC, was assaulted by a group of Red Brigades terrorists in Via Fani in Rome. Firing automatic weapons, the terrorists killed Moro's bodyguards, (two Carabinieri in Moro's car and three policemen in the following car) and kidnapped him. Cossiga formed immediately two "crisis committees". The first one was a technical-operational-political committee, chaired by Cossiga himself and, in his absence, by undersecretary Nicola Lettieri. 
Other members included the supreme commanders of the Italian police forces, of the Carabinieri and the Guardia di Finanza, the recently named directors of SISMI and SISDE (respectively, Italy's military and civil intelligence services), the national secretary of CESIS (a secret information agency), the director of UCIGOS and the police prefect of Rome. The second one was an information committee, including members of CESIS, SISDE, SISMI and SIOS, another military intelligence office. A third, unofficial committee was created which never met officially, called the comitato di esperti ("committee of experts"). Its existence was not disclosed until 1981, by Cossiga himself, during his interrogation by the Italian Parliament's commission on the Moro affair. However, he did not reveal the decisions and activities of the committee. This committee included Steve Pieczenik, a psychologist of the anti-terrorism section of the US State Department, and notable Italian criminologists. Pieczenik later declared that there had been numerous leaks about the committee's discussions, for which he blamed Cossiga. On 9 May 1978, after 55 days of imprisonment during which Moro was subjected to a political trial by the so-called "people's court" set up by the Brigate Rosse and the Italian government was asked for an exchange of prisoners, Moro's body was found in the trunk of a Renault 4 in Via Caetani. Despite the common interpretation, the car's location in Via Caetani was not halfway between the locations of the national offices of DC and of the Italian Communist Party (PCI) in Rome. Two days later, Cossiga resigned as Minister of the Interior. According to Italian journalist Enrico Deaglio, Cossiga, to justify his lack of action, "accused the leaders of CGIL and of the Communist Party of knowing where Moro was detained". Cossiga was also accused by Moro himself, who wrote in the letters he sent during his detention that his blood would "fall over him".
Prime Minister of Italy One year after Moro's death and his own subsequent resignation as Interior Minister, Cossiga was appointed Prime Minister of Italy. He led a coalition government composed of Christian Democrats, Socialists, Democratic Socialists, Republicans and Liberals. Bologna massacre Cossiga was head of the government during the Bologna massacre, a terrorist bombing of the Bologna Central Station on the morning of 2 August 1980, which killed 85 people and wounded more than 200. The attack was attributed to the neo-fascist terrorist organization Nuclei Armati Rivoluzionari (Armed Revolutionary Nuclei), which always denied any involvement; other theories have been proposed, especially in connection with the strategy of tension. Cossiga at first assumed the explosion to have been caused by an accident (the explosion of an old boiler located in the basement of the station). However, the evidence gathered at the site of the explosion soon made it clear that the attack was an act of terrorism. L'Unità, the newspaper of the Communist Party, attributed responsibility for the attack to neo-fascists as early as 3 August. Later, in a special session of the Senate, Cossiga supported the theory that neo-fascists were behind the attack: "unlike leftist terrorism, which strikes at the heart of the state through its representatives, black terrorism prefers the massacre because it promotes panic and impulsive reactions." According to media reports in 2004, taken up again in 2007, Cossiga, in a letter addressed to Enzo Fragalà, leader of the National Alliance section in the Mitrokhin Committee, suggested the involvement of George Habash's Popular Front for the Liberation of Palestine and the Separat group of Ilich Ramírez Sánchez, known as "Carlos the Jackal".
In addition, in 2008 Cossiga gave an interview to the BBC in which he reaffirmed his belief that the massacre was attributable not to black terrorism but to an "incident" involving Palestinian resistance groups operating in Italy. He also declared himself convinced of the innocence of Francesca Mambro and Giuseppe Valerio Fioravanti, the two neo-fascist terrorists accused of the massacre. The PFLP has always denied responsibility. Resignation In October 1980, Cossiga resigned as Prime Minister after the rejection of the annual budget bill by the Italian Parliament. Following the 1983 general election, Cossiga became a member of the Italian Senate; on 12 July, he was elected President of the Senate. President of Italy In the 1985 presidential election, Cossiga was elected President of Italy with 752 votes out of 977. His candidacy was endorsed by Christian Democracy, but also supported by communists, socialists, social democrats, liberals and republicans. This was the first time an Italian presidential candidate had won the election on the first ballot, where a two-thirds majority is necessary. He took office on 29 June 1985 on an interim basis after the resignation of outgoing President Sandro Pertini, but was not sworn in until a few days later, on 3 July. The Cossiga presidency was essentially divided into two phases, reflecting the changing attitudes of the head of state. In his first five years, Cossiga interpreted his role in a traditional way, acting as guardian of the republican institutions under the Constitution, which makes the President of the Republic a kind of arbiter in relations between the powers of the state. "Pickaxe-wielder" president It was in his last two years as president that Cossiga began to express some unusual opinions regarding the Italian political system.
He opined that the Italian parties, especially the Christian Democrats and the Communists, had to take into account the deep changes brought about by the fall of the Berlin Wall and the end of the Cold War. In his view, DC and PCI would be seriously affected by this change, but Cossiga believed that the parties and the institutions themselves refused to recognize it. Thus began a period of conflict and political controversy, often provocative and deliberately excessive, and with very strong media exposure. These statements, soon dubbed "esternazioni", or "pickaxe blows" (picconate), were considered by many to be inappropriate for a President, and often beyond his constitutional powers; his mental health was also doubted, and Cossiga had to declare, "I am the fake madman who speaks the truth." Cossiga suffered from bipolar disorder and depression in the last years of his life. Among the President's statements were also allegations of excessive politicization of the judiciary, and the stigmatization of the fact that young magistrates, just entered into service, were immediately assigned to the Sicilian prosecutor's office to carry out mafia proceedings. For his changed attitude, Cossiga was criticized by almost every party, with the exception of the Italian Social Movement (MSI), which stood beside him in defense of the "picconate". He would, among other things, come to be considered one of the first to legitimize the MSI, recognizing it as a constitutional and democratic force. Revelation of Gladio and resignation Tension developed between Cossiga and Prime Minister Giulio Andreotti. This tension emerged when Andreotti revealed the existence of Gladio, a stay-behind organization with the official aim of countering a possible Soviet invasion through sabotage and guerrilla warfare behind enemy lines. Cossiga acknowledged his involvement in the establishment of the organization.
The Democratic Party of the Left (successor to the Communist Party) started the procedure of impeachment (Presidents of Italy can be impeached only for high treason against the state or for an attempt to overthrow the Constitution). Although he threatened to prevent the impeachment procedure by dissolving Parliament, the impeachment request was ultimately dismissed. Cossiga resigned two months before the end of his term, on 25 April 1992. In his last speech as president he stated: "To young people I want to say to love the fatherland, to honor the nation, to serve the Republic, to believe in freedom and to believe in our country". After the presidency In accordance with the Italian Constitution, after his resignation from the office of President, Cossiga became a senator for life, joining his predecessors in the upper house of Parliament, with whom he also shared the title of President Emeritus of the Italian Republic. On 12 January 1997, Cossiga survived a railway accident unscathed while traveling on a high-speed train from Milan to Rome that derailed near Piacenza. In February 1998, Cossiga created the Democratic Union for the Republic (UDR), a Christian democratic political party, declaring it to be politically centrist. The UDR was a crucial component of the majority
Roll control during slow flight is achieved by diverting unheated engine bypass air through wing-mounted thrust nozzles called roll posts. An alternative engine, the General Electric/Rolls-Royce F136, was being developed in the 2000s; originally, F-35 engines from Lot 6 onward were to be competitively tendered. Using technology from the General Electric YF120, the F136 was claimed to have a greater temperature margin than the F135. The F136 was canceled in December 2011 due to lack of funding. The F-35 is expected to receive propulsion upgrades over its lifecycle in order to adapt to emerging threats and enable additional capabilities. In 2016, the Adaptive Engine Transition Program (AETP) was launched to develop and test adaptive cycle engines, with one major potential application being the re-engining of the F-35; in 2018, both GE and P&W were awarded contracts to develop thrust class demonstrators, with the designations XA100 and XA101 respectively. In addition to potential re-engining, P&W also plans to improve the baseline F135; in 2017, P&W announced the F135 Growth Option 1.0 and 2.0: Growth Option 1.0 was a drop-in power module upgrade offering a 6–10% thrust improvement and a 5–6% fuel burn reduction, while Growth Option 2.0 would be the adaptive cycle XA101. In 2020, P&W shifted its F135 upgrade plan from the Growth Options to a series of Engine Enhancement Packages along with some additional capabilities, while the XA101 became a separate clean-sheet design. The capability packages are planned to be incorporated in two-year increments starting in the mid-2020s. Maintenance and logistics The F-35 is designed to require less maintenance than earlier stealth aircraft. Some 95% of all field-replaceable parts are "one deep", that is, nothing else need be removed to reach the desired part; for instance, the ejection seat can be replaced without removing the canopy.
The F-35 has a fibermat radar-absorbent material (RAM) baked into the skin, which is more durable, easier to work with, and faster to cure than older RAM coatings; similar coatings are being considered for application on older stealth aircraft such as the F-22. Skin corrosion on the F-22 led the F-35's designers to use a less galvanically corrosive skin gap filler, fewer gaps in the airframe skin needing filler, and better drainage. The flight control system uses electro-hydrostatic actuators rather than traditional hydraulic systems; these controls can be powered by lithium-ion batteries in case of emergency. Commonality between the different variants allowed the USMC to create its first aircraft maintenance Field Training Detachment, applying the USAF's lessons to its F-35 operations. The F-35 was intended to be supported by a computerized maintenance management system named the Autonomic Logistics Information System (ALIS). In concept, any aircraft could be serviced at any F-35 maintenance facility, with all parts globally tracked and shared as needed. Due to numerous problems, such as unreliable diagnoses, excessive connectivity requirements, and security vulnerabilities, program officials plan to replace ALIS with the cloud-based Operational Data Integrated Network (ODIN) by 2022. ODIN base kits (OBKs), the new computer hardware that replaces ALIS's Standard Operating Unit unclassified (SOU-U) server hardware, began running ALIS as well as ODIN software in September 2020, first at Marine Corps Air Station (MCAS) Yuma, Arizona, then at Naval Air Station Lemoore, California, in support of Strike Fighter Squadron (VFA) 125 on 16 July 2021, and then at Nellis Air Force Base, Nevada, in support of the 422nd Test and Evaluation Squadron (TES) on 6 August 2021.
In 2022, over a dozen more OBK server installation sites are to replace the ALIS SOU-U servers, which will be able to run the legacy ALIS software as well as its replacement, ODIN. So far, performance on the OBK hardware has doubled compared to ALIS. Operational history Testing The first F-35A, AA-1, conducted its engine run in September 2006 and first flew on 15 December 2006. Unlike all subsequent aircraft, AA-1 did not have the weight optimization from SWAT; consequently, it mainly tested subsystems common to subsequent aircraft, such as the propulsion, electrical system, and cockpit displays. This aircraft was retired from flight testing in December 2009 and was used for live-fire testing at NAS China Lake. The first F-35B, BF-1, flew on 11 June 2008, while the first weight-optimized F-35A and F-35C, AF-1 and CF-1, flew on 14 November 2009 and 6 June 2010 respectively. The F-35B's first hover was on 17 March 2010, followed by its first vertical landing the next day. The F-35 Integrated Test Force (ITF) consisted of 18 aircraft at Edwards Air Force Base and Naval Air Station Patuxent River. Nine aircraft at Edwards, five F-35As, three F-35Bs, and one F-35C, performed flight sciences testing such as F-35A envelope expansion, flight loads, and stores separation, as well as mission systems testing. The other nine aircraft at Patuxent River, five F-35Bs and four F-35Cs, were responsible for F-35B and C envelope expansion and STOVL and CV suitability testing. Additional carrier suitability testing was conducted at the Naval Air Warfare Center Aircraft Division at Lakehurst, New Jersey. Two non-flying aircraft of each variant were used to test static loads and fatigue. For testing avionics and mission systems, a modified Boeing 737-300 with a duplication of the cockpit, the Lockheed Martin CATBird, has been used. Field testing of the F-35's sensors was conducted during Exercise Northern Edge 2009 and 2011, serving as significant risk-reduction steps.
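The split of the Integrated Test Force fleet described above can be tallied in a short sketch; the counts are the ones quoted in the text, and the dictionary layout itself is only illustrative:

```python
# Tally of the F-35 ITF test fleet as described in the text:
# nine aircraft at Edwards AFB (5 F-35A, 3 F-35B, 1 F-35C) and
# nine at NAS Patuxent River (5 F-35B, 4 F-35C).
itf = {
    "Edwards AFB": {"F-35A": 5, "F-35B": 3, "F-35C": 1},
    "NAS Patuxent River": {"F-35A": 0, "F-35B": 5, "F-35C": 4},
}

# Aircraft per test site.
per_site = {site: sum(counts.values()) for site, counts in itf.items()}

# Aircraft per variant across both sites.
per_variant = {}
for counts in itf.values():
    for variant, n in counts.items():
        per_variant[variant] = per_variant.get(variant, 0) + n

print(per_site)     # {'Edwards AFB': 9, 'NAS Patuxent River': 9}
print(per_variant)  # {'F-35A': 5, 'F-35B': 8, 'F-35C': 5}
```

This confirms the text's arithmetic: nine aircraft per site, eighteen in total, with the F-35B the most heavily represented variant.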
Flight tests revealed several serious deficiencies that required costly redesigns, caused delays, and resulted in several fleet-wide groundings. In 2011, the F-35C failed to catch the arresting wire in all eight landing tests; a redesigned tail hook was delivered two years later. By June 2009, many of the initial flight test targets had been accomplished but the program was behind schedule. Software and mission systems were among the biggest sources of delays for the program, with sensor fusion proving especially challenging. In fatigue testing, the F-35B suffered several premature cracks, requiring a redesign of the structure. A third non-flying F-35B is currently planned to test the redesigned structure. The F-35B and C also had problems with the horizontal tails suffering heat damage from prolonged afterburner use. Early flight control laws had problems with "wing drop" and also made the airplane sluggish, with high angles-of-attack tests in 2015 against an F-16 showing a lack of energy. At-sea testing of the F-35B was first conducted aboard . In October 2011, two F-35Bs conducted three weeks of initial sea trials, called Development Test I. The second F-35B sea trials, Development Test II, began in August 2013, with tests including nighttime operations; two aircraft completed 19 nighttime vertical landings using DAS imagery. The first operational testing involving six F-35Bs was done on the Wasp in May 2015. The final Development Test III on involving operations in high sea states was completed in late 2016. A Royal Navy F-35 conducted the first "rolling" landing on board in October 2018. After the redesigned tail hook arrived, the F-35C's carrier-based Development Test I began in November 2014 aboard and focused on basic day carrier operations and establishing launch and recovery handling procedures. Development Test II, which focused on night operations, weapons loading, and full power launches, took place in October 2015. 
The final Development Test III was completed in August 2016, and included tests of asymmetric loads and certifying systems for landing qualifications and interoperability. Operational testing of the F-35C began in 2018. The F-35's reliability and availability have fallen short of requirements, especially in the early years of testing. The ALIS maintenance and logistics system was plagued by excessive connectivity requirements and faulty diagnoses. In late 2017, the GAO reported that the time needed to repair an F-35 part averaged 172 days, which was "twice the program's objective", and that a shortage of spare parts was degrading readiness. In 2019, while individual F-35 units achieved mission-capable rates of over the target of 80% for short periods during deployed operations, fleet-wide rates remained below target. The fleet availability goal of 65% was also not met, although the trend shows improvement. Gun accuracy of the F-35A remains unacceptable. As of 2020, the number of the program's most serious issues had been cut by half. Operational test and evaluation (OT&E) with Block 3F, the final configuration for SDD, began in December 2018. United States The F-35A and F-35B were cleared for basic flight training in early 2012. However, lack of system maturity at the time led to concerns over safety, as well as concerns by the Director of Operational Test & Evaluation (DOT&E) over electronic warfare testing, budget, and concurrency for the Operational Test and Evaluation master plan. Nevertheless, on 10 September 2012, the USAF began an operational utility evaluation (OUE) of the F-35A, including logistical support, maintenance, personnel training, and pilot execution. OUE flights began on 26 October and were completed on 14 November after 24 flights, each pilot having completed six flights. On 16 November 2012, the USMC received the first F-35B at MCAS Yuma, although Marine pilots had several flight restrictions.
During the Low Rate Initial Production (LRIP) phase, the three U.S. military services jointly developed tactics and procedures using flight simulators, testing effectiveness, discovering problems, and refining designs. In January 2013, training began at Eglin AFB with capacity for 100 pilots and 2,100 maintainers at once. On 8 January 2015, RAF Lakenheath in the UK was chosen as the first base in Europe to station two USAF F-35 squadrons, with 48 aircraft adding to the 48th Fighter Wing's existing F-15C and F-15E squadrons. The USMC declared Initial Operational Capability (IOC) for the F-35B in the Block 2B configuration on 31 July 2015 after operational trials. However, limitations remained in night operations, communications, software, and weapons carriage capabilities. USMC F-35Bs participated in their first Red Flag exercise in July 2016, with 67 sorties conducted. The USAF F-35A in the Block 3i configuration achieved IOC with the USAF's 34th Fighter Squadron at Hill Air Force Base, Utah, on 2 August 2016. The USN achieved operational status with the F-35C in Block 3F on 28 February 2019. USAF F-35As conducted their first Red Flag exercise in 2017; system maturity had improved, and the aircraft scored a kill ratio of 15:1 against an F-16 aggressor squadron in a high-threat environment. The F-35's operating cost is higher than that of some older fighters. In fiscal year 2018, the F-35A's cost per flight hour (CPFH) was $44,000, a number that was reduced to $35,000 in 2019. For comparison, in 2015 the CPFH of the A-10 was $17,716; the F-15C, $41,921; and the F-16C, $22,514. Lockheed Martin hopes to reduce it to $25,000 by 2025 through performance-based logistics and other measures. The USMC plans to disperse its F-35Bs among forward-deployed bases to enhance survivability while remaining close to a battlespace, similar to RAF Harrier deployment in the Cold War, which relied on the use of off-base locations that offered short runways, shelter, and concealment.
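As a rough illustration of the cost figures quoted above, the percentage reductions implied by the CPFH numbers can be computed directly; the dollar values are those stated in the text, while the helper function is just a sketch:

```python
# Cost-per-flight-hour (CPFH) figures quoted in the text, in US dollars.
cpfh = {
    "F-35A FY2018": 44_000,
    "F-35A 2019": 35_000,
    "F-35A 2025 goal": 25_000,
    "F-16C (2015)": 22_514,
}

def pct_change(old: float, new: float) -> float:
    """Percentage change from old to new (negative means a reduction)."""
    return (new - old) / old * 100

# FY2018 -> 2019 reduction, and the further cut needed to reach the 2025 goal.
print(round(pct_change(cpfh["F-35A FY2018"], cpfh["F-35A 2019"]), 1))    # -20.5
print(round(pct_change(cpfh["F-35A 2019"], cpfh["F-35A 2025 goal"]), 1)) # -28.6

# Even at the 2019 figure, the F-35A cost roughly 1.55x the F-16C's 2015 CPFH.
print(round(cpfh["F-35A 2019"] / cpfh["F-16C (2015)"], 2))               # 1.55
```

In other words, the 2019 figure already represents a cut of about a fifth from FY2018, and the 2025 goal would require a further cut of nearly 29%.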
Known as distributed STOVL operations (DSO), F-35Bs would operate from temporary bases in allied territory within range of hostile missiles and move between temporary locations inside the enemy's 24- to 48-hour targeting cycle; this strategy accounts for the F-35B's short range, the shortest of the three variants, with mobile forward arming and refueling points (M-Farps) accommodating KC-130 and MV-22 Osprey aircraft to rearm and refuel the jets, as well as littoral areas for sea links of mobile distribution sites. M-Farps can be based on small airfields, multi-lane roads, or damaged main bases, while F-35Bs return to rear-area friendly bases or ships for scheduled maintenance. Helicopter-portable metal planking is needed to protect unprepared roads from the F-35B's exhaust; the USMC is studying lighter heat-resistant options. The first U.S. combat employment began in July 2018 with USMC F-35Bs from the amphibious assault ship , with the first combat strike on 27 September 2018 against a Taliban target in Afghanistan. This was followed by a USAF deployment to Al Dhafra Air Base, UAE, on 15 April 2019. On 27 April 2019, USAF F-35As were first used in combat in an airstrike on an Islamic State tunnel network in northern Iraq. On 2 August 2021, the F-35C embarked on its maiden deployment on board the USS Carl Vinson, alongside the CMV-22 Osprey, which was also making its debut deployment. United Kingdom The United Kingdom's Royal Air Force and Royal Navy both operate the F-35B, known simply as the Lightning in British service; it has replaced the Harrier GR9, which was retired in 2010, and the Tornado GR4, which was retired in 2019. The F-35 is to be Britain's primary strike aircraft for the next three decades. One of the Royal Navy's requirements for the F-35B was a Shipborne Rolling and Vertical Landing (SRVL) mode to increase maximum landing weight by using wing lift during landing.
In July 2013, Chief of the Air Staff, Air Chief Marshal Sir Stephen Dalton, announced that No. 617 (The Dambusters) Squadron would be the RAF's first operational F-35 squadron. The second operational squadron will be the Fleet Air Arm's 809 Naval Air Squadron, which will stand up in April 2023 or later. No. 17 (Reserve) Test and Evaluation Squadron (TES) stood up on 12 April 2013 as the Operational Evaluation Unit for the Lightning, becoming the first British squadron to operate the type. By June 2013, the RAF had received three F-35s of the 48 on order, all initially based at Eglin Air Force Base. In June 2015, the F-35B undertook its first launches from a ski-jump at NAS Patuxent River. When operating at sea, British F-35Bs use ski-jumps fitted to the flight decks of the aircraft carriers HMS Queen Elizabeth (R08) and HMS Prince of Wales (R09). The Italian Navy will use the same process. British F-35Bs are not intended to receive the Brimstone 2 missile. On 5 July 2017, it was announced that the second UK-based RAF squadron would be No. 207 Squadron, which reformed on 1 August 2019 as the Lightning Operational Conversion Unit. No. 617 Squadron reformed on 18 April 2018 during a ceremony in Washington, D.C., becoming the first RAF front-line squadron to operate the type; it received its first four F-35Bs on 6 June, flying them from MCAS Beaufort to RAF Marham. Both No. 617 Squadron and its F-35s were declared combat ready on 10 January 2019. In April 2019, No. 617 Squadron deployed to RAF Akrotiri, Cyprus, the type's first overseas deployment. On 25 June 2019, the first combat use of an RAF F-35B was reportedly undertaken, as armed reconnaissance flights searching for Islamic State targets in Iraq and Syria. In October 2019, the Dambusters and No. 17 TES F-35s embarked on HMS Queen Elizabeth for the first time. No. 617 Squadron departed RAF Marham on 22 January 2020 for their first Exercise Red Flag with the Lightning.
Australia Australia's first F-35, designated A35-001, was manufactured in 2014, with flight training provided through the international Pilot Training Centre (PTC) at Luke Air Force Base in Arizona. The first two F-35s were unveiled to the Australian public on 3 March 2017 at the Avalon Airshow. By 2021, the Royal Australian Air Force had accepted 26 F-35A aircraft, with nine in the US and 17 operating with No. 3 Squadron and No. 2 Operational Conversion Unit at RAAF Base Williamtown. With 41 trained RAAF pilots and 225 technicians trained for maintenance, the fleet was declared ready to deploy on operations. Australia is expected to receive all 72 of its F-35s by 2023. Israel The Israeli Air Force (IAF) declared the F-35 operationally capable on 6 December 2017. According to the Kuwaiti newspaper Al Jarida, in July 2018, a test mission of at least three IAF F-35s flew to Iran's capital Tehran and back from Tel Aviv. While publicly unconfirmed, regional leaders acted on the report; Iran's supreme leader Ali Khamenei reportedly fired the air force chief and the commander of Iran's Revolutionary Guard Corps over the mission. On 22 May 2018, IAF chief Amikam Norkin said that the service had employed its F-35Is in two attacks on two battle fronts, marking the first combat operation of an F-35 by any country. Norkin said it had been flown "all over the Middle East" and showed photos of an F-35I flying over Beirut in daylight. In July 2019, Israel expanded its strikes against Iranian missile shipments; IAF F-35Is allegedly struck Iranian targets in Iraq twice. In November 2020, the IAF announced the delivery of an F-35I testbed aircraft among a delivery of four aircraft received in August. This example will be used to test and integrate Israeli-produced weapons and electronic systems on future F-35s. It is the only testbed F-35 delivered to an air force outside the United States.
On 11 May 2021, eight IAF F-35Is took part in an attack on 150 terrorist targets in Hamas' rocket array, including 50-70 launch pits in the northern Gaza Strip, as part of Operation Guardian of the Walls. Italy Italy's F-35As were declared to have reached initial operational capability (IOC) on 30 November 2018. At the time, Italy had taken delivery of 10 F-35As and one F-35B; two F-35As and the F-35B were stationed in the U.S. for training, while the remaining eight F-35As were stationed at Amendola. Norway On 6 November 2019, Norway declared initial operational capability (IOC) for its fleet of 15 F-35As out of a planned 52. On 6 January 2022, Norway's F-35As replaced its F-16s for the NATO quick reaction alert mission in the high north. Netherlands On 27 December 2021, the Netherlands declared initial operational capability (IOC) for the fleet of 24 F-35As it had received to date from its order of 46. Variants The F-35 was designed with three initial variants: the F-35A, a CTOL land-based version; the F-35B, a STOVL version capable of use either on land or on aircraft carriers; and the F-35C, a CATOBAR carrier-based version. Since then, there has been work on the design of nationally specific versions for Israel and Canada, as well as initial concept design work for an updated version of the F-35A, which would become the F-35D. F-35A The F-35A is the conventional takeoff and landing (CTOL) variant intended for the USAF and other air forces. It is the smallest, lightest version and is capable of 9 g, the highest of all variants. Although the F-35A currently conducts aerial refueling via the boom and receptacle method, the aircraft can be modified for probe-and-drogue refueling if needed by the customer. A drag chute pod can be installed on the F-35A, with the Royal Norwegian Air Force being the first operator to adopt it. F-35B The F-35B is the short takeoff and vertical landing (STOVL) variant of the aircraft.
Similar in size to the A variant, the B sacrifices about a third of the A variant's fuel volume to accommodate the SDLF. This variant is limited to 7 g. Unlike the other variants, the F-35B has no landing hook; the "STOVL/HOOK" control instead engages conversion between normal and vertical flight. The F-35B can also perform vertical and/or short take-off and landing (V/STOL). F-35C The F-35C variant is designed for catapult-assisted take-off and arrested recovery operations from aircraft carriers. Compared to the F-35A, the F-35C features larger wings with foldable wingtip sections, larger control surfaces for improved low-speed control, stronger landing gear for the stresses of carrier arrested landings, a twin-wheel nose gear, and a stronger tailhook for use with carrier arrestor cables. The larger wing area allows for a decreased landing speed while increasing both range and payload. The F-35C is limited to 7.5 g. F-35I "Adir" The F-35I Adir (meaning "Awesome" or "Mighty One") is an F-35A with unique Israeli modifications. The US initially refused to allow such changes before permitting Israel to integrate its own electronic warfare systems, including sensors and countermeasures. The main computer has a plug-and-play function for add-on systems; proposals include an external jamming pod, and new Israeli air-to-air missiles and guided bombs in the internal weapon bays. A senior IAF official said that the F-35's stealth may be partly overcome within 10 years despite a 30 to 40 year service life, hence Israel's insistence on using its own electronic warfare systems. Israel Aerospace Industries (IAI) has considered a
Paveway series of bombs, Joint Standoff Weapon (JSOW), and cluster munitions (Wind Corrected Munitions Dispenser). The station can also carry multiple smaller munitions such as the GBU-39 Small Diameter Bomb (SDB), GBU-53/B SDB II, and SPEAR 3 anti-tank missiles; up to four SDBs can be carried per station for the F-35A and F-35C, and three for the F-35B. The inboard station can carry the AIM-120 AMRAAM. Two compartments behind the weapons bays contain flares, chaff, and towed decoys. The aircraft can use six external weapons stations for missions that do not require stealth. The wingtip pylons can each carry an AIM-9X or AIM-132 ASRAAM and are canted outwards to reduce their radar cross-section. Additionally, each wing has an inboard station and a middle station, or for the F-35B. The external wing stations can carry large air-to-surface weapons that would not fit inside the weapons bays, such as the AGM-158 Joint Air to Surface Stand-off Missile (JASSM) cruise missile. An air-to-air missile load of eight AIM-120s and two AIM-9s is possible using internal and external weapons stations; a configuration of six bombs, two AIM-120s, and two AIM-9s can also be arranged. The F-35A is armed with a 25 mm GAU-22/A rotary cannon mounted internally near the left wing root with 182 rounds carried; the gun is more effective against ground targets than the 20 mm cannon carried by other USAF fighters. The F-35B and F-35C have no internal gun and instead can use a Terma A/S multi-mission pod (MMP) carrying the GAU-22/A and 220 rounds; the pod is mounted on the centerline of the aircraft and shaped to reduce its radar cross-section. In lieu of the gun, the pod can also be used for different equipment and purposes, such as electronic warfare, aerial reconnaissance, or rear-facing tactical radar.
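The per-station SDB limits quoted above lend themselves to a quick capacity estimate. A minimal sketch follows; the per-station figures (four SDBs for the F-35A/C, three for the F-35B) come from the text, but the assumption of four internal stations, two per weapons bay, is mine and is not stated in this passage:

```python
# Hedged sketch of maximum internal GBU-39 SDB loadout.
# Per-station limits are from the text; the internal station count is an
# ASSUMPTION (2 weapons bays x 2 stations each), not a sourced figure.
INTERNAL_STATIONS = 4  # assumption: two stations per bay, two bays

sdb_per_station = {"F-35A": 4, "F-35B": 3, "F-35C": 4}  # quoted limits

internal_sdb_max = {
    variant: per_station * INTERNAL_STATIONS
    for variant, per_station in sdb_per_station.items()
}
print(internal_sdb_max)  # {'F-35A': 16, 'F-35B': 12, 'F-35C': 16}
```

Under that assumption, the F-35B gives up four internal SDBs relative to the other variants, which is consistent with its smaller bays accommodating the lift fan.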
Lockheed Martin is developing a weapon rack called Sidekick that would enable the internal outboard station to carry two AIM-120s, thus increasing the internal air-to-air payload to six missiles; it is currently offered for Block 4. Block 4 will also have a rearranged hydraulic line and bracket to allow the F-35B to carry four SDBs per internal outboard station; integration of the MBDA Meteor is also planned. The USAF and USN are planning to integrate the AGM-88G AARGM-ER internally in the F-35A and F-35C. Norway and Australia are funding an adaptation of the Naval Strike Missile (NSM) for the F-35; designated Joint Strike Missile (JSM), two missiles can be carried internally with an additional four externally. Nuclear weapons delivery via internal carriage of the B61 nuclear bomb is planned for Block 4B in 2024. Both hypersonic missiles and directed-energy weapons such as solid-state lasers are being considered as future upgrades. Lockheed Martin is studying the integration of a fiber laser that uses spectral beam combining to merge multiple individual laser modules into a single high-power beam, which can be scaled to various levels. The USAF plans for the F-35A to take up the close air support (CAS) mission in contested environments; amid criticism that it is not as well suited as a dedicated attack platform, USAF chief of staff Mark Welsh placed a focus on weapons for CAS sorties, including guided rockets, fragmentation rockets that shatter into individual projectiles before impact, and more compact ammunition for higher-capacity gun pods. Fragmentation rocket warheads create greater effects than cannon shells, as each rocket creates a "thousand-round burst", delivering more projectiles than a strafing run. Engine The single-engine aircraft is powered by the Pratt & Whitney F135 low-bypass augmented turbofan.
Derived from the Pratt & Whitney F119 used by the F-22, the F135 has a larger fan and higher bypass ratio to increase subsonic fuel efficiency, and unlike the F119, is not optimized for supercruise. The engine contributes to the F-35's stealth by having a low-observable augmenter, or afterburner, that incorporates fuel injectors into thick curved vanes; these vanes are covered by ceramic radar-absorbent materials and mask the turbine. The stealthy augmenter had problems with pressure pulsations, or "screech", at low altitude and high speed early in its development. The low-observable axisymmetric nozzle consists of 15 partially overlapping flaps that create a sawtooth pattern at the trailing edge, which reduces radar signature and creates shed vortices that reduce the infrared signature of the exhaust plume. Due to the engine's large dimensions, the USN had to modify its underway replenishment system to facilitate at-sea logistics support. The F-35's Integrated Power Package (IPP) performs power and thermal management and integrates environment control, auxiliary power unit, engine starting, and other functions into a single system. The F135-PW-600 variant for the F-35B incorporates the SDLF to allow STOVL operations. Designed by Lockheed Martin and developed by Rolls-Royce, the SDLF, also known as the Rolls-Royce LiftSystem, consists of the lift fan, drive shaft, two roll posts, and a "three-bearing swivel module" (3BSM). The thrust-vectoring 3BSM nozzle allows the main engine exhaust to be deflected downward at the tail of the aircraft and is moved by a "fueldraulic" actuator that uses pressurized fuel as the working fluid. Unlike the Harrier's Pegasus engine, which relies entirely on direct engine thrust for lift, the F-35B's system augments the swivel nozzle's thrust with the lift fan; the fan is powered by the low-pressure turbine through a drive shaft when engaged with a clutch and is placed near the front of the aircraft to provide a counterbalancing thrust.
Roll control during slow flight is achieved by diverting unheated engine bypass air through wing-mounted thrust nozzles called roll posts. An alternative engine, the General Electric/Rolls-Royce F136, was being developed in the 2000s; originally, F-35 engines from Lot 6 onward were to be competitively tendered. Using technology from the General Electric YF120, the F136 was claimed to have a greater temperature margin than the F135. The F136 was canceled in December 2011 due to lack of funding. The F-35 is expected to receive propulsion upgrades over its lifecycle in order to adapt to emerging threats and enable additional capabilities. In 2016, the Adaptive Engine Transition Program (AETP) was launched to develop and test adaptive-cycle engines, with one major potential application being the re-engining of the F-35; in 2018, both GE and P&W were awarded contracts to develop demonstrator engines, designated XA100 and XA101 respectively. In addition to potential re-engining, P&W also plans to improve the baseline F135; in 2017, P&W announced the F135 Growth Option 1.0 and 2.0: Growth Option 1.0 was a drop-in power-module upgrade that offered 6–10% thrust improvement and 5–6% fuel-burn reduction, while Growth Option 2.0 would be the adaptive-cycle XA101. In 2020, P&W shifted its F135 upgrade plan from the Growth Options to a series of Engine Enhancement Packages along with some additional capabilities, while the XA101 became a separate clean-sheet design. The capability packages are planned to be incorporated in two-year increments starting in the mid-2020s. Maintenance and logistics The F-35 is designed to require less maintenance than earlier stealth aircraft. Some 95% of all field-replaceable parts are "one deep"—that is, nothing else need be removed to reach the desired part; for instance, the ejection seat can be replaced without removing the canopy.
The F-35 has a fibermat radar-absorbent material (RAM) baked into the skin, which is more durable, easier to work with, and faster to cure than older RAM coatings; similar coatings are being considered for application on older stealth aircraft such as the F-22. Skin corrosion on the F-22 led the F-35's designers to use a skin-gap filler that induces less galvanic corrosion, fewer airframe skin gaps needing filler, and better drainage. The flight control system uses electro-hydrostatic actuators rather than traditional hydraulic systems; these controls can be powered by lithium-ion batteries in case of emergency. Commonality between the different variants allowed the USMC to create their first aircraft maintenance Field Training Detachment to apply the USAF's lessons to their F-35 operations. The F-35 was intended to be supported by a computerized maintenance management system named the Autonomic Logistics Information System (ALIS); in concept, any aircraft could be serviced at any F-35 maintenance facility, and all parts could be globally tracked and shared as needed. Due to numerous problems, such as unreliable diagnoses, excessive connectivity requirements, and security vulnerabilities, program officials planned to replace ALIS with the cloud-based Operational Data Integrated Network (ODIN) by 2022. ODIN base kits (OBKs), the new computer hardware replacing ALIS's Standard Operating Unit unclassified (SOU-U) server hardware, began running ALIS as well as ODIN software in September 2020, first at Marine Corps Air Station (MCAS) Yuma, Arizona, then at Naval Air Station Lemoore, California, in support of Strike Fighter Squadron (VFA) 125 on 16 July 2021, and then at Nellis Air Force Base, Nevada, in support of the 422nd Test and Evaluation Squadron (TES) on 6 August 2021.
In 2022, OBK server installations at over a dozen more sites will replace the ALIS SOU-U servers; the new hardware will be able to run the legacy ALIS software as well as its replacement ODIN software. So far, performance on the OBK has been double that of ALIS. Operational history Testing The first F-35A, AA-1, conducted its first engine run in September 2006 and first flew on 15 December 2006. Unlike all subsequent aircraft, AA-1 did not have the weight optimization from SWAT; consequently, it mainly tested subsystems common to subsequent aircraft, such as the propulsion, electrical system, and cockpit displays. This aircraft was retired from flight testing in December 2009 and was used for live-fire testing at NAS China Lake. The first F-35B, BF-1, flew on 11 June 2008, while the first weight-optimized F-35A and F-35C, AF-1 and CF-1, flew on 14 November 2009 and 6 June 2010 respectively. The F-35B's first hover was on 17 March 2010, followed by its first vertical landing the next day. The F-35 Integrated Test Force (ITF) consisted of 18 aircraft at Edwards Air Force Base and Naval Air Station Patuxent River. Nine aircraft at Edwards (five F-35As, three F-35Bs, and one F-35C) performed flight sciences testing such as F-35A envelope expansion, flight loads, and stores separation, as well as mission systems testing. The other nine aircraft at Patuxent River (five F-35Bs and four F-35Cs) were responsible for F-35B and C envelope expansion and STOVL and CV suitability testing. Additional carrier suitability testing was conducted at the Naval Air Warfare Center Aircraft Division at Lakehurst, New Jersey. Two non-flying aircraft of each variant were used to test static loads and fatigue. For testing avionics and mission systems, a modified Boeing 737-300 with a duplication of the cockpit, the Lockheed Martin CATBird, has been used. Field testing of the F-35's sensors was conducted during Exercise Northern Edge 2009 and 2011, serving as significant risk-reduction steps.
Flight tests revealed several serious deficiencies that required costly redesigns, caused delays, and resulted in several fleet-wide groundings. In 2011, the F-35C failed to catch the arresting wire in all eight landing tests; a redesigned tail hook was delivered two years later. By June 2009, many of the initial flight test targets had been accomplished, but the program was behind schedule. Software and mission systems were among the biggest sources of delays for the program, with sensor fusion proving especially challenging. In fatigue testing, the F-35B suffered several premature cracks, requiring a redesign of the structure. A third non-flying F-35B is currently planned to test the redesigned structure. The F-35B and C also had problems with the horizontal tails suffering heat damage from prolonged afterburner use. Early flight control laws had problems with "wing drop" and also made the airplane sluggish, with high angle-of-attack tests in 2015 against an F-16 showing a lack of energy. At-sea testing of the F-35B was first conducted aboard the Wasp: in October 2011, two F-35Bs conducted three weeks of initial sea trials, called Development Test I. The second F-35B sea trials, Development Test II, began in August 2013, with tests including nighttime operations; two aircraft completed 19 nighttime vertical landings using DAS imagery. The first operational testing, involving six F-35Bs, was done on the Wasp in May 2015. The final Development Test III, involving operations in high sea states, was completed in late 2016. A Royal Navy F-35 conducted the first "rolling" landing on board HMS Queen Elizabeth in October 2018. After the redesigned tail hook arrived, the F-35C's carrier-based Development Test I began in November 2014 and focused on basic day carrier operations and establishing launch and recovery handling procedures. Development Test II, which focused on night operations, weapons loading, and full-power launches, took place in October 2015.
The final Development Test III was completed in August 2016 and included tests of asymmetric loads and certifying systems for landing qualifications and interoperability. Operational testing of the F-35C began in 2018. The F-35's reliability and availability have fallen short of requirements, especially in the early years of testing. The ALIS maintenance and logistics system was plagued by excessive connectivity requirements and faulty diagnoses. In late 2017, the GAO reported that the time needed to repair an F-35 part averaged 172 days, which was "twice the program's objective", and that a shortage of spare parts was degrading readiness. In 2019, while individual F-35 units have achieved mission-capable rates above the 80% target for short periods during deployed operations, fleet-wide rates remained below target. The fleet availability goal of 65% was also not met, although the trend shows improvement. Gun accuracy of the F-35A remains unacceptable. As of 2020, the number of the program's most serious issues has been cut by half. Operational test and evaluation (OT&E) with Block 3F, the final configuration for SDD, began in December 2018. United States The F-35A and F-35B were cleared for basic flight training in early 2012. However, lack of system maturity at the time led to concerns over safety, as well as concerns by the Director of Operational Test & Evaluation (DOT&E) over electronic warfare testing, budget, and concurrency for the Operational Test and Evaluation master plan. Nevertheless, on 10 September 2012, the USAF began an operational utility evaluation (OUE) of the F-35A, including logistical support, maintenance, personnel training, and pilot execution. OUE flights began on 26 October and were completed on 14 November after 24 flights, each pilot having completed six flights. On 16 November 2012, the USMC received the first F-35B at MCAS Yuma, although Marine pilots had several flight restrictions.
During the Low Rate Initial Production (LRIP) phase, the three U.S. military services jointly developed tactics and procedures using flight simulators, testing effectiveness, discovering problems and refining design. In January 2013, training began at Eglin AFB with capacity for 100 pilots and 2,100 maintainers at once. On 8 January 2015, RAF Lakenheath in the UK was chosen as the first base in Europe to station two USAF F-35 squadrons, with 48 aircraft adding to the 48th Fighter Wing's existing F-15C and F-15E squadrons. The USMC declared Initial Operational Capability (IOC) for the F-35B in the Block 2B configuration on 31 July 2015 after operational trials. However, limitations remained in night operations, communications, software and weapons carriage capabilities. USMC F-35Bs participated in their first Red Flag exercise in July 2016 with 67 sorties conducted. USAF F-35A in the Block 3i configuration achieved IOC with the USAF's 34th Fighter Squadron at Hill Air Force Base, Utah on 2 August 2016. The USN achieved operational status with the F-35C in Block 3F on 28 February 2019. USAF F-35As conducted their first Red Flag exercise in 2017; system maturity had improved and the aircraft scored a kill ratio of 15:1 against an F-16 aggressor squadron in a high-threat environment. The F-35's operating cost is higher than some older fighters. In fiscal year 2018, the F-35A's cost per flight hour (CPFH) was $44,000, a number that was reduced to $35,000 in 2019. For comparison, in 2015 the CPFH of the A-10 was $17,716; the F-15C, $41,921; and the F-16C, $22,514. Lockheed Martin hopes to reduce it to $25,000 by 2025 through performance-based logistics and other measures. The USMC plans to disperse its F-35Bs among forward-deployed bases to enhance survivability while remaining close to a battlespace, similar to RAF Harrier deployment in the Cold War, which relied on the use of off-base locations that offered short runways, shelter, and concealment. 
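As a quick worked example, the cost-per-flight-hour (CPFH) figures above can be turned into rough annual operating costs per aircraft. The 250 flying hours per aircraft per year is an assumed illustrative value, not a figure from the text:

```python
# Worked comparison of the cost-per-flight-hour (CPFH) figures cited above.
# ANNUAL_HOURS is an assumed illustrative value.
CPFH = {
    "F-35A (FY2018)": 44_000,
    "F-35A (2019)": 35_000,
    "F-35A (2025 target)": 25_000,
    "A-10 (2015)": 17_716,
    "F-15C (2015)": 41_921,
    "F-16C (2015)": 22_514,
}

ANNUAL_HOURS = 250  # assumed flying hours per aircraft per year

# Print a rough annual cost per aircraft, cheapest first.
for jet, rate in sorted(CPFH.items(), key=lambda kv: kv[1]):
    print(f"{jet:<20} ${rate * ANNUAL_HOURS:>12,.0f} per year")
```

At the assumed utilization, the drop from $44,000 to $35,000 CPFH saves about $2.25 million per aircraft per year, which is why per-hour cost reduction is such a prominent program goal.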
Known as distributed STOVL operations (DSO), F-35Bs would operate from temporary bases in allied territory within range of hostile missiles and move between temporary locations inside the enemy's 24- to 48-hour targeting cycle; this strategy accounts for the F-35B's short range, the shortest of the three variants, with mobile forward arming and refueling points (M-Farps) accommodating KC-130 and MV-22 Osprey aircraft to rearm and refuel the jets, as well as littoral areas for sea links of mobile distribution sites. M-Farps can be based on small airfields, multi-lane roads, or damaged main bases, while F-35Bs return to rear-area friendly bases or ships for scheduled maintenance. Helicopter-portable metal planking is needed to protect unprepared roads from the F-35B's exhaust; the USMC is studying lighter heat-resistant options. The first U.S. combat employment began in July 2018 with USMC F-35Bs operating from an amphibious assault ship, with the first combat strike on 27 September 2018 against a Taliban target in Afghanistan. This was followed by a USAF deployment to Al Dhafra Air Base, UAE, on 15 April 2019. On 27 April 2019, USAF F-35As were first used in combat in an airstrike on an Islamic State tunnel network in northern Iraq. On 2 August 2021, the F-35C embarked on its maiden deployment on board USS Carl Vinson, alongside the CMV-22 Osprey, which was also making its debut deployment. United Kingdom The United Kingdom's Royal Air Force and Royal Navy both operate the F-35B, known simply as the Lightning in British service; it has replaced the Harrier GR9, retired in 2010, and the Tornado GR4, retired in 2019. The F-35 is to be Britain's primary strike aircraft for the next three decades. One of the Royal Navy's requirements for the F-35B was a Shipborne Rolling and Vertical Landing (SRVL) mode to increase maximum landing weight by using wing lift during landing.
In July 2013, Chief of the Air Staff, Air Chief Marshal Sir Stephen Dalton, announced that No. 617 (The Dambusters) Squadron would be the RAF's first operational F-35 squadron. The second operational squadron will be the Fleet Air Arm's 809 Naval Air Squadron, which will stand up in April 2023 or later. No. 17 (Reserve) Test and Evaluation Squadron (TES) stood up on 12 April 2013 as the Operational Evaluation Unit for the Lightning, becoming the first British squadron to operate the type. By June 2013, the RAF had received three F-35s of the 48 on order, all initially based at Eglin Air Force Base. In June 2015, the F-35B undertook its first launches from a ski-jump at NAS Patuxent River. When operating at sea, British F-35Bs use ski-jumps fitted to the flight decks of the aircraft carriers HMS Queen Elizabeth (R08) and HMS Prince of Wales (R09); the Italian Navy will use the same process. British F-35Bs are not intended to receive the Brimstone 2 missile. On 5 July 2017, it was announced that the second UK-based RAF squadron would be No. 207 Squadron, which reformed on 1 August 2019 as the Lightning Operational Conversion Unit. No. 617 Squadron reformed on 18 April 2018 during a ceremony in Washington, D.C., becoming the first RAF front-line squadron to operate the type; it received its first four F-35Bs on 6 June, flown from MCAS Beaufort to RAF Marham. Both No. 617 Squadron and its F-35s were declared combat ready on 10 January 2019. In April 2019, No. 617 Squadron deployed to RAF Akrotiri, Cyprus, the type's first overseas deployment. On 25 June 2019, the first combat use of an RAF F-35B was reportedly undertaken, as armed reconnaissance flights searching for Islamic State targets in Iraq and Syria. In October 2019, Dambusters and No. 17 TES F-35s embarked on HMS Queen Elizabeth for the first time. No. 617 Squadron departed RAF Marham on 22 January 2020 for its first Exercise Red Flag with the Lightning.
Australia Australia's first F-35, designated A35-001, was manufactured in 2014, with flight training provided through the international Pilot Training Centre (PTC) at Luke Air Force Base in Arizona. The first two F-35s were unveiled to the Australian public on 3 March 2017 at the Avalon Airshow. By 2021, the Royal Australian Air Force had accepted 26 F-35A aircraft, with nine in the US and 17 operating with No. 3 Squadron and No. 2 Operational Conversion Unit at RAAF Base Williamtown. With 41 trained RAAF pilots and 225 trained maintenance technicians, the fleet was declared ready to deploy on operations. Australia is expected to receive all 72 of its F-35s by 2023. Israel The Israeli Air Force (IAF) declared the F-35 operationally capable on 6 December 2017. According to the Kuwaiti newspaper Al Jarida, in July 2018 a test mission of at least three IAF F-35s flew to Iran's capital Tehran and back from Tel Aviv. Though the report was publicly unconfirmed, regional leaders acted on it; Iran's supreme leader Ali Khamenei reportedly fired the air force chief and the commander of Iran's Revolutionary Guard Corps over the mission. On 22 May 2018, IAF chief Amikam Norkin said that the service had employed its F-35Is in two attacks on two battle fronts, marking the first combat operation of an F-35 by any country. Norkin said it had been flown "all over the Middle East", and showed photos of an F-35I flying over Beirut in daylight. In July 2019, Israel expanded its strikes against Iranian missile shipments; IAF F-35Is allegedly struck Iranian targets in Iraq twice. In November 2020, the IAF announced the delivery of an F-35I testbed aircraft among a delivery of four aircraft received in August. This example will be used to test and integrate Israeli-produced weapons and electronic systems on future F-35s; it is the only testbed F-35 delivered to an air force outside the United States.
On 11 May 2021, eight IAF F-35Is took part in an attack on 150 terrorist targets in Hamas' rocket array, including 50–70 launch pits in the northern Gaza Strip, as part of Operation Guardian of the Walls. Italy Italy's F-35As were declared to have reached initial operational capability (IOC) on 30 November 2018. At the time, Italy had taken delivery of 10 F-35As and one F-35B; two F-35As and the F-35B were stationed in the U.S. for training, while the remaining eight F-35As were based at Amendola. Norway On 6 November 2019, Norway declared initial operational capability (IOC) for its fleet of 15 F-35As out of a planned 52. On 6 January 2022, Norway's F-35As replaced its F-16s for the NATO quick reaction alert mission in the high north. Netherlands On 27 December 2021, the Netherlands declared initial operational capability (IOC) for the 24 F-35As received to date from its order of 46. Variants The F-35 was designed with three initial variants: the F-35A, a CTOL land-based version; the F-35B, a STOVL version capable of use either on land or on aircraft carriers; and the F-35C, a CATOBAR carrier-based version. Since then, there has been work on the design of nationally specific versions for Israel and Canada, as well as initial concept design work for an updated version of the F-35A, which would become the F-35D. F-35A The F-35A is the conventional takeoff and landing (CTOL) variant intended for the USAF and other air forces. It is the smallest, lightest version and is capable of 9 g, the highest of all variants. Although the F-35A currently conducts aerial refueling via the boom and receptacle method, the aircraft can be modified for probe-and-drogue refueling if needed by the customer. A drag chute pod can be installed on the F-35A, with the Royal Norwegian Air Force being the first operator to adopt it. F-35B The F-35B is the short takeoff and vertical landing (STOVL) variant of the aircraft.
Similar in size to the A variant, the B sacrifices about a third of the A variant's fuel volume to accommodate the SDLF. This variant is limited to 7 g. Unlike other variants, the F-35B has no landing hook; the "STOVL/HOOK" control instead engages conversion between normal and vertical flight. The F-35B can also perform vertical and/or short take-off and landing (V/STOL). F-35C The F-35C variant is designed for catapult-assisted take-off but arrested recovery operations from aircraft carriers. Compared to the F-35A, the F-35C features larger wings with foldable wingtip sections, larger control surfaces for improved low-speed control, stronger landing gear for the stresses of carrier arrested landings, a twin-wheel nose gear, and a stronger tailhook for use with carrier arrestor cables. The larger wing area allows for decreased landing speed while increasing both range and payload. The F-35C is limited to 7.5 g. F-35I "Adir" The F-35I Adir (Hebrew for "Awesome" or "Mighty One") is an F-35A with unique Israeli modifications. The US initially refused to allow such changes before permitting Israel to integrate its own electronic warfare systems, including sensors and countermeasures. The main computer has a plug-and-play function for add-on systems; proposals include an external jamming pod, and new Israeli air-to-air missiles and guided bombs in the internal weapon bays. A senior IAF official said that the F-35's stealth may be partly overcome within 10 years despite a 30-to-40-year service life, hence Israel's insistence on using its own electronic warfare systems. Israel Aerospace Industries (IAI) has considered a two-seat F-35 concept; an IAI executive noted: "There is a known demand for two seats not only from Israel but from other air forces". IAI plans to produce conformal fuel tanks. Proposed variants F-35D A study for a possible upgrade of the F-35A to be fielded by the 2035 target date of the USAF's Future Operating Concept.
CF-35 The Canadian CF-35 is a proposed variant that would differ from the F-35A through the addition of a drogue parachute and may include an F-35B/C-style refueling probe. In 2012, it was revealed that the CF-35 would employ the same boom refueling system as the F-35A. One alternative proposal would have been the adoption of the F-35C for its probe refueling and lower landing speed; however, the Parliamentary Budget Officer's report cited the F-35C's limited performance and payload as being too high a price to pay. Following the 2015 federal election, the Liberal Party, whose campaign had included a pledge to cancel the F-35 procurement, formed a new government and commenced an open competition to replace the existing CF-18 Hornet. New export variant In December 2021, it was reported that Lockheed Martin was developing a new variant for an unspecified foreign customer; the Department of Defense released US$49 million in funding for this work. Operators Royal Australian Air Force – 44 F-35As delivered as of November 2021, of 72 ordered. Belgian Air Component – 34 F-35As planned. Royal Danish Air Force – 4 F-35As delivered of the 27 planned. Finnish Air Force – F-35A Block 4 selected via the HX fighter program to replace the current F/A-18 Hornets; 64 F-35As on order. Israeli Air Force – 30 delivered as of September 2021 (F-35I "Adir"), including one F-35 testbed aircraft for indigenous Israeli weapons, electronics, and structural upgrades, designated (AS-15). A total of 75 ordered with 75 planned. Italian Air Force – 12 F-35As delivered as of May 2020 and 1 F-35B delivered as of October 2020, at which point Italy planned to order 60 F-35As and 15 F-35Bs for the Italian Air Force. Italian Navy – 2 delivered as of October 2020; 15 F-35Bs planned for the Italian Navy. Japan Air Self-Defense Force – 23 F-35As operational as of December 2021, with a total order of 147, including 42 F-35Bs.
Royal Netherlands Air Force – 24 F-35As delivered and operational out of 46 ordered. Royal Norwegian Air Force – 31 F-35As delivered and operational as of 11 August 2021, of which 21 are in Norway and 10 are based in the US for training, of 52 F-35As planned in total; they differ from other F-35As through the addition of a drogue parachute. Polish Air Force – 32 F-35As on order, with an option for an additional 16. Republic of Korea Air Force – 40 F-35As delivered as of January 2022, with 20 more on order. Republic of Korea Navy – about 20 F-35Bs planned. Republic of Singapore Air Force – four F-35Bs to be ordered, with an option for eight more, as of March 2019. Royal Air Force and Royal Navy (owned by the RAF but jointly operated) – 27 F-35Bs received, with 23 in the UK after the loss of one aircraft in November 2021; the other three are in the US, where they are used for testing and training. 42 (24 FOC fighters and 18 training aircraft) to be fast-tracked by 2023; a total of 48 ordered as of 2021; a total of 60 to 80 F-35Bs are planned to be ordered. United States Air Force – 1,763 F-35As planned. United States Marine Corps – 353 F-35Bs and 67 F-35Cs planned. United States Navy – 273 F-35Cs planned. Order and approval cancellations Turkish Air Force – four F-35As delivered to Luke Air Force Base for training in July 2018; 30 were ordered, of up to 120 total planned. Future purchases have been banned by the U.S., with contracts canceled by early 2020. All four F-35As have been withheld at Luke Air Force Base and not sent to Turkey. Accidents and notable incidents On 23 June 2014, an F-35A's engine caught fire at Eglin AFB. The pilot escaped unharmed, while the aircraft sustained an estimated US$50 million in damage. The accident caused all flights to be halted on 3 July. The fleet returned to flight on 15 July with flight envelope restrictions. In June 2015, the USAF Air Education and Training Command (AETC) issued
Categories Food additives can be divided into several groups, although there is some overlap because some additives exert more than one effect. For example, salt is both a preservative and a flavoring. Acidulants Acidulants confer a sour or acid taste. Common acidulants include vinegar, citric acid, tartaric acid, malic acid, fumaric acid, and lactic acid. Acidity regulators Acidity regulators are used to control the pH of foods for stability or to affect the activity of enzymes. Anticaking agents Anticaking agents keep powders such as milk powder from caking or sticking. Antifoaming and foaming agents Antifoaming agents reduce or prevent foaming in foods; foaming agents do the reverse. Antioxidants Antioxidants such as vitamin C act as preservatives by inhibiting the degradation of food by oxygen. Bulking agents Bulking agents such as starch are additives that increase the bulk of a food without affecting its taste. Food coloring Colorings are added to food to replace colors lost during preparation or to make food look more attractive. Fortifying agents Vitamins, minerals, and dietary supplements are added to increase the nutritional value. Color retention agents In contrast to colorings, color retention agents are used to preserve a food's existing color. Emulsifiers Emulsifiers allow water and oils to remain mixed together in an emulsion, as in mayonnaise, ice cream, and homogenized milk. Flavors* Flavors are additives that give food a particular taste or smell, and may be derived from natural ingredients or created artificially. *In the EU, flavors do not have an E-code and are not considered food additives. Flavor enhancers Flavor enhancers enhance a food's existing flavors; a popular example is monosodium glutamate. Some flavor enhancers have their own flavors that are independent of the food. Flour treatment agents Flour treatment agents are added to flour to improve its color or its use in baking.
Glazing agents Glazing agents provide a shiny appearance or protective coating to foods. Humectants Humectants prevent foods from drying out. Tracer gas Tracer gases allow for package integrity testing to prevent foods from being exposed to the atmosphere, thus guaranteeing shelf life. Preservatives Preservatives prevent or inhibit spoilage of food due to fungi, bacteria, and other microorganisms. Stabilizers Stabilizers, thickeners, and gelling agents, like agar or pectin (used in jam, for example), give foods a firmer texture. While they are not true emulsifiers, they help to stabilize emulsions. Sweeteners Sweeteners are added to foods for flavoring. Sweeteners other than sugar are added to keep the food energy (calories) low, or because they have beneficial effects regarding diabetes mellitus, tooth decay, or diarrhea. Thickeners Thickening agents are substances which, when added to a mixture, increase its viscosity without substantially modifying its other properties. Packaging Bisphenols, phthalates, and perfluoroalkyl chemicals (PFCs) are indirect additives used in manufacturing or packaging. In July 2018, the American Academy of Pediatrics called for more careful study of those three substances, along with nitrates and food coloring, as they might harm children during development. Safety and regulation With the increasing use of processed foods since the 19th century, food additives have become more widely used. Many countries regulate their use. For example, boric acid was widely used as a food preservative from the 1870s to the 1920s, but was banned after World War I due to its toxicity, as demonstrated in animal and human studies. During World War II, the urgent need for cheap, available food preservatives led to it being used again, but it was finally banned in the 1950s. Such cases led to a general mistrust of food additives, and an application of the precautionary principle led to the conclusion that only additives that are known to be safe should be used in foods.
In the United States, this led to the adoption of the Delaney clause, an amendment to the Federal Food, Drug, and Cosmetic Act of 1938, stating that no carcinogenic substances may be used as food additives. However, after the banning of cyclamates in the United States and Britain in 1969, saccharin, the only remaining legal artificial sweetener at the time, was found to cause cancer in rats. Widespread public outcry in the United States, partly communicated to Congress by
not approved for use in Europe so does not have an E number, although it is approved for use in Australia and New Zealand. Since 1987, Australia has had an approved system of labelling for additives in packaged foods. Each food additive has to be named or numbered. The numbers are the same as in Europe, but without the prefix "E". The United States Food and Drug Administration (FDA) lists these items as "generally recognized as safe" (GRAS); they are listed under both their Chemical Abstracts Service number and FDA regulation under the United States Code of Federal Regulations. See list of food additives for a complete list of all the names.
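The numbering convention described above (European "E" codes; the same digits without the "E" prefix in Australia and New Zealand) can be illustrated with a toy lookup. The handful of codes below are well-known examples; the helper function names are ours, not part of any standard:

```python
# Toy illustration of the additive numbering convention:
# Europe prefixes the code with "E"; Australia/New Zealand use the bare number.
ADDITIVES = {
    300: "ascorbic acid (vitamin C)",   # antioxidant
    330: "citric acid",                 # acidulant
    621: "monosodium glutamate",        # flavor enhancer
}

def eu_label(code: int) -> str:
    """European-style label, e.g. 'E300'."""
    return f"E{code}"

def anz_label(code: int) -> str:
    """Australia/New Zealand style: same number, no 'E' prefix."""
    return str(code)

for code, name in ADDITIVES.items():
    print(f"{eu_label(code)} / {anz_label(code)}: {name}")
```

The point is only that the digits are shared across jurisdictions; whether an additive is *approved* under a given code still varies by country, as the saccharin and cyclamate cases show.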
Fram expedition

Planning

Nansen first began to consider the possibility of reaching the North Pole after reading meteorologist Henrik Mohn's theory on transpolar drift in 1884. Artefacts found on the coast of Greenland were identified as having come from the Jeannette expedition. In June 1881, Jeannette had been crushed and sunk off the Siberian coast—the opposite side of the Arctic Ocean. Mohn surmised that the location of the artefacts indicated the existence of an ocean current flowing from east to west, all the way across the polar sea and possibly over the pole itself. The idea remained fixed in Nansen's mind for the next couple of years. He developed a detailed plan for a polar venture after his triumphant return from Greenland, and made it public in February 1890, at a meeting of the newly formed Norwegian Geographical Society. Previous expeditions, he argued, had approached the North Pole from the west and failed because they were working against the prevailing east–west current; the secret was to work with the current. A workable plan would require a sturdy and manoeuvrable small ship, capable of carrying fuel and provisions for twelve men for five years. This ship would enter the ice pack close to the approximate location of Jeannette's sinking, drifting west with the current towards the pole and beyond it—eventually reaching the sea between Greenland and Spitsbergen. Experienced polar explorers were dismissive: Adolphus Greely called the idea "an illogical scheme of self-destruction". Equally dismissive were Sir Allen Young, a veteran of the searches for Franklin's lost expedition, and Sir Joseph Dalton Hooker, who had sailed to the Antarctic on the Ross expedition. Nansen nevertheless secured a grant from the Norwegian parliament after an impassioned speech; additional funding came from a national appeal for private donations.

Preparations

Nansen chose naval engineer Colin Archer to design and build a ship.
Archer designed an extraordinarily sturdy vessel with an intricate system of crossbeams and braces of the toughest oak timbers. Its rounded hull was designed to push the ship upwards when beset by pack ice. Speed and manoeuvrability were to be secondary to its ability to serve as a safe and warm shelter during the predicted confinement. The low length-to-beam ratio gave it a stubby appearance, justified by Archer: "A ship that is built with exclusive regard to its suitability for [Nansen's] object must differ essentially from any known vessel." It was christened Fram and launched on 6 October 1892. Nansen selected a party of twelve from thousands of applicants. Otto Sverdrup, who had taken part in Nansen's earlier Greenland expedition, was appointed as the expedition's second-in-command. Competition was so fierce that army lieutenant and dog-driving expert Hjalmar Johansen signed on as ship's stoker, the only position still available.

Into the ice

Fram left Christiania on 24 June 1893, cheered on by thousands of well-wishers. After a slow journey around the coast, the final port of call was Vardø, in the far north-east of Norway. Fram left Vardø on 21 July, following the North-East Passage route pioneered by Nordenskiöld in 1878–1879, along the northern coast of Siberia. Progress was impeded by fog and ice conditions in the mainly uncharted seas. The crew also experienced the dead water phenomenon, in which a ship's forward progress is impeded by friction caused by a layer of fresh water lying on top of heavier salt water. Nevertheless, Cape Chelyuskin, the most northerly point of the Eurasian continental mass, was passed on 10 September. Heavy pack ice was sighted ten days later at around latitude 78°N, as Fram approached the area in which Jeannette had been crushed. Nansen followed the line of the pack northwards to a position recorded as , before ordering engines stopped and the rudder raised. From this point Fram's drift began.
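The dead water effect mentioned above has a simple two-layer explanation: a ship moving near the phase speed of waves on the fresh/salt interface loses much of its propulsive energy to generating those internal waves. A rough estimate of that speed uses the standard reduced-gravity formula c ≈ √(g′h); the layer values below are illustrative, not historical measurements:

```python
import math

# Two-layer "dead water" estimate: interfacial wave speed c = sqrt(g' * h),
# where g' = g * (rho_salt - rho_fresh) / rho_salt is the reduced gravity
# and h is the thickness of the fresh upper layer.
g = 9.81            # gravitational acceleration, m/s^2
rho_fresh = 1000.0  # fresh meltwater density, kg/m^3 (illustrative)
rho_salt = 1025.0   # underlying seawater density, kg/m^3 (illustrative)
h = 5.0             # upper-layer thickness, m (illustrative)

g_reduced = g * (rho_salt - rho_fresh) / rho_salt
c = math.sqrt(g_reduced * h)  # interfacial wave phase speed, m/s
print(f"interfacial wave speed ≈ {c:.2f} m/s")
```

With these numbers the critical speed comes out near 1 m/s (about 2 knots), which is why a slow steamer like Fram could feel the drag so strongly while a faster vessel would not.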
The first weeks in the ice were frustrating, as the drift moved unpredictably; sometimes north, sometimes south. By 19 November, Fram's latitude was south of that at which she had entered the ice. Only after the turn of the year, in January 1894, did the northerly direction become generally settled; the 80°N mark was finally passed on 22 March. Nansen calculated that, at this rate, it might take the ship five years to reach the pole. As the ship's northerly progress continued at a rate rarely above a kilometre and a half per day, Nansen began privately to consider a new plan—a dog sledge journey towards the pole. With this in mind, he began to practice dog-driving, making many experimental journeys over the ice. In November, Nansen announced his plan: when the ship passed latitude 83°N, he and Hjalmar Johansen would leave the ship with the dogs and make for the pole while Fram, under Sverdrup, continued its drift until it emerged from the ice in the North Atlantic. After reaching the pole, Nansen and Johansen would make for the nearest known land, the recently discovered and sketchily mapped Franz Josef Land. They would then cross to Spitzbergen where they would find a ship to take them home. The crew spent the rest of the winter of 1894 preparing clothing and equipment for the forthcoming sledge journey. Kayaks were built, to be carried on the sledges until needed for the crossing of open water. Preparations were interrupted early in January when violent tremors shook the ship. The crew disembarked, fearing the vessel would be crushed, but Fram proved herself equal to the danger. On 8 January 1895, the ship's position was 83°34′N, above Greely's previous record of 83°24′N.

Dash for the pole

With the ship's latitude at 84°4′N and after two false starts, Nansen and Johansen began their journey on 14 March 1895. Nansen allowed 50 days to cover the to the pole, an average daily journey of .
After a week of travel, a sextant observation indicated they averaged per day, which put them ahead of schedule. However, uneven surfaces made skiing more difficult, and their speeds slowed. They also realised they were marching against a southerly drift, and that distances travelled did not necessarily equate to distance progressed. On 3 April, Nansen began to doubt whether the pole was attainable. Unless their speed improved, their food would not last them to the pole and back to Franz Josef Land. He confided in his diary: "I have become more and more convinced we ought to turn before time." Four days later, after making camp, he observed the way ahead was "... a veritable chaos of iceblocks stretching as far as the horizon." Nansen recorded their latitude as 86°13′6″N—almost three degrees beyond the previous record—and decided to turn around and head back south.

Retreat

At first Nansen and Johansen made good progress south, but they suffered a serious setback on 13 April when, in their eagerness to break camp, they forgot to wind their chronometers, making it impossible to calculate their longitude and navigate accurately to Franz Josef Land. They restarted the watches based on Nansen's guess that they were at 86°E, but from then on they were uncertain of their true position. The tracks of an Arctic fox were observed towards the end of April—the first trace of a living creature other than their dogs since they had left Fram. They soon saw bear tracks, and by the end of May saw evidence of nearby seals, gulls and whales. On 31 May, Nansen calculated they were only from Cape Fligely, Franz Josef Land's northernmost point. Travel conditions worsened as increasingly warm weather caused the ice to break up. On 22 June, the pair decided to rest on a stable ice floe while they repaired their equipment and gathered strength for the next stage of their journey. They remained on the floe for a month.
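The unwound chronometers were crippling because celestial navigation derives longitude from time: the Earth rotates 15° per hour, so any clock error maps directly onto a longitude error. A minimal sketch of the arithmetic (our own illustration, not from the source):

```python
# Longitude from time: the Earth turns 360 degrees in 24 hours, i.e. 15 per hour,
# so a chronometer error translates directly into a longitude error.
DEG_PER_HOUR = 360.0 / 24.0  # = 15.0

def longitude_error_deg(clock_error_minutes: float) -> float:
    """Longitude error (degrees) caused by a given clock error (minutes)."""
    return clock_error_minutes / 60.0 * DEG_PER_HOUR

# A chronometer that has drifted by just 4 minutes puts a fix off by a full
# degree of longitude, which is why Nansen and Johansen could no longer trust
# their position after letting the watches run down.
print(longitude_error_deg(4.0))  # → 1.0
```

At Arctic latitudes a degree of longitude is a short east–west distance, but the accumulated uncertainty of an unknown stoppage time could be far larger than this single example.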
The day after leaving this camp, Nansen recorded: "At last the marvel has come to pass—land, land, and after we had almost given up our belief in it!" Whether this still-distant land was Franz Josef Land or a new discovery they did not know—they had only a rough sketch map to guide them. The edge of the pack ice was reached on 6 August, and they shot the last of their dogs; since 24 April they had regularly killed the weakest to feed the others. The two kayaks were lashed together, a sail was raised, and they made for the land. It soon became clear that this land was part of an archipelago. As they moved southwards, Nansen tentatively identified a headland as Cape Felder on the western edge of Franz Josef Land. Towards the end of August, as the weather grew colder and travel became increasingly difficult, Nansen decided to camp for the winter. In a sheltered cove, with stones and moss for building materials, the pair erected a hut which was to be their home for the next eight months. With ready supplies of bear, walrus and seal to keep their larder stocked, their principal enemy was not hunger but inactivity. After muted Christmas and New Year celebrations, in slowly improving weather, they began to prepare to leave their refuge, but it was 19 May 1896 before they were able to resume their journey.

Rescue and return

On 17 June, during a stop for repairs after the kayaks had been attacked by a walrus, Nansen thought he heard a dog barking as well as human voices. He went to investigate, and a few minutes later saw the figure of a man approaching. It was the British explorer Frederick Jackson, who was leading an expedition to Franz Josef Land and was camped at Cape Flora on nearby Northbrook Island. The two were equally astonished by their encounter; after some awkward hesitation Jackson asked: "You are Nansen, aren't you?", and received the reply "Yes, I am Nansen."
Johansen was picked up and the pair were taken to Cape Flora where, during the following weeks, they recuperated from their ordeal. Nansen later wrote that he could "still scarcely grasp" their sudden change of fortune; had it not been for the walrus attack that caused the delay, the two parties might have been unaware of each other's existence. On 7 August, Nansen and Johansen boarded Jackson's supply ship Windward, and sailed for Vardø, where they arrived on the 13th. They were greeted by Henrik Mohn, the originator of the polar drift theory, who was in the town by chance. The world was quickly informed by telegram of Nansen's safe return, but as yet there was no news of Fram. Taking the weekly mail steamer south, Nansen and Johansen reached Hammerfest on 18 August, where they learned that Fram had been sighted. She had emerged from the ice north and west of Spitsbergen, as Nansen had predicted, and was now on her way to Tromsø. She had not passed over the pole, nor exceeded Nansen's northern mark. Without delay Nansen and Johansen sailed for Tromsø, where they were reunited with their comrades. The homeward voyage to Christiania was a series of triumphant receptions at every port. On 9 September, Fram was escorted into Christiania's harbour and welcomed by the largest crowds the city had ever seen. The crew were received by King Oscar, and Nansen, reunited with his family, remained at the palace for several days as special guests. Tributes arrived from all over the world; typical was that from the British mountaineer Edward Whymper, who wrote that Nansen had made "almost as great an advance as has been accomplished by all other voyages in the nineteenth century put together".

National figure

Scientist and polar oracle

Nansen's first task on his return was to write his account of the voyage. This he did remarkably quickly, producing 300,000 words of Norwegian text by November 1896; the English translation, titled Farthest North, was ready in January 1897.
The book was an instant success, and secured Nansen's long-term financial future. Nansen included without comment the one significant adverse criticism of his conduct, that of Greely, who had written in Harper's Weekly on Nansen's decision to leave Fram and strike for the pole: "It passes comprehension how Nansen could have thus deviated from the most sacred duty devolving on the commander of a naval expedition." During the 20 years following his return from the Arctic, Nansen devoted most of his energies to scientific work. In 1897 he accepted a professorship in zoology at the Royal Frederick University, which gave him a base from which he could tackle the major task of editing the reports of the scientific results of the Fram expedition. This was a much more arduous task than writing the expedition narrative. The results were eventually published in six volumes, and according to a later polar scientist, Robert Rudmose-Brown, "were to Arctic oceanography what the Challenger expedition results had been to the oceanography of other oceans." In 1900, Nansen became director of the Christiania-based International Laboratory for North Sea Research, and helped found the International Council for the Exploration of the Sea. Through his connection with the latter body, in the summer of 1900 Nansen embarked on his first visit to Arctic waters since the Fram expedition, a cruise to Iceland and Jan Mayen Land on the oceanographic research vessel Michael Sars, named after Eva's father. Shortly after his return he learned that his Farthest North record had been surpassed by members of the Duke of the Abruzzi's Italian expedition. They had reached 86°34′N on 24 April 1900, in an attempt to reach the North Pole from Franz Josef Land. Nansen received the news philosophically: "What is the value of having goals for their own sake? They all vanish ... it is merely a question of time." Nansen was now considered an oracle by all would-be explorers of the north and south polar regions.
Abruzzi had consulted him, as had the Belgian Adrien de Gerlache, each of whom took expeditions to the Antarctic. Although Nansen refused to meet his own countryman and fellow explorer Carsten Borchgrevink (whom he considered a fraud), he gave advice to Robert Falcon Scott on polar equipment and transport prior to the 1901–04 Discovery expedition. At one point Nansen seriously considered leading a South Pole expedition himself, and asked Colin Archer to design two ships; however, these plans remained on the drawing board. By 1901 Nansen's family had expanded considerably. A daughter, Liv, had been born just before Fram set out; a son, Kåre, was born in 1897, followed by a daughter, Irmelin, in 1900 and a second son, Odd, in 1901. The family home, which Nansen had built in 1891 from the profits of his Greenland expedition book, was now too small. Nansen acquired a plot of land in the Lysaker district and built, substantially to his own design, a large and imposing house which combined some of the characteristics of an English manor house with features from the Italian Renaissance. The house was ready for occupation by April 1902; Nansen called it Polhøgda (in English "polar heights"), and it remained his home for the rest of his life. A fifth and final child, son Asmund, was born at Polhøgda in 1903.

Politician and diplomat

The union between Norway and Sweden, imposed by the Great Powers in 1814, had been under considerable strain through the 1890s, the chief issue in question being Norway's rights to its own consular service. Nansen, although not by inclination a politician, had spoken out on the issue on several occasions in defence of Norway's interests. It seemed, early in the 20th century, that agreement between the two countries might be possible, but hopes were dashed when negotiations broke down in February 1905. The Norwegian government fell, and was replaced by one led by Christian Michelsen, whose programme was one of separation from Sweden.
In February and March Nansen published a series of newspaper articles which placed him firmly in the separatist camp. The new prime minister wanted Nansen in the cabinet, but Nansen had no political ambitions. However, at Michelsen's request he went to Berlin and then to London where, in a letter to The Times, he presented Norway's legal case for a separate consular service to the English-speaking world. On 17 May 1905, Norway's Constitution Day, Nansen addressed a large crowd in Christiania, saying: "Now have all ways of retreat been closed. Now remains only one path, the way forward, perhaps through difficulties and hardships, but forward for our country, to a free Norway". He also wrote a book, Norway and the Union with Sweden, to promote Norway's case abroad. On 23 May the Storting passed the Consulate Act establishing a separate consular service. King Oscar refused his assent; on 27 May the Norwegian cabinet resigned, but the king would not recognise this step. On 7 June the Storting unilaterally announced that the union with Sweden was dissolved. In a tense situation the Swedish government agreed to Norway's request that the dissolution should be put to a referendum of the Norwegian people. This was held on 13 August 1905 and resulted in an overwhelming vote for independence, at which point King Oscar relinquished the crown of Norway while retaining the Swedish throne. A second referendum, held in November, determined that the new independent state should be a monarchy rather than a republic. In anticipation of this, Michelsen's government had been considering the suitability of various princes as candidates for the Norwegian throne. Faced with King Oscar's refusal to allow anyone from his own House of Bernadotte to accept the crown, the favoured choice was Prince Charles of Denmark. In July 1905 Michelsen sent Nansen to Copenhagen on a secret mission to persuade Charles to accept the Norwegian throne. 
Nansen was successful; shortly after the second referendum Charles was proclaimed king, taking the name Haakon VII. He and his wife, the British princess Maud, were crowned in the Nidaros Cathedral in Trondheim on 22 June 1906. In April 1906 Nansen was appointed Norway's first Minister in London. His main task was to work with representatives of the major European powers on an Integrity Treaty which would guarantee Norway's position. Nansen was popular in England, and got on well with King Edward, though he found court functions and diplomatic duties disagreeable; "frivolous and boring" was his description. However, he was able to pursue his geographical and scientific interests through contacts with the Royal Geographical Society and other learned bodies. The Treaty was signed on 2 November 1907, and Nansen considered his task complete. Resisting the pleas of, among others, King Edward that he should remain in London, on 15 November Nansen resigned his post. A few weeks later, still in England as the king's guest at Sandringham, Nansen received word that Eva was seriously ill with pneumonia. On 8 December he set out for home, but before he reached Polhøgda he learned, from a telegram, that Eva had died.

Oceanographer and traveller

After a period of mourning, Nansen returned to London. He had been persuaded by his government to rescind his resignation until after King Edward's state visit to Norway in April 1908. His formal retirement from the diplomatic service was dated 1 May 1908, the same day on which his university professorship was changed from zoology to oceanography. This new designation reflected the general character of Nansen's more recent scientific interests. In 1905, he had supplied the Swedish physicist Walfrid Ekman with the data which established the principle in oceanography known as the Ekman spiral.
Based on Nansen's observations of ocean currents recorded during the Fram expedition, Ekman concluded that the effect of wind on the sea's surface produced currents which "formed something like a spiral staircase, down towards the depths". In 1909 Nansen combined with Bjørn Helland-Hansen to publish an academic paper, The Norwegian Sea: its Physical Oceanography, based on the Michael Sars voyage of 1900. Nansen had by now retired from polar
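The "spiral staircase" Ekman described has a compact mathematical form. In the standard textbook statement (not spelled out in the source), a steady wind-driven surface layer with constant vertical eddy viscosity $A_z$ and Coriolis parameter $f$ obeys, writing the horizontal velocity as one complex number:

```latex
\begin{aligned}
W(z) &= u(z) + i\,v(z), \\
i f\, W &= A_z \,\frac{d^{2} W}{dz^{2}}, \\
W(z) &= W(0)\, e^{(1+i)z/D}, \qquad D = \sqrt{\tfrac{2A_z}{f}}, \quad z \le 0 .
\end{aligned}
```

The factor $e^{z/D}$ makes the speed decay with depth while the factor $e^{iz/D}$ rotates the direction linearly, which is exactly the staircase picture; at the surface the current is deflected 45° from the wind stress, to the right in the Northern Hemisphere.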
mainly in the North Atlantic, and contributed to the development of modern oceanographic equipment. As one of his country's leading citizens, in 1905 Nansen spoke out for the ending of Norway's union with Sweden, and was instrumental in persuading Prince Carl of Denmark to accept the throne of the newly independent Norway. Between 1906 and 1908 he served as the Norwegian representative in London, where he helped negotiate the Integrity Treaty that guaranteed Norway's independent status. In the final decade of his life, Nansen devoted himself primarily to the League of Nations, following his appointment in 1921 as the League's High Commissioner for Refugees. In 1922 he was awarded the Nobel Peace Prize for his work on behalf of the displaced victims of World War I and related conflicts. Among the initiatives he introduced was the "Nansen passport" for stateless persons, a certificate that came to be recognized by more than 50 countries. He worked on behalf of refugees until his sudden death in 1930, after which the League established the Nansen International Office for Refugees to ensure that his work continued. This office received the Nobel Peace Prize in 1938. His name is commemorated in numerous geographical features, particularly in the polar regions.

Family background and childhood

The Nansen family originated in Denmark. Hans Nansen (1598–1667), a trader, was an early explorer of the White Sea region of the Arctic Ocean. In later life he settled in Copenhagen, becoming the city's borgmester in 1654. Later generations of the family lived in Copenhagen until the mid-18th century, when Ancher Antoni Nansen moved to Norway (then in a union with Denmark). His son, Hans Leierdahl Nansen (1764–1821), was a magistrate first in the Trondheim district, later in Jæren. After Norway's separation from Denmark in 1814, he entered national political life as the representative for Stavanger in the first Storting, and became a strong advocate of union with Sweden.
After suffering a paralytic stroke in 1821, Hans Leierdahl Nansen died, leaving a four-year-old son, Baldur Fridtjof Nansen, the explorer's father. Baldur was a lawyer without ambitions for public life, who became Reporter to the Supreme Court of Norway. He married twice, the second time to Adelaide Johanne Thekla Isidore Bølling Wedel-Jarlsberg from Bærum, a niece of Herman Wedel-Jarlsberg, who had helped frame the Norwegian constitution of 1814 and was later the Swedish king's Norwegian Viceroy. Baldur and Adelaide settled at Store Frøen, an estate at Aker, a few kilometres north of Norway's capital city, Christiania (since renamed Oslo). The couple had three children; the first died in infancy, the second, born 10 October 1861, was Fridtjof Wedel-Jarlsberg Nansen. Store Frøen's rural surroundings shaped the nature of Nansen's childhood. In the short summers the main activities were swimming and fishing, while in the autumn the chief pastime was hunting for game in the forests. The long winter months were devoted mainly to skiing, which Nansen began to practice at the age of two, on improvised skis. At the age of 10 he defied his parents and attempted the ski jump at the nearby Huseby installation. This exploit had near-disastrous consequences, as on landing the skis dug deep into the snow, pitching the boy forward: "I, head first, described a fine arc in the air ... [W]hen I came down again I bored into the snow up to my waist. The boys thought I had broken my neck, but as soon as they saw there was life in me ... a shout of mocking laughter went up." Nansen's enthusiasm for skiing was undiminished, though, as he records, his efforts were overshadowed by those of the skiers from the mountainous region of Telemark, where a new style of skiing was being developed. "I saw this was the only way", wrote Nansen later. At school, Nansen worked adequately without showing any particular aptitude.
Studies took second place to sports, or to expeditions into the forests where he would live "like Robinson Crusoe" for weeks at a time. Through such experiences Nansen developed a marked degree of self-reliance. He became an accomplished skier and a highly proficient skater. Life was disrupted when, in the summer of 1877, Adelaide Nansen died suddenly. Distressed, Baldur Nansen sold the Store Frøen property and moved with his two sons to Christiania. Nansen's sporting prowess continued to develop; at 18 he broke the world one-mile (1.6 km) skating record, and in the following year won the national cross-country skiing championship, a feat he would repeat on 11 subsequent occasions.

Student and adventurer

In 1880 Nansen passed his university entrance examination, the examen artium. He decided to study zoology, claiming later that he chose the subject because he thought it offered the chance of a life in the open air. He began his studies at the Royal Frederick University in Christiania early in 1881. Early in 1882 Nansen took "...the first fatal step that led me astray from the quiet life of science." Professor Robert Collett of the university's zoology department proposed that Nansen take a sea voyage, to study Arctic zoology at first hand. Nansen was enthusiastic, and made arrangements through a recent acquaintance, Captain Axel Krefting, commander of the sealer Viking. The voyage began on 11 March 1882 and extended over the following five months. In the weeks before sealing started, Nansen was able to concentrate on scientific studies. From water samples he showed that, contrary to previous assumption, sea ice forms on the surface of the water rather than below. His readings also demonstrated that the Gulf Stream flows beneath a cold layer of surface water. Through the spring and early summer Viking roamed between Greenland and Spitsbergen in search of seal herds. Nansen became an expert marksman, and on one day proudly recorded that his team had shot 200 seal.
In July, Viking became trapped in the ice close to an unexplored section of the Greenland coast; Nansen longed to go ashore, but this was impossible. However, he began to develop the idea that the Greenland icecap might be explored, or even crossed. On 17 July the ship broke free from the ice, and early in August was back in Norwegian waters. Nansen did not resume formal studies at the university. Instead, on Collett's recommendation, he accepted a post as curator in the zoological department of the Bergen Museum. He was to spend the next six years of his life there—apart from a six-month sabbatical tour of Europe—working and studying with leading figures such as Gerhard Armauer Hansen, the discoverer of the leprosy bacillus, and Daniel Cornelius Danielssen, the museum's director, who had turned it from a backwater collection into a centre of scientific research and education. Nansen's chosen area of study was the then relatively unexplored field of neuroanatomy, specifically the central nervous system of lower marine creatures. Before leaving for his sabbatical in February 1886 he published a paper summarising his research to date, in which he stated that "anastomoses or unions between the different ganglion cells" could not be demonstrated with certainty. This unorthodox view was confirmed by the simultaneous researches of the embryologist Wilhelm His and the psychiatrist August Forel. Nansen is considered the first Norwegian defender of the neuron theory, originally proposed by Santiago Ramón y Cajal. His subsequent paper, The Structure and Combination of Histological Elements of the Central Nervous System, published in 1887, became his doctoral thesis.

Crossing of Greenland

Planning

The idea of an expedition across the Greenland icecap grew in Nansen's mind throughout his Bergen years. In 1887, after the submission of his doctoral thesis, he finally began organising this project.
Before then, the two most significant penetrations of the Greenland interior had been those of Adolf Erik Nordenskiöld in 1883, and Robert Peary in 1886. Both had set out from Disko Bay on the western coast, and had travelled about eastward before turning back. By contrast, Nansen proposed to travel from east to west, ending rather than beginning his trek at Disko Bay. A party setting out from the inhabited west coast would, he reasoned, have to make a return trip, as no ship could be certain of reaching the dangerous east coast and picking them up. By starting from the east—assuming that a landing could be made there—Nansen's would be a one-way journey towards a populated area. The party would have no line of retreat to a safe base; the only way to go would be forward, a situation that fitted Nansen's philosophy completely. Nansen rejected the complex organisation and heavy manpower of other Arctic ventures, and instead planned his expedition for a small party of six. Supplies would be manhauled on specially designed lightweight sledges. Much of the equipment, including sleeping bags, clothing and cooking stoves, also needed to be designed from scratch. These plans received a generally poor reception in the press; one critic had no doubt that "if [the] scheme be attempted in its present form ... the chances are ten to one that he will ... uselessly throw his own and perhaps others' lives away". The Norwegian parliament refused to provide financial support, believing that such a potentially risky undertaking should not be encouraged. The project was eventually launched with a donation from a Danish businessman, Augustin Gamél; the rest came mainly from small contributions from Nansen's countrymen, through a fundraising effort organised by students at the university. Despite the adverse publicity, Nansen received numerous applications from would-be adventurers. 
He wanted expert skiers, and attempted to recruit from the skiers of Telemark, but his approaches were rebuffed. Nordenskiöld had advised Nansen that Sami people, from Finnmark in the far north of Norway, were expert snow travellers, so Nansen recruited a pair, Samuel Balto and Ole Nielsen Ravna. The remaining places went to Otto Sverdrup, a former sea-captain who had more recently worked as a forester; Oluf Christian Dietrichson, an army officer, and Kristian Kristiansen, an acquaintance of Sverdrup's. All had experience of outdoor life in extreme conditions, and were experienced skiers. Just before the party's departure, Nansen attended a formal examination at the university, which had agreed to receive his doctoral thesis. In accordance with custom he was required to defend his work before appointed examiners acting as "devil's advocates". He left before knowing the outcome of this process. Expedition The sealer Jason picked up Nansen's party on 3 June 1888 from the Icelandic port of Ísafjörður. They sighted the Greenland coast a week later, but thick pack ice hindered progress. With the coast still away, Nansen decided to launch the small boats. They were within sight of Sermilik Fjord on 17 July; Nansen believed it would offer a route up the icecap. The expedition left Jason "in good spirits and with the highest hopes of a fortunate result." Days of extreme frustration followed as they drifted south. Weather and sea conditions prevented them from reaching the shore. They spent most of the time camping on the ice itself—it was too dangerous to launch the boats. By 29 July, they found themselves south of the point where they had left the ship. That day they finally reached land but were too far south to begin the crossing. After a brief rest, Nansen ordered the team back into the boats to begin rowing north. The party battled northward along the coast through the ice floes for the next 12 days.
They encountered a large Eskimo encampment on the first day, near Cape Steen Bille. Occasional contacts with the nomadic native population continued as the journey progressed. The party reached Umivik Bay on 11 August, after covering . Nansen decided they needed to begin the crossing; although they were still far south of his intended starting place, the season was becoming too advanced. After they landed at Umivik, they spent the next four days preparing for their journey. They set out on the evening of 15 August, heading north-west towards Christianhaab on the western shore of Disko Bay— away. Over the next few days, the party struggled to ascend. The inland ice had a treacherous surface with many hidden crevasses and the weather was bad. At one point progress stopped for three days because of violent storms and continuous rain. On 26 August, Nansen concluded that they would not be able to reach Christianhaab before the last ship was due to leave in mid-September. He ordered a change of course due west, towards Godthaab; a shorter journey by at least . The rest of the party, according to Nansen, "hailed the change of plan with acclamation." They continued climbing until 11 September and reached a height of above sea level. Temperatures on the summit of the icecap dropped to at night. From then on the downward slope made travelling easier. Yet, the terrain was rugged and the weather remained hostile. Progress was slow: fresh snowfalls made dragging the sledges like pulling them through sand. On 26 September, they battled their way down the edge of a fjord westward towards Godthaab. Sverdrup constructed a makeshift boat out of parts of the sledges, willows, and their tent. Three days later, Nansen and Sverdrup began the last stage of the journey: rowing down the fjord. On 3 October, they reached Godthaab, where the Danish town representative greeted them.
He first informed Nansen that he had secured his doctorate, a matter that "could not have been more remote from [Nansen's] thoughts at that moment." The team accomplished their crossing in 49 days. Throughout the journey, they maintained meteorological, geographical and other records relating to the previously unexplored interior. The rest of the team arrived in Godthaab on 12 October. Nansen soon learned no ship was likely to call at Godthaab until the following spring. Still, they were able to send letters back to Norway via a boat leaving Ivigtut at the end of October. He and his party spent the next seven months in Greenland. On 15 April 1889, the Danish ship Hvidbjørnen finally entered the harbour. Nansen recorded: "It was not without sorrow that we left this place and these people, among whom we had enjoyed ourselves so well." Interlude and marriage Hvidbjørnen reached Copenhagen on 21 May 1889. News of the crossing had preceded its arrival, and Nansen and his companions were feted as heroes. This welcome, however, was dwarfed by the reception in Christiania a week later, when crowds of between thirty and forty thousand—a third of the city's population—thronged the streets as the party made its way to the first of a series of receptions. The interest and enthusiasm generated by the expedition's achievement led directly to the formation that year of the Norwegian Geographical Society. Nansen accepted the position of curator of the Royal Frederick University's zoology collection, a post which carried a salary but involved no duties; the university was satisfied by the association with the explorer's name. Nansen's main task in the following weeks was writing his account of the expedition, but he found time late in June to visit London, where he met the Prince of Wales (the future Edward VII), and addressed a meeting of the Royal Geographical Society (RGS).
The RGS president, Sir Mountstuart Elphinstone Grant Duff, said that Nansen had claimed "the foremost place amongst northern travellers", and later awarded him the Society's prestigious Founder's Medal. This was one of many honours Nansen received from institutions all over Europe. He was invited by a group of Australians to lead an expedition to Antarctica, but declined, believing that Norway's interests would be better served by a North Pole conquest. On 11 August 1889 Nansen announced his engagement to Eva Sars, the daughter of Michael Sars, a zoology professor who had died when Eva was 11 years old. The couple had met some years previously, at the skiing resort of Frognerseteren, where Nansen recalled seeing "two feet sticking out of the snow". Eva was three years older than Nansen, and despite the evidence of this first meeting, was an accomplished skier. She was also a celebrated classical singer who had been coached in Berlin by Désirée Artôt, one-time paramour of Tchaikovsky. The engagement surprised many; since Nansen had previously expressed himself forcefully against the institution of marriage, Otto Sverdrup assumed he had read the message wrongly. The wedding took place on 6 September 1889, less than a month after the engagement. Fram expedition Planning Nansen first began to consider the possibility of reaching the North Pole after reading meteorologist Henrik Mohn's theory on transpolar drift in 1884. Artefacts found on the coast of Greenland were identified as having come from the Jeannette expedition. In June 1881, Jeannette had been crushed and sunk off the Siberian coast—the opposite side of the Arctic Ocean. Mohn surmised the location of the artefacts indicated the existence of an ocean current from east to west, all the way across the polar sea and possibly over the pole itself. The idea remained fixed in Nansen's mind for the next couple of years. He developed a detailed plan for a polar venture after his triumphant return from Greenland.
He made his idea public in February 1890, at a meeting of the newly formed Norwegian Geographical Society. Previous expeditions, he argued, approached the North Pole from the west and failed because they were working against the prevailing east–west current; the secret was to work with the current. A workable plan would require a sturdy and manoeuvrable small ship, capable of carrying fuel and provisions for twelve men for five years. This ship would enter the ice pack close to the approximate location of Jeannette's sinking, drifting west with the current towards the pole and beyond it—eventually reaching the sea between Greenland and Spitsbergen. Experienced polar explorers were dismissive: Adolphus Greely called the idea "an illogical scheme of self-destruction". Equally dismissive were Sir Allen Young, a veteran of the searches for Franklin's lost expedition, and Sir Joseph Dalton Hooker, who had sailed to the Antarctic on the Ross expedition. Nansen still managed to secure a grant from the Norwegian parliament after an impassioned speech. Additional funding was secured through a national appeal for private donations. Preparations Nansen chose naval engineer Colin Archer to design and build a ship. Archer designed an extraordinarily sturdy vessel with an intricate system of crossbeams and braces of the toughest oak timbers. Its rounded hull was designed to push the ship upwards when beset by pack ice. Speed and manoeuvrability were to be secondary to its ability as a safe and warm shelter during their predicted confinement. The length-to-beam ratio— and —gave it a stubby appearance, justified by Archer: "A ship that is built with exclusive regard to its suitability for [Nansen's] object must differ essentially from any known vessel." It was christened Fram and launched on 6 October 1892. Nansen selected a party of twelve from thousands of applicants. 
Otto Sverdrup, who took part in Nansen's earlier Greenland expedition, was appointed as the expedition's second-in-command. Competition was so fierce that army lieutenant and dog-driving expert Hjalmar Johansen signed on as ship's stoker, the only position still available. Into the ice Fram left Christiania on 24 June 1893, cheered on by thousands of well-wishers. After a slow journey around the coast, the final port of call was Vardø, in the far north-east of Norway. Fram left Vardø on 21 July, following the North-East Passage route pioneered by Nordenskiöld in 1878–1879, along the northern coast of Siberia. Progress was impeded by fog and ice conditions in the mainly uncharted seas. The crew also experienced the dead water phenomenon, where a ship's forward progress is impeded by friction caused by a layer of fresh water lying on top of heavier salt water. Nevertheless, Cape Chelyuskin, the most northerly point of the Eurasian continental mass, was passed on 10 September. Heavy pack ice was sighted ten days later at around latitude 78°N, as Fram approached the area in which Jeannette was crushed. Nansen followed the line of the pack northwards to a position recorded as , before ordering engines stopped and the rudder raised. From this point Fram's drift began. The first weeks in the ice were frustrating, as the drift moved unpredictably; sometimes north, sometimes south. By 19 November, Fram's latitude was south of that at which she had entered the ice. Only after the turn of the year, in January 1894, did the northerly direction become generally settled; the 80°N mark was finally passed on 22 March. Nansen calculated that, at this rate, it might take the ship five years to reach the pole. As the ship's northerly progress continued at a rate rarely above a kilometre and a half per day, Nansen began privately to consider a new plan—a dog sledge journey towards the pole. With this in mind, he began to practise dog-driving, making many experimental journeys over the ice.
In November, Nansen announced his plan: when the ship passed latitude 83°N, he and Hjalmar Johansen would leave the ship with the dogs and make for the pole while Fram, under Sverdrup, continued its drift until it emerged from the ice in the North Atlantic. After reaching the pole, Nansen and Johansen would make for the nearest known land, the recently discovered and sketchily mapped Franz Josef Land. They would then cross to Spitsbergen, where they would find a ship to take them home. The crew spent the rest of the winter of 1894 preparing clothing and equipment for the forthcoming sledge journey. Kayaks were built, to be carried on the sledges until needed for the crossing of open water. Preparations were interrupted early in January when violent tremors shook the ship. The crew disembarked, fearing the vessel would be crushed, but Fram proved herself equal to the danger. On 8 January 1895, the ship's position was 83°34′N, above Greely's previous record of 83°24′N. Dash for the pole With the ship's latitude at 84°4′N and after two false starts, Nansen and Johansen began their journey on 14 March 1895. Nansen allowed 50 days to cover the to the pole, an average daily journey of . After a week of travel, a sextant observation indicated they averaged per day, which put them ahead of schedule. However, uneven surfaces made skiing more difficult, and their speeds slowed. They also realised they were marching against a southerly drift, and that distances travelled did not necessarily equate to distance progressed. On 3 April, Nansen began to doubt whether the pole was attainable. Unless their speed improved, their food would not last them to the pole and back to Franz Josef Land. He confided in his diary: "I have become more and more convinced we ought to turn before time." Four days later, after making camp, he observed the way ahead was "... a veritable chaos of iceblocks stretching as far as the horizon."
Nansen recorded their latitude as 86°13′6″N—almost three degrees beyond the previous record—and decided to turn around and head back south. Retreat At first Nansen and Johansen made good progress south, but suffered a serious setback on 13 April when, in their eagerness to break camp, they forgot to wind their chronometers, making it impossible to calculate their longitude and navigate accurately to Franz Josef Land. They restarted the watches based on Nansen's guess they were at 86°E. From then on they were uncertain of their true position. The tracks of an Arctic fox were observed towards the end of April. It was the first trace of a living creature other than their dogs since they had left Fram. They soon saw bear tracks and by the end of May saw evidence of nearby seals, gulls and whales. On 31 May, Nansen calculated they were only from Cape Fligely, Franz Josef Land's northernmost point. Travel conditions worsened as increasingly warmer weather caused the ice to break up. On 22 June, the pair decided to rest on a stable ice floe while they repaired their equipment and gathered strength for the next stage of their journey. They remained on the floe for a month. The day after leaving this camp, Nansen recorded: "At last the marvel has come to pass—land, land, and after we had almost given up our belief in it!" Whether this still-distant land was Franz Josef Land or a new discovery they did not know—they had only a rough sketch map to guide them. The edge of the pack ice was reached on 6 August and they shot the last of their dogs—the weakest of which they had killed regularly since 24 April to feed the others. The two kayaks were lashed together, a sail was raised, and they made for the land. It soon became clear this land was part of an archipelago. As they moved southwards, Nansen tentatively identified a headland as Cape Felder on the western edge of Franz Josef Land.
Towards the end of August, as the weather grew colder and travel became increasingly difficult, Nansen decided to camp for the winter. In a sheltered cove, with stones and moss for building materials, the pair erected a hut which was to be their home for the next eight months. With ready supplies of bear, walrus and seal to keep their larder stocked, their principal enemy was not hunger but inactivity. After muted Christmas and New Year celebrations, in slowly improving weather, they began to prepare to leave their refuge, but it was 19 May 1896 before they were able to resume their journey. Rescue and return On 17 June, during a stop for repairs after the kayaks had been attacked by a walrus, Nansen thought he heard a dog barking as well as human voices. He went to investigate, and a few minutes later saw the figure of a man approaching. It was the British explorer Frederick Jackson, who was leading an expedition to Franz Josef Land and was camped at Cape Flora on nearby Northbrook Island. The two were equally astonished by their encounter; after some awkward hesitation Jackson asked: "You are Nansen, aren't you?", and received the reply "Yes, I am Nansen." Johansen was picked up and the pair were taken to Cape Flora where, during the following weeks, they recuperated from their ordeal. Nansen later wrote that he could "still scarcely grasp" their sudden change of fortune; had it not been for the walrus attack that caused the delay, the two parties might have been unaware of each other's existence. On 7 August, Nansen and Johansen boarded Jackson's supply ship Windward, and sailed for Vardø where they arrived on the 13th. They were greeted by Hans Mohn, the
Augustus brought Free Autonomy to the cities. Also, by an edict of 17 March of that year, the farmers were freed from the corvée and hereditary submission. King of Saxony On 6 June 1836, King Anton died and Frederick Augustus succeeded him. An intelligent man, he was quickly popular with the people, as he had been since the time of his regency. The new king attended to political questions only from a sense of duty, mostly preferring to leave such matters in the hands of his ministers. The Criminal Code of 1836 created a standardized jurisdiction for Saxony. During the Revolutionary disturbances of 1848 (March Revolution), he appointed liberal ministers in the government, lifted censorship, and enacted a liberal electoral law. Later his attitude changed. On 28 April Frederick Augustus II dissolved the Parliament. In 1849, Frederick Augustus was forced to flee to the Königstein Fortress. The May Uprising was crushed by Saxon and Prussian troops and Frederick Augustus was able to return after only a few days. Journey through England and Scotland In 1844 Frederick Augustus, accompanied by his personal physician Carl Gustav Carus, made an informal (incognito) visit to England and Scotland. Among the places they visited was Lyme Regis, where he purchased from the local fossil collector and dealer, Mary Anning, an ichthyosaur skeleton for his own extensive natural history collection. It was not a state visit, but the King was the guest of Queen Victoria and Prince Albert at Windsor Castle, visited many of the sights in London and in the university cities of Oxford and Cambridge, and toured widely in England, Wales and Scotland. Accidental Death During a journey in Tyrol, he had an accident in Brennbüchel in which he fell in front of a horse that stepped on his head. On 8 August 1854, he died in the Gasthof Neuner. He was buried on 16 August in the Katholische Hofkirche of Dresden.
In his memory, the Dowager Queen Maria arranged to establish the Königskapelle (King's Chapel) at the accident site, which was consecrated one year later; some of the last members of the Saxon royal family, including Maria Emanuel, Margrave of Meissen, are buried beside the chapel. Marriages In Vienna on 26 September 1819 (by proxy) and again in Dresden on 7 October 1819 (in person), Frederick Augustus married firstly the Archduchess Maria Caroline of Austria (Maria Karoline Ferdinande Theresia Josephine Demetria), daughter of Emperor Francis I of Austria. They had no children. In Dresden on 24 April 1833 Frederick Augustus married secondly Princess Maria Anna of Bavaria; this marriage was also childless.
well functioning market. It is suggested this would both eliminate the need for regular taxes that have a negative effect on trade (see deadweight loss) and release land and resources that are speculated upon or monopolised, two features that improve competition and free-market mechanisms. Winston Churchill supported this view, stating: "Land is the mother of all monopoly". The American economist and social philosopher Henry George, the most famous proponent of this thesis, wanted to accomplish this through a high land value tax that replaces all other taxes. Followers of his ideas are often called Georgists, geoists or geolibertarians. Léon Walras, one of the founders of neoclassical economics who helped formulate the general equilibrium theory, had a very similar view. He argued that free competition could only be realized under conditions of state ownership of natural resources and land. Additionally, income taxes could be eliminated because the state would receive income to finance public services through owning such resources and enterprises. Laissez-faire The laissez-faire principle expresses a preference for an absence of non-market pressures on prices and wages such as those from discriminatory government taxes, subsidies, tariffs, regulations, or government-granted monopolies. In The Pure Theory of Capital, Friedrich Hayek argued that the goal is the preservation of the unique information contained in the price itself. According to Karl Popper, the idea of the free market is paradoxical, as it requires interventions towards the goal of preventing interventions. Although laissez-faire has been commonly associated with capitalism, there is a similar economic theory associated with socialism called left-wing or socialist laissez-faire, also known as free-market anarchism, free-market anti-capitalism and free-market socialism to distinguish it from laissez-faire capitalism.
Critics of laissez-faire as commonly understood argue that a truly laissez-faire system would be anti-capitalist and socialist. American individualist anarchists such as Benjamin Tucker saw themselves as economic free-market socialists and political individualists while arguing that their "anarchistic socialism" or "individual anarchism" was "consistent Manchesterism". Socialism Various forms of socialism based on free markets have existed since the 19th century. Early notable socialist proponents of free markets include Pierre-Joseph Proudhon, Benjamin Tucker and the Ricardian socialists. These economists believed that genuinely free markets and voluntary exchange could not exist within the exploitative conditions of capitalism. These proposals ranged from various forms of worker cooperatives operating in a free-market economy such as the mutualist system proposed by Proudhon, to state-owned enterprises operating in unregulated and open markets. These models of socialism are not to be confused with other forms of market socialism (e.g. the Lange model) where publicly owned enterprises are coordinated by various degrees of economic planning, or where capital good prices are determined through marginal cost pricing. Advocates of free-market socialism such as Jaroslav Vanek argue that genuinely free markets are not possible under conditions of private ownership of productive property. Instead, he contends that the class differences and inequalities in income and power that result from private ownership enable the interests of the dominant class to skew the market to their favor, either in the form of monopoly and market power, or by utilizing their wealth and resources to legislate government policies that benefit their specific business interests. 
Additionally, Vanek states that workers in a socialist economy based on cooperative and self-managed enterprises have stronger incentives to maximize productivity because they would receive a share of the profits (based on the overall performance of their enterprise) in addition to receiving their fixed wage or salary. The stronger incentives to maximize productivity that he conceives as possible in a socialist economy based on cooperative and self-managed enterprises might be accomplished in a free-market economy if employee-owned companies were the norm as envisioned by various thinkers including Louis O. Kelso and James S. Albus. Socialists also assert that free-market capitalism leads to an excessively skewed distribution of income and economic instabilities, which in turn lead to social instability. Corrective measures in the form of social welfare, redistributive taxation and regulation, together with the administrative costs they require, create agency costs for society. These costs would not be required in a self-managed socialist economy. Concepts Economic equilibrium The general equilibrium theory has demonstrated that, under certain theoretical conditions of perfect competition, the law of supply and demand influences prices toward an equilibrium that balances the demands for the products against the supplies. At these equilibrium prices, the market distributes the products to the purchasers according to each purchaser's preference or utility for each product and within the relative limits of each buyer's purchasing power. This result is described as market efficiency, or more specifically a Pareto optimum. Low barriers to entry A free market does not directly require the existence of competition; however, it does require a framework that freely allows new market entrants.
Hence, competition in a free market is a consequence of the conditions of a free market, including that market participants not be obstructed from following their profit motive. Perfect competition and market failure An absence of any of the conditions of perfect competition is considered a market failure. Regulatory intervention may provide a substitute force to counter a market failure, which leads some economists to believe that some forms of market regulation may be better than an unregulated market at providing a free market. Spontaneous order Friedrich Hayek popularized the view that market economies promote spontaneous order which results in a better "allocation of societal resources than any design could achieve". According to this view, market economies are characterized by the formation of complex transactional networks that produce and distribute goods and services throughout the economy. These networks are not designed, but they nevertheless emerge as a result of decentralized individual economic decisions. The idea of spontaneous order is an elaboration on the invisible hand proposed by Adam Smith in The Wealth of Nations. About the individual, Smith wrote: By preferring the support of domestic to that of foreign industry, he intends only his own security; and by directing that industry in such a manner as its produce may be of the greatest value, he intends only his own gain, and he is in this, as in many other cases, led by an invisible hand to promote an end which was no part of his intention. Nor is it always the worse for society that it was no part of it. By pursuing his own interest, he frequently promotes that of the society more effectually than when he really intends to promote it. I have never known much good done by those who affected to trade for the public good. Smith pointed out that one does not get one's dinner by appealing to the brother-love of the butcher, the farmer or the baker. 
Rather, one appeals to their self-interest and pays them for their labor, arguing: It is not from the benevolence of the butcher, the brewer or the baker, that we expect our dinner, but from their regard to their own self-interest. We address ourselves, not to their humanity but to their self-love, and never talk to them of our own necessities but of their advantages. Supporters of this view claim that spontaneous order is superior to any order that does not allow individuals to make their own choices of what to produce, what to buy, what to sell and at what prices due to the number and complexity of the factors involved. They further believe that any attempt to implement central planning will result in more disorder, or a less efficient production and distribution of goods and services. Critics such as political economist Karl Polanyi question whether a spontaneously ordered market can exist, completely free of distortions of political policy, claiming that even the ostensibly freest markets require a state to exercise coercive power in some areas, namely to enforce contracts, govern the formation of labor unions, spell out the rights and obligations of corporations, shape who has standing to bring legal actions and define what constitutes an unacceptable conflict of interest. Supply and demand Demand for an item (such as goods or services) refers to the economic market pressure from people trying to buy it. Buyers have a maximum price they are willing to pay for an item, and sellers have a minimum price at which they are willing to offer their product. The point at which the supply and demand curves meet is the equilibrium price of the good and quantity demanded. Sellers willing to offer their goods at a lower price than the equilibrium price receive the difference as producer surplus. Buyers willing to pay for goods at a higher price than the equilibrium price receive the difference as consumer surplus. 
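The equilibrium price and the consumer and producer surpluses described above can be made concrete with a small numeric sketch. The linear curves and all the numbers below are hypothetical illustrations, not taken from the article:

```python
# Hypothetical linear curves: demand P = a - b*Q, supply P = c + d*Q.
# The equilibrium is where the two curves meet.

def equilibrium(a, b, c, d):
    """Solve a - b*q = c + d*q for quantity, then substitute back for price."""
    q = (a - c) / (b + d)
    p = a - b * q
    return p, q

def surpluses(a, c, p, q):
    """Triangular areas between each curve and the equilibrium price line."""
    consumer = 0.5 * (a - p) * q  # buyers who would have paid more than p
    producer = 0.5 * (p - c) * q  # sellers who would have accepted less than p
    return consumer, producer

p, q = equilibrium(a=100, b=2, c=20, d=2)  # p = 60.0, q = 20.0
cs, ps = surpluses(100, 20, p, q)          # cs = 400.0, ps = 400.0
```

Here `a` and `c` are the demand and supply intercepts and `b` and `d` the slopes; with these symmetric example values the surplus happens to split evenly between buyers and sellers.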
The model is commonly applied to wages in the market for labor. The typical roles of supplier and consumer are reversed. The suppliers are individuals, who try to sell (supply) their labor for the highest price. The consumers are businesses, which try to buy (demand) the type of labor they need at the lowest price. As more people offer their labor in that market, the equilibrium wage decreases and the equilibrium level of employment increases as the supply curve shifts to the right. The opposite happens if fewer people offer their labor in the market, as the supply curve shifts to the left. In a free market, individuals and firms taking part in these transactions have the liberty to enter, leave and participate in the market as they so choose. Prices and quantities are allowed to adjust according to economic conditions in order to reach equilibrium and allocate resources. However, in many countries around the world governments seek to intervene in the free market in order to achieve certain social or political agendas. Governments may attempt to create social equality or equality of outcome by intervening in the market through actions such as imposing a minimum wage (price floor) or erecting price controls (price ceiling). Other lesser-known goals are also pursued, such as in the United States, where the federal government subsidizes owners of fertile land to not grow crops in order to prevent the supply curve from further shifting to the right and decreasing the equilibrium price. This is done under the justification of maintaining farmers' profits; due to the relative inelasticity of demand for crops, increased supply would lower the price but not significantly increase quantity demanded, thus placing pressure on farmers to exit the market. Those interventions are often done in the name of maintaining basic assumptions of free markets such as the idea that the costs of production must be included in the price of goods.
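The effect of a binding price floor such as a minimum wage set above the equilibrium wage can be sketched with the same kind of hypothetical linear curves (all numbers below are illustrative assumptions, not from the article):

```python
# Hypothetical linear labor market: demand w = a - b*q, supply w = c + d*q.
# A wage floor above the equilibrium reduces the quantity of labor demanded,
# raises the quantity supplied, and the gap is excess supply (unemployment).

def excess_supply(a, b, c, d, floor):
    q_demanded = (a - floor) / b   # employers hire less at the higher wage
    q_supplied = (floor - c) / d   # more workers offer labor at that wage
    return max(0.0, q_supplied - q_demanded)

# With these numbers the unconstrained equilibrium wage is 60; a floor of 70
# leaves 10 units of offered labor unhired.
gap = excess_supply(a=100, b=2, c=20, d=2, floor=70)  # gap = 10.0
```

A floor set at or below the equilibrium wage is non-binding and produces no gap, which is why the analysis only applies when the minimum is above the market-clearing level.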
Pollution and depletion costs are sometimes not included in the cost of production (for example, a manufacturer that withdraws water at one location and then discharges it polluted downstream, avoiding the cost of treating the water), so governments may opt to impose regulations in an attempt to internalize all of the costs of production and ultimately include them in the price of the goods. Advocates of the free market contend that government intervention hampers economic growth by disrupting the efficient allocation of resources according to supply and demand, while critics of the free market contend that government intervention is sometimes necessary to protect a country's economy from better-developed and more influential economies, while providing the stability necessary for wise long-term investment. Milton Friedman argued against central planning, price controls and state-owned corporations, particularly as practiced in the Soviet Union and China, while Ha-Joon Chang cites the examples of post-war Japan and the growth of South Korea's steel industry as positive examples of government intervention. Criticism Critics of a laissez-faire free market have argued that in real-world situations it has proven to be susceptible to the development of price-fixing monopolies. Such reasoning has led to government intervention, e.g. the United States antitrust law. Two prominent Canadian authors argue that government at times has to intervene to ensure competition in large and important industries.
Naomi Klein illustrates this roughly in her work The Shock Doctrine, and John Ralston Saul illustrates it more humorously through various examples in The Collapse of Globalism and the Reinvention of the World. While its supporters argue that only a free market can create healthy competition and therefore more business and reasonable prices, opponents say that a free market in its purest form may result in the opposite. According to Klein and Ralston Saul, the merging of companies into giant corporations or the privatization of government-run industry and national assets often results in monopolies or oligopolies requiring government intervention to force competition and reasonable prices. Another form of market failure is speculation, where transactions are made to profit from short-term fluctuation rather than from the intrinsic value of the companies or products. This criticism has been challenged by historians such as Lawrence Reed, who argued that monopolies have historically failed to form even in the absence of antitrust law. This is because monopolies are inherently difficult to maintain: a company that tries to maintain its monopoly by buying out new competitors, for instance, incentivizes newcomers to enter the market in hope of a buy-out.
Furthermore, according to writer Walter Lippmann and economist Milton Friedman, historical analysis of the formation of monopolies reveals that, contrary to popular belief, these were the result not of unfettered market forces, but of legal privileges granted by government. American philosopher and author Cornel West has derisively termed what he perceives as dogmatic arguments for laissez-faire economic policies as free-market fundamentalism. West has contended that such mentality "trivializes the concern for public interest" and "makes money-driven, poll-obsessed elected officials deferential to corporate goals of profit – often at the cost of the common good". American political philosopher Michael J. Sandel contends that in the last thirty years the United States has moved beyond just having a market economy and has become a market society where literally everything is for sale, including aspects of social and civic life such as education, access to justice and political influence. The economic historian Karl Polanyi was highly critical of the idea of the market-based society in his book The Great Transformation, noting that any attempt at its creation would undermine human society and the common good. David McNally of the University of Houston argues in the Marxist tradition that the logic of the market inherently produces inequitable outcomes and leads to unequal exchanges, arguing that Adam Smith's moral intent and moral philosophy espousing equal exchange
was rebuilt by Holman Moody in California to handle the 7.0-liter FE (427 ci) engine from the Ford Galaxie, used in NASCAR at the time and modified for road course use. The car's chassis was similar to the British-built Mk I chassis, but it and other parts of the car had to be redesigned and modified by Holman Moody to accommodate the larger and heavier 427 engine. A new Kar Kraft-built four-speed gearbox replaced the ZF five-speed used in the Mk I. This car is sometimes called the Ford Mk II. In 1966, the three teams racing the Mk II (Chris Amon and Bruce McLaren, Denny Hulme and Ken Miles, and Dick Hutcherson and Ronnie Bucknum) dominated Le Mans, taking European audiences by surprise and beating Ferrari with a 1-2-3 finish. The Ford GT40 went on to win the race for the next three years. For 1967, the Mk IIs were upgraded to "B" spec; they had re-designed bodywork and twin Holley carburetors for additional power. A batch of improperly heat-treated input shafts in the transaxles sidelined virtually every Ford in the race at Daytona, however, and Ferrari won 1-2-3. The Mk IIBs were also used for Sebring and Le Mans that year and won the Reims 12 Hours in France. For the Daytona 24 Hours, two Mk II models (chassis 1016 and 1047) had their engines re-badged as Mercury engines, as Ford saw a good opportunity to advertise that division of the company. In 2018, a Mk II that finished 3rd overall at the 1966 Le Mans 24 Hours was sold by RM Sotheby's for $9,795,000 (£7,624,344), the highest price achieved for a GT40 at auction. Mk III The Mk III was a road car only, of which seven were built. The car had four headlamps, the rear part of the body was expanded to make room for luggage, the 4.7-liter engine was detuned, the shock absorbers were softened, the shift lever was moved to the center, an ashtray was added, and the car was available with the steering wheel on the left side of the car.
As the Mk III looked significantly different from the racing models, many customers interested in buying a GT40 for road use chose instead to buy a Mk I that was available from Wyer Ltd. Of the seven Mk IIIs produced, four were left-hand drive. J-car In an effort to develop a car with better aerodynamics (potentially resulting in superior control and speed compared to competitors), the decision was made to re-conceptualize and redesign everything about the vehicle other than its powerful 7-liter engine. This meant abandoning the original Mk I/Mk II chassis. In order to bring the car into alignment with Ford's "in house" ideology at the time, more restrictive partnerships were implemented with English firms, which resulted in the sale of Ford Advanced Vehicles (acquired by John Wyer), ultimately leading to a new vehicle designed by Ford's studios and produced by Ford's subsidiary Kar Kraft under Ed Hull. There was also a partnership with the Brunswick Aircraft Corporation for expertise on the novel use of aluminum honeycomb panels bonded together to form a lightweight, rigid "tub". The car was designated the J-car, as it was constructed to meet the new Appendix J regulations introduced by the FIA in 1966. The first J-car was completed in March 1966 and set the fastest time at the Le Mans trials that year. Both the tub and the complete car were substantially lighter than the Mk II. It was decided to run the Mk IIs due to their proven reliability, however, and little or no development was done on the J-car for the rest of the season. Following Le Mans, the development program for the J-car was resumed, and a second car was built. During a test session at Riverside International Raceway in August 1966 with Ken Miles driving, the car suddenly went out of control at the end of Riverside's high-speed, 1-mile-long back straight.
The aluminum honeycomb chassis did not live up to its design goal, shattering upon impact. The car burst into flames, killing Miles. It was determined that the unique, flat-topped "bread van" aerodynamics of the car, lacking any sort of spoiler, were implicated in generating excess lift. Therefore, a conventional but significantly more aerodynamic body was designed for the subsequent development of the J-car, which was officially known as the Mk IV. A total of nine cars were constructed with J-car chassis numbers, although six were designated as Mk IVs and one as the G7A. Mk IV The Mk IV was built around a reinforced J chassis powered by the same 7.0 L engine as the Mk II. Excluding the engine, gearbox, some suspension parts and the brakes from the Mk II, the Mk IV was totally different from other GT40s, using a specific, all-new chassis and bodywork. It was undoubtedly the most radical and American variant of all the GT40s over the years. As a direct result of the Miles accident, the team installed a NASCAR-style steel-tube roll cage in the Mk IV, which made it much safer, but the roll cage was so heavy that it negated most of the weight saving of the then-highly advanced, radically innovative honeycomb-panel construction. The Mk IV had a long, streamlined shape, which gave it exceptional top speed, crucial to doing well at Le Mans in those days (a circuit made up predominantly of straights), the race it was ultimately built for. A 2-speed automatic gearbox was tried, but during the extensive testing of the J-car in 1966 and 1967, it was decided that the 4-speed from the Mk II would be retained. Dan Gurney often complained about the weight of the Mk IV, since the car was heavier than the Ferrari 330 P4. During practice at Le Mans in 1967, in an effort to preserve the highly stressed brakes, Gurney developed a strategy (also adopted by co-driver A.J.
Foyt) of backing completely off the throttle several hundred yards before the approach to the Mulsanne hairpin and virtually coasting into the braking area. This technique saved the brakes, but the resulting increase in the car's recorded lap times during practice led to speculation within the Ford team that Gurney and Foyt, in an effort to compromise on chassis settings, had hopelessly "dialed out" their car. The car proved to be fastest in a straight line that year, thanks to its streamlined aerodynamics, achieving 212 mph on the 3.6-mile Mulsanne Straight. The Mk IV ran in only two races, the 1967 12 Hours of Sebring and the 1967 24 Hours of Le Mans, and won both events. Only one Mk IV was completed for Sebring; the pressure from Ford had increased considerably after the company's humiliation at Daytona two months earlier. Mario Andretti and Bruce McLaren won Sebring; Dan Gurney and A. J. Foyt won Le Mans, where the Ford-backed Shelby-American and Holman & Moody teams each entered two Mk IVs (Gurney and Foyt's car was apparently the Mk IV least likely to win). The installation of the roll cage was ultimately credited by many with saving the life of Andretti, who crashed violently at the Esses during the 1967 Le Mans 24 Hours but escaped with minor injuries. Unlike the earlier Mk I–III cars, which were built in England, the Mk IVs were built in the United States by Kar Kraft. Le Mans 1967 remains the only all-American victory in Le Mans history—American drivers, team, chassis, engine, and tires. A total of six Mk IVs were constructed. One of the Mk IVs was rebuilt as the Ford G7 in 1968 and used in the Can-Am series for 1969 and 1970, but with no success. This car is sometimes called the Ford Mk IV. Mk V For years Peter Thorp had searched for a GT40 in good condition. Most of the cars had problems, including the dreaded rust issue.
His company, Safir Engineering, was building and fielding Formula 3 race cars and also had a Token Formula One car purchased from Ron Dennis's company, Rondel Racing. Formula One events in which Safir Engineering competed included Brands Hatch and Silverstone. Safir was also redesigning Range Rovers, modifying them to six-wheel drive and exporting them. Safir's technical capabilities were such that it could rebuild GT40s. It was with this in mind that Thorp approached John Willment for his thoughts. It was soon decided that there would be a limited further run of the significant GT40. JW Engineering would oversee the build, and Safir was to do the work. The continued JW Engineering/Safir Engineering production would use sequential serial numbers starting from the last used GT40 serial number. Maintaining the GT40 Mark nomenclature, this continued production would be named GT40 Mk V. JW Engineering wished to complete the GT40 chassis numbers GT40P-1087, 1088 and 1089. This was supposed to take place prior to the beginning of Safir production; however, the completion of these three chassis was much delayed. Ford's Len Bailey was hired to inspect the proposed build and engineer any changes he thought prudent to ensure the car was safe, as well as to minimize problems experienced in the past. Bailey changed the front suspension to Alan Mann specifications, which minimized nose-dive under braking. Zinc-coated steel replaced the previous uncoated, rust-prone sheet metal. The vulnerable drive donuts were replaced with CV joints and the leak-prone rubber gas tanks were replaced with aluminum tanks. The GT40 chassis was upgraded without making any major changes. Tennant Panels supplied the roof structure and the balance of the chassis was completed by Safir. Bill Pink, noted for his electrical experience and the wiring installation of previous GT40s, was brought in.
Also, Jim Rose was hired for his experience working at both Alan Mann and Shelby. After the manufacture of chassis 1120, John Etheridge was hired to manage the GT40 build. The chassis was supplied by Adams McCall Engineering and parts were supplied by Tennant Panels. For the most part, the Mk V very closely resembled the Mk I car, although there were a few changes, and, as with the '60s production, very few cars were identical. The first car, GT40P-1090, had an open top in place of roofed doors. Most engines were Ford small-blocks fitted with Weber or four-barrel carburetors. Safir produced five big-block GT40s, serial numbers GT40P-1128 to GT40P-1132. These aluminum big-block cars all had easily removable door roof sections. Most GT40s were high-performance street cars; however, some of the Mk V production can be described as full race cars. Two road cars, GT40P-1133 (roadster) and GT40P-1142 (roofed doors), were built as lightweights with an aluminum honeycomb chassis and carbon fiber bodywork. Continuation models, replicas and modernizations Several kit cars and replicas inspired by the Ford GT40 have been built. They are generally intended for assembly in a home workshop or garage. There are two alternatives to the kit car approach: continuation models (exact and licensed replicas true to the original GT40) and modernizations (replicas with upgraded components, ergonomics and trim for improved usability, drivability, and performance). GT40/R Competition, United States: Authentic GT40 built by Superformance and co-designed with Pathfinder Motorsports. This is the only GT40 continuation licensed by Safir GT40 Spares LLC, the holder of the GT40 trademark. A GT40/R (GT40P/2094) campaigned by Pathfinder Motorsports with an engine built by Holman Moody won both the 2009 US Vintage Grand Prix and the 2009 Governor's Cup at Watkins Glen. Southern GT: Built in Swanmore, Southampton, UK. Specializes in GT40 Mk1 and Mk2, as well as Lola T70.
Kit form or fully built to the customer's specifications. CAV GT: Originally designed for customers to build as a kit, the CAV GT has evolved into a modernized replica that is now factory-built in Cape Town, South Africa. Holman Moody: its GT40 Mark II finished third at Le Mans in 1966, and the company can still manufacture a Holman GT from 1966 blueprints. GT40 Spyder, United States: Built by E.R.A. Replica Automobiles in New Britain, CT, the Spyder is a Mk II Canadian-American (Can-Am) racing replica. The ERA GT is "No Longer Available" according to their website (October 3, 2021). Ford GT At the 1995 North American International Auto Show, the Ford GT90 concept was shown, and at the 2002 show, a new GT40 Concept was unveiled by Ford. While similar in appearance to the original cars, it was bigger, wider, and 3 inches (76 mm) taller than the original's 40 inches (1,020 mm). Three production prototype cars were shown in 2003 as part of Ford's centenary, and delivery of the production Ford GT began in the fall of 2004. The Ford GT was assembled in the Ford Wixom plant and painted by Saleen, Incorporated at their Saleen Special Vehicles plant in Troy, Michigan. A British company, Safir Engineering, which continued to produce a limited number of GT40s (the Mk V) in the 1980s under an agreement with Walter Hayes of Ford and John Willment of J.W. Automotive Engineering, owned the GT40 trademark at that time; when it completed production, it sold the excess parts, tooling, design, and trademark to a small American company called Safir GT40 Spares, Limited, based in Ohio. Safir GT40 Spares licensed the use of the GT40 trademark to Ford for the initial 2002 show car, but when Ford decided to make the production vehicle, negotiations between the two failed, and as a result, the new Ford GT does not wear the GT40 badge. Bob Wood, one of three partners who own Safir GT40 Spares, said: "When we talked with Ford, they asked what we wanted.
We said that Ford owns Beanstalk in New York, the company that licenses the Blue Oval for Ford on such things as T-shirts. Since Beanstalk gets 7.5 percent of the retail cost of the item for licensing the name, we suggested 7.5 percent on each GT40 sold." In this instance, Ford wished to purchase, not just license, the GT40 trademark. At the then-estimated $125,000 per copy, 7.5% of 4,500 vehicles would have totalled approximately $42,187,500. It was widely and erroneously reported, following an Automotive News Weekly story, that Safir "demanded" the $40 million for the sale of the trademark. Discussions between Safir and Ford ensued; however, the Ford Motor Company in fact never made a written offer to purchase the famed GT40 trademark. Later models or prototypes have also been called the Ford GT but have carried different designations, such as the Ford GT90 or the Ford GT70. The GT40 name and trademark is currently licensed to Superformance in the USA. A second-generation Ford GT was unveiled at the 2015 North American International Auto Show. It features a 3.5L twin-turbocharged V6 engine, carbon fiber monocoque and body panels, pushrod suspension and active aerodynamics. It entered the 2016 season of the FIA World Endurance Championship and the United SportsCar Championship, and started being sold in a street-legal version at Ford dealerships in 2017. See also Ford Supervan, a van-bodied variant Bundle of Snakes, characteristic exhaust system Colotti Trasmissioni, transmission of the initial and early models AC Cobra, a car of similar Anglo-American parentage Ford v Ferrari, 2019 film about the GT40's development References Further reading "17 Ford GT40s Stampede into Pebble Beach! We Dive into Their Histories" (with historic and modern photo gallery), by Don Sherman, Car and Driver, August 2016. "An American Challenge", Ford press release, 1966.
Auto Passion n°49, July 1991 (in French). La Revue de l'Automobile Historique n°7, March/April 2001 (in French). Ford: The Dust and the Glory: A Motor Racing History by Leo Levine, 1968. Ford vs. Ferrari: The Battle for Le Mans by Anthony Pritchard, Zuma Marketing, 1984. Ford GT-40: An Individual History and Race Record by Ronnie Spain, 1986. Go Like Hell: Ford, Ferrari, and Their Battle for Speed and Glory at Le Mans by A. J. Baime. 12 Hours of Sebring 1965 by Harry Hurst and Dave Friedman. Ford GT40 Manual: An Insight into Owning, Racing and Maintaining Ford's Legendary Sports Racing Car (Haynes Owners' Workshop Manuals) by Gordon Bruce.
did not live up to its design goal, shattering upon impact. The car burst into flames, killing Miles. It was determined that the unique, flat-topped "bread van" aerodynamics of the car, lacking any sort of spoiler, were implicated in generating excess lift. Therefore, a conventional but significantly more aerodynamic body was designed for the subsequent development of the J-car which was officially known as the Mk IV. A total of nine cars were constructed with J-car chassis numbers although six were designated as Mk IVs and one as the G7A. Mk IV The Mk IV was built around a reinforced J chassis powered by the same 7.0 L engine as the Mk II. Excluding the engine, gearbox, some suspension parts and the brakes from the Mk II, the Mk IV was totally different from other GT40s, using a specific, all-new chassis and bodywork. It was undoubtedly the most radical and American variant of all the GT40's over the years. As a direct result of the Miles accident, the team installed a NASCAR-style steel-tube roll cage in the Mk IV, which made it much safer, but the roll cage was so heavy that it negated most of the weight saving of the then-highly advanced, radically innovative honeycomb-panel construction. The Mk IV had a long, streamlined shape, which gave it exceptional top speed, crucial to do well at Le Mans in those days (a circuit made up predominantly of straights)—the race it was ultimately built for. A 2-speed automatic gearbox was tried, but during the extensive testing of the J-car in 1966 and 1967, it was decided that the 4-speed from the Mk II would be retained. Dan Gurney often complained about the weight of the Mk IV, since the car was heavier than the Ferrari 330 P4's. During practice at Le Mans in 1967, in an effort to preserve the highly stressed brakes, Gurney developed a strategy (also adopted by co-driver A.J. 
Foyt) of backing completely off the throttle several hundred yards before the approach to the Mulsanne hairpin and virtually coasting into the braking area. This technique saved the brakes, but the resulting increase in the car's recorded lap times during practice led to speculation within the Ford team that Gurney and Foyt, in an effort to compromise on chassis settings, had hopelessly "dialed out" their car. The car proved to be fastest in a straight line that year, thanks to its streamlined aerodynamics, achieving 212 mph on the 3.6-mile Mulsanne Straight. The Mk IV ran in only two races, the 1967 12 Hours of Sebring and the 1967 24 Hours of Le Mans and won both events. Only one Mk IV was completed for Sebring; the pressure from Ford had been amped up considerably after Ford's humiliation at Daytona two months earlier. Mario Andretti and Bruce McLaren won Sebring, Dan Gurney and A. J. Foyt won Le Mans (Gurney and Foyt's car was the Mk IV that was apparently least likely to win), where the Ford-representing Shelby-American and Holman & Moody teams showed up to Le Mans with 2 Mk IVs each. The installation of the roll cage was ultimately credited by many with saving the life of Andretti, who crashed violently at the Esses during the 1967 Le Mans 24 Hours, but escaped with minor injuries. Unlike the earlier Mk I - III cars, which were built in England, the Mk IVs were built in the United States by Kar Kraft. Le Mans 1967 remains the only all-American victory in Le Mans history—American drivers, team, chassis, engine, and tires. A total of six Mk IVs were constructed. One of the Mk IVs was rebuilt to the Ford G7 in 1968, and used in the Can-Am series for 1969 and 1970, but with no success. This car is sometimes called the Ford Mk IV. Mk V For years Peter Thorp had searched for a GT40 in good condition. Most of the cars had problems including the dreaded rust issue. 
His company, Safir Engineering, was building and fielding Formula 3 race cars, in addition, had a Token Formula One car purchased from the Ron Dennis Company, Rondell Racing. Formula One events in which Safir Engineering competed included Brands Hatch and Silverstone. Safir was also redesigning Range Rovers modifying the unit to six-wheel drive and exporting them. Safir technical capabilities were such that they could rebuild GT40s. It was with this in mind that Thorp approached John Willment for his thoughts. It was soon decided that there would be a limited, further run of the significant GT40. JW Engineering would oversee the build, and Safir was to do the work. The continued JW Engineering/Safir Engineering production would utilize sequential serial numbers starting at the last used GT40 serial number and move forward. Maintaining the GT40 Mark nomenclature, this continued production would be named GT40 Mk V. JW Engineering wished to complete the GT40 chassis numbers GT40P-1087, 1088 and 1089. This was supposed to take place prior to the beginning of Safir production, however, the completion of these three chassis’ was very much delayed. Ford's Len Bailey was hired to inspect the proposed build and engineer any changes he thought prudent to ensure the car was safe, as well as minimize problems experienced in the past. Baily changed the front suspension to Alan Mann specifications, which minimized nose-dive under braking. Zinc coated steel replaced the previous uncoated rust-prone sheet metal. The vulnerable drive donuts were replaced with CV joints and the leak-prone rubber gas tanks were replaced with aluminum tanks. The GT40 chassis was upgraded without making any major changes. Tennant Panels supplied the roof structure and the balance of the chassis was completed by Safir. Bill Pink, noted for his electrical experience and the wiring installation of previous GT40s, was brought in. 
Also, Jim Rose was hired for his experience with working at both Alan Mann and Shelby. After the manufacture of chassis 1120, John Etheridge was hired to manage the GT40 build. The chassis was supplied from Adams McCall Engineering and parts supplied from Tennant panels. For the most part, the Mk V resembled very closely the Mk I car, although there were a few changes, and, as with the '60s production, very few cars were identical. The first car, GT40P-1090, had an open-top in place of roofed doors. Most motors were Ford small block, Webers or 4 Barrel Carburetor. Safir produced five Big Block GT40s, serial numbers GT40P-1128 to GT40P-1132. These aluminum big block cars all had easily removable door roof sections. Most GT40s were high-performance street cars however some of the Mk V production can be described as full race. Two road cars GT40P-1133 (roadster) and GT40P-1142 (roofed doors) were built as lightweights which included an aluminum honeycomb chassis and carbon fiber bodywork. Continuation models, replicas and modernizations Several kit cars and replicas inspired by the Ford GT40 have been built. They are generally intended for assembly in a home workshop or garage. There are two alternatives to the kit car approach, either continuation models (exact and licensed replicas true to the original GT40) or modernizations (replicas with upgraded components, ergonomics & trim for improved usability, drivability, and performance). GT40/R Competition, United States: Authentic GT40 built by Superformance and co-designed with Pathfinder Motorsports. This is the only GT40 continuation licensed by Safir GT40 Spares LLC, the holders of the GT40 trademark. A GT40/R (GT40P/2094) campaigned by Pathfinder Motorsports with an engine built by Holman Moody won both the 2009 US Vintage Grand Prix and the 2009 Governor's Cup at Watkins Glen. Southern GT: Built-in Swanmore, Southampton, UK. Specializing in GT40 Mk1 and Mk2, as well as Lola T70. 
Available in kit form or fully built to customer specifications. CAV GT: Originally designed for customers to build as a kit, the CAV GT has evolved into a modernized replica that is now factory-built in Cape Town, South Africa. Holman Moody: its GT40 Mark II finished third at Le Mans in 1966, and the company can still manufacture a Holman GT from 1966 blueprints. GT40 Spyder, United States: Built by E.R.A. Replica Automobiles in New Britain, CT, the Spyder is a MK2 Canadian-American (Can-Am) racing replica. The ERA GT is "No Longer Available" according to the company's website (October 3, 2021). Ford GT At the 1995 North American International Auto Show, the Ford GT90 concept was shown, and at the 2002 show a new GT40 Concept was unveiled by Ford. While similar in appearance to the original cars, it was bigger, wider, and 3 inches (76 mm) taller than the original's 40 inches (1,020 mm). Three production prototype cars were shown in 2003 as part of Ford's centenary, and delivery of the production Ford GT began in the fall of 2004. The Ford GT was assembled in the Ford Wixom plant and painted by Saleen, Incorporated at their
hydrophobic environments, due to its minimal side chain of only one hydrogen atom. History and etymology Glycine was discovered in 1820 by the French chemist Henri Braconnot when he hydrolyzed gelatin by boiling it with sulfuric acid. He originally called it "sugar of gelatin", but the French chemist Jean-Baptiste Boussingault showed that it contained nitrogen. The American scientist Eben Norton Horsford, then a student of the German chemist Justus von Liebig, proposed the name "glycocoll"; however, the Swedish chemist Berzelius suggested the simpler name "glycine". The name comes from the Greek word γλυκύς "sweet tasting" (which is also related to the prefixes glyco- and gluco-, as in glycoprotein and glucose). In 1858, the French chemist Auguste Cahours determined that glycine was an amine of acetic acid. Production Although glycine can be isolated from hydrolyzed protein, this is not used for industrial production, as it can be manufactured more conveniently by chemical synthesis. The two main processes are amination of chloroacetic acid with ammonia, giving glycine and ammonium chloride, and the Strecker amino acid synthesis, which is the main synthetic method in the United States and Japan. About 15 thousand tonnes are produced annually in this way. Glycine is also cogenerated as an impurity in the synthesis of EDTA, arising from reactions of the ammonia coproduct. Chemical reactions Its acid–base properties are most important. In aqueous solution, glycine is amphoteric: below pH = 2.4, it converts to the ammonium cation called glycinium. Above about 9.6, it converts to glycinate. Glycine functions as a bidentate ligand for many metal ions, forming amino acid complexes. A typical complex is Cu(glycinate)2, i.e. Cu(H2NCH2CO2)2, which exists both in cis and trans isomers. With acid chlorides, glycine converts to the amidocarboxylic acid, such as hippuric acid and acetylglycine. With nitrous acid, one obtains glycolic acid (van Slyke determination). 
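The amphoteric behaviour described above can be sketched numerically. The following is a minimal illustration using the approximate pKa values quoted (about 2.4 for the carboxyl group and 9.6 for the amino group) and treating the two protonations as sequential equilibria; the function name and sample pH values are our own choices, not from the source.

```python
# Sketch: mole fraction of each glycine species as a function of pH,
# from the two dissociation constants quoted above (pKa ~2.4 and ~9.6).
# Values are illustrative, not a substitute for tabulated data.

def glycine_species_fractions(ph, pka1=2.4, pka2=9.6):
    """Return (glycinium, zwitterion, glycinate) mole fractions at a given pH."""
    k1, k2 = 10.0 ** -pka1, 10.0 ** -pka2
    h = 10.0 ** -ph
    # Unnormalized populations from the two sequential dissociations:
    cation = h * h        # glycinium,  H3N+-CH2-COOH
    zwitterion = h * k1   # zwitterion, H3N+-CH2-COO-
    anion = k1 * k2       # glycinate,  H2N-CH2-COO-
    total = cation + zwitterion + anion
    return cation / total, zwitterion / total, anion / total

for ph in (1.0, 6.0, 12.0):
    c, z, a = glycine_species_fractions(ph)
    print(f"pH {ph}: glycinium {c:.3f}, zwitterion {z:.3f}, glycinate {a:.3f}")
```

At strongly acidic pH the glycinium cation dominates, near neutrality the zwitterion does, and above pH 9.6 glycinate takes over, matching the thresholds given in the text.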
With methyl iodide, the amine becomes quaternized to give trimethylglycine, a natural product: H3N+CH2CO2− + 3 CH3I → (CH3)3N+CH2CO2− + 3 HI Glycine condenses with itself to give peptides, beginning with the formation of glycylglycine: 2 H3N+CH2CO2− → H3N+CH2CONHCH2CO2− + H2O Pyrolysis of glycine or glycylglycine gives 2,5-diketopiperazine, the cyclic diamide. It forms esters with alcohols. These are often isolated as their hydrochlorides, e.g., glycine methyl ester hydrochloride. Otherwise the free ester tends to convert to diketopiperazine. As a bifunctional molecule, glycine reacts with many reagents. These reactions can be classified into N-centered and carboxylate-centered reactions. Metabolism Biosynthesis Glycine is not essential to the human diet, as it is biosynthesized in the body from the amino acid serine, which is in turn derived from 3-phosphoglycerate; however, the metabolic capacity for glycine biosynthesis does not satisfy the need for collagen synthesis. In most organisms, the enzyme serine hydroxymethyltransferase catalyses this transformation via the cofactor pyridoxal phosphate: serine + tetrahydrofolate → glycine + N5,N10-methylene tetrahydrofolate + H2O In the liver of vertebrates, glycine synthesis is catalyzed by glycine synthase (also called glycine cleavage enzyme).
one study, the half-life varied between 0.5 and 4.0 hours. Glycine levels are extremely sensitive to antibiotics that target folate: blood glycine levels drop severely within a minute of antibiotic injection, and some antibiotics can deplete more than 90% of glycine within a few minutes of being administered. Physiological function The principal function of glycine is to act as a precursor to proteins. Most proteins incorporate only small quantities of glycine, a notable exception being collagen, which contains about 35% glycine due to its periodically repeated role in the formation of collagen's helix structure in conjunction with hydroxyproline. In the genetic code, glycine is coded by all codons starting with GG, namely GGU, GGC, GGA and GGG. As a biosynthetic intermediate In higher eukaryotes, δ-aminolevulinic acid, the key precursor to porphyrins, is biosynthesized from glycine and succinyl-CoA by the enzyme ALA synthase. Glycine provides the central C2N subunit of all purines. As a neurotransmitter Glycine is an inhibitory neurotransmitter in the central nervous system, especially in the spinal cord, brainstem, and retina. When glycine receptors are activated, chloride enters the neuron via ionotropic receptors, causing an inhibitory postsynaptic potential (IPSP). Strychnine is a strong antagonist at ionotropic glycine receptors, whereas bicuculline is a weak one. Glycine is a required co-agonist along with glutamate for NMDA receptors. In contrast to its inhibitory role in the spinal cord, glycine is facilitatory at the glutamatergic NMDA receptors, which are excitatory. The LD50 of glycine is 7930 mg/kg in rats (oral), and it usually causes death by hyperexcitability. Uses In the US, glycine is typically sold in two grades: United States Pharmacopeia ("USP") and technical grade. USP grade sales account for approximately 80 to 85 percent of the U.S. market for glycine.
If purity greater than the USP standard is needed, for example for intravenous injections, a more expensive pharmaceutical-grade glycine can be used. Technical-grade glycine, which may or may not meet USP standards, is sold at a lower price for use in industrial applications, e.g., as an agent in metal complexing and finishing. Animal and human foods Glycine is not widely used in foods for its nutritional value, except in infusions. Instead, glycine's role in food chemistry is as a flavorant. It is mildly sweet, and it counters the aftertaste of saccharin. It also has preservative properties, perhaps owing to its complexation with metal ions. Metal glycinate complexes, e.g. copper(II) glycinate, are used as supplements for animal feeds. Chemical feedstock Glycine is an intermediate in the synthesis of a variety of chemical products. It is used in the manufacture of the herbicides glyphosate, iprodione, glyphosine, imiprothrin, and eglinazine. It is also used as an intermediate for medicines such as thiamphenicol. Laboratory research Glycine is a significant component of some solutions used in the SDS-PAGE method of protein analysis. It serves as a buffering agent, maintaining pH and preventing sample damage during electrophoresis. Glycine is also used to remove protein-labeling antibodies from Western blot membranes, enabling the probing of numerous proteins of interest from the same SDS-PAGE gel. This allows more data to be drawn from the same specimen, increasing the reliability of the data and reducing the amount of sample processing and the number of samples required. This process is known as stripping. Presence in space The presence of glycine outside the Earth was confirmed in 2009, based on the analysis of samples that had been taken in 2004 by the NASA spacecraft Stardust from comet Wild 2 and subsequently returned to Earth. Glycine had previously been identified in the Murchison meteorite in 1970.
The discovery of glycine in outer space bolstered the hypothesis of so-called soft panspermia, which claims that the "building blocks" of life are widespread throughout the universe. In 2016, the detection of glycine within comet 67P/Churyumov–Gerasimenko by the Rosetta spacecraft was announced. The detection of glycine outside the Solar System in the interstellar medium has been debated. In 2008, the Max Planck Institute for Radio Astronomy discovered the spectral lines of a glycine precursor (aminoacetonitrile) in the Large Molecule Heimat, a giant gas cloud near the galactic center in the constellation Sagittarius. Evolution Several independent evolutionary studies using different types of data have suggested that glycine belongs to a group of amino acids that constituted the early genetic code. For example, low-complexity regions in proteins, which may resemble the proto-peptides of the early genetic code, are highly enriched in glycine.
call-in program, which first aired in 1998 on KUSP, GeekSpeak has been a weekly podcast since 2004. The program's slogan is "Bridging the gap between geeks and the rest of humanity". History GeekSpeak was created and originally broadcast on KUSP by Chris Neklason of Cruzio, Steve Schaefer of Guenther Computer, and board operator Ray Price from KUSP. Shortly thereafter, Mark Hanford of Cruzio joined the program. Currently, the host/producer is Lyle Troxell, who took over in September 2000. In April 2016, citing financial
produced by a vibrating string stretched between two fixed points. Historically, a guitar was constructed from wood with its strings made of catgut. Steel guitar strings were introduced near the end of the nineteenth century in the United States; nylon strings came in the 1940s. The guitar's ancestors include the gittern, the vihuela, the four-course Renaissance guitar, and the five-course baroque guitar, all of which contributed to the development of the modern six-string instrument. There are three main types of modern guitar: the classical guitar (Spanish guitar/nylon-string guitar); the steel-string acoustic guitar or electric guitar; and the Hawaiian guitar (played across the player's lap). Traditional acoustic guitars include the flat top guitar (typically with a large sound hole) and the archtop guitar, which is sometimes called a "jazz guitar". The tone of an acoustic guitar is produced by the strings' vibration, amplified by the hollow body of the guitar, which acts as a resonating chamber. The classical Spanish guitar is often played as a solo instrument using a comprehensive fingerstyle technique where each string is plucked individually by the player's fingers, as opposed to being strummed. The term "finger-picking" can also refer to a specific tradition of folk, blues, bluegrass, and country guitar playing in the United States. Electric guitars, first patented in 1937, use a pickup and amplifier to make the instrument loud enough to be heard; this also enabled the manufacture of guitars with a solid block of wood, needing no resonant chamber. A wide array of electronic effects units became possible, including reverb and distortion (or "overdrive"). Solid-body guitars began to dominate the guitar market during the 1960s and 1970s; they are less prone to unwanted acoustic feedback.
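The physics behind the string vibration described above follows Mersenne's law for an ideal stretched string: the fundamental frequency is sqrt(T/μ)/(2L), where T is tension, μ is mass per unit length, and L is the vibrating length. A minimal sketch; the string figures below (tension, linear density, scale length) are illustrative assumptions, not measured values:

```python
import math

# Fundamental frequency of an ideal stretched string (Mersenne's law):
#   f = sqrt(T / mu) / (2 * L)
# T in newtons, mu in kg per metre, L in metres.

def string_fundamental(tension_n, mass_per_m_kg, length_m):
    return math.sqrt(tension_n / mass_per_m_kg) / (2.0 * length_m)

# A plausible steel high-E string: 0.648 m scale, 0.4 g/m, 70 N tension.
f = string_fundamental(70.0, 0.0004, 0.648)
print(f"{f:.1f} Hz")  # lands in the neighbourhood of the high E string (~330 Hz)
```

Doubling the tension raises the pitch by a factor of sqrt(2); halving the vibrating length (as fretting at the twelfth fret does) doubles it.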
As with acoustic guitars, there are a number of types of electric guitars, including hollowbody guitars, archtop guitars (used in jazz guitar, blues and rockabilly) and solid-body guitars, which are widely used in rock music. The loud, amplified sound and sonic power of the electric guitar played through a guitar amp has played a key role in the development of blues and rock music, both as an accompaniment instrument (playing riffs and chords) and performing guitar solos, and in many rock subgenres, notably heavy metal music and punk rock. The electric guitar has had a major influence on popular culture. The guitar is used in a wide variety of musical genres worldwide. It is recognized as a primary instrument in genres such as blues, bluegrass, country, flamenco, folk, jazz, jota, mariachi, metal, punk, reggae, rock, soul, and pop. History Before the development of the electric guitar and the use of synthetic materials, a guitar was defined as being an instrument having "a long, fretted neck, flat wooden soundboard, ribs, and a flat back, most often with incurved sides." The term is used to refer to a number of chordophones that were developed and used across Europe, beginning in the 12th century and, later, in the Americas. A 3,300-year-old stone carving of a Hittite bard playing a stringed instrument is the oldest iconographic representation of a chordophone, and clay plaques from Babylonia show people playing an instrument that has a strong resemblance to the guitar, indicating a possible Babylonian origin for the guitar. The modern word guitar, and its antecedents, has been applied to a wide variety of chordophones since classical times and as such causes confusion. The English word guitar, the German Gitarre, and the French guitare were all adopted from the Spanish guitarra, which comes from the Andalusian Arabic qīthārah and the Latin cithara, which in turn came from the Ancient Greek kithara. Kithara appears in the Bible four times (1 Cor. 14:7, Rev.
5:8, 14:2 and 15:2), and is usually translated into English as harp. Many influences are cited as antecedents to the modern guitar. Although the development of the earliest "guitars" is lost in the history of medieval Spain, two instruments are commonly cited as their most influential predecessors, the European lute and its cousin, the four-string oud; the latter was brought to Iberia by the Moors in the 8th century. At least two instruments called "guitars" were in use in Spain by 1200: the guitarra latina (Latin guitar) and the so-called guitarra morisca (Moorish guitar). The guitarra morisca had a rounded back, wide fingerboard, and several sound holes. The guitarra latina had a single sound hole and a narrower neck. By the 14th century the qualifiers "moresca" or "morisca" and "latina" had been dropped, and these two chordophones were simply referred to as guitars. The Spanish vihuela, called in Italian the viola da mano, a guitar-like instrument of the 15th and 16th centuries, is widely considered to have been the single most important influence in the development of the baroque guitar. It had six courses (usually), lute-like tuning in fourths and a guitar-like body, although early representations reveal an instrument with a sharply cut waist. It was also larger than the contemporary four-course guitars. By the 16th century, the vihuela's construction had more in common with the modern guitar, with its curved one-piece ribs, than with the viols, and was more like a larger version of the contemporary four-course guitars. The vihuela enjoyed only a relatively short period of popularity in Spain and Italy during an era dominated elsewhere in Europe by the lute; the last surviving published music for the instrument appeared in 1576. Meanwhile, the five-course baroque guitar, which was documented in Spain from the middle of the 16th century, enjoyed popularity, especially in Spain, Italy and France from the late 16th century to the mid-18th century.
In Portugal, the word viola referred to the guitar, as guitarra meant the "Portuguese guitar", a variety of cittern. Many different plucked instruments were being invented and used in Europe during the Middle Ages. By the 16th century, most forms of the guitar had fallen out of use, never to be seen again. Midway through the 16th century, however, the five-course guitar was established. This was not a straightforward process. There were two types of five-course guitars; they differed in the location of the major third and in the interval pattern. In one type the courses were tuned in unison: the guitar was tuned by placing a finger on the second fret of the thinnest string and tuning the instrument against it from bottom to top. In the other type, the strings of a course were a whole octave apart from one another, which is the reason for the different method of tuning. The fifth course could be added because the instrument was expected to play seventeen notes or more, a range the extra string made possible. Because the instrument was so different, there was major controversy as to who created the five-course guitar. A literary source, Lope de Vega's Dorotea, gives the credit to the poet and musician Vicente Espinel. This claim was also repeated by Nicolas Doizi de Velasco in 1640; however, it has been refuted by others, who state that Espinel's birth year (1550) makes it impossible for him to be responsible for the tradition. Doizi de Velasco believed that the tuning was the reason the instrument became known as the Spanish guitar in Italy. Even later in the same century, Gaspar Sanz wrote that other nations such as Italy or France added to the Spanish guitar, and these nations even imitated the five-course guitar by "recreating" their own versions.
Finally, circa 1850, the form and structure of the modern guitar are followed by different Spanish makers such as Manuel de Soto y Solares and perhaps the most important of all guitar makers Antonio Torres Jurado, who increased the size of the guitar body, altered its proportions, and invented the breakthrough fan-braced pattern. Bracing, which refers to the internal pattern of wood reinforcements used to secure the guitar's top and back and prevent the instrument from collapsing under tension, is an important factor in how the guitar sounds. Torres' design greatly improved the volume, tone, and projection of the instrument, and it has remained essentially unchanged since. Types Guitars can be divided into two broad categories, acoustic and electric guitars. Within each of these categories, there are also further sub-categories. For example, an electric guitar can be purchased in a six-string model (the most common model) or in seven- or twelve-string models. Acoustic Acoustic guitars form several notable subcategories within the acoustic guitar group: classical and flamenco guitars; steel-string guitars, which include the flat-topped, or "folk", guitar; twelve-string guitars; and the arched-top guitar. The acoustic guitar group also includes unamplified guitars designed to play in different registers, such as the acoustic bass guitar, which has a similar tuning to that of the electric bass guitar. Renaissance and Baroque Renaissance and Baroque guitars are the ancestors of the modern classical and flamenco guitar. They are substantially smaller, more delicate in construction, and generate less volume. The strings are paired in courses as in a modern 12-string guitar, but they only have four or five courses of strings rather than six single strings normally used now. They were more often used as rhythm instruments in ensembles than as solo instruments, and can often be seen in that role in early music performances. 
(Gaspar Sanz's Instrucción de Música sobre la Guitarra Española of 1674 contains his whole output for the solo guitar.) Renaissance and Baroque guitars are easily distinguished, because the Renaissance guitar is very plain and the Baroque guitar is very ornate, with ivory or wood inlays all over the neck and body, and a paper-cutout inverted "wedding cake" inside the hole. Classical Classical guitars, also known as "Spanish" guitars, are typically strung with nylon strings, plucked with the fingers, played in a seated position and are used to play a diversity of musical styles including classical music. The classical guitar's wide, flat neck allows the musician to play scales, arpeggios, and certain chord forms more easily and with less adjacent string interference than on other styles of guitar. Flamenco guitars are very similar in construction, but they are associated with a more percussive tone. In Portugal, the same instrument is often used with steel strings, particularly in its role within fado music. The guitar is called viola, or violão in Brazil, where it is often used with an extra seventh string by choro musicians to provide extra bass support. In Mexico, the popular mariachi band includes a range of guitars, from the small requinto to the guitarrón, a guitar larger than a cello, which is tuned in the bass register. In Colombia, the traditional quartet includes a range of instruments too, from the small bandola (sometimes used when traveling or in confined rooms or spaces), to the slightly larger tiple, to the full-sized classical guitar. The requinto also appears in other Latin-American countries as a complementary member of the guitar family, with its smaller size and scale, permitting more projection for the playing of single-lined melodies. Modern dimensions of the classical instrument were established by the Spaniard Antonio de Torres Jurado (1817–1892).
Flat-top Flat-top guitars with steel strings are similar to the classical guitar; however, the flat-top body size is usually significantly larger than that of a classical guitar, and it has a narrower, reinforced neck and stronger structural design. The robust X-bracing typical of flat-top guitars was developed in the 1840s by German-American luthiers, of whom Christian Friedrich "C. F." Martin is the best known. Originally used on gut-strung instruments, the strength of the system allowed the later guitars to withstand the additional tension of steel strings. Steel strings produce a brighter tone and a louder sound. The acoustic guitar is used in many kinds of music including folk, country, bluegrass, pop, jazz, and blues. Many variations are possible, from the roughly classical-sized OO and Parlour to the large Dreadnought (the most commonly available type) and Jumbo. Ovation makes a modern variation, with a rounded back/side assembly molded from artificial materials. Archtop Archtop guitars are steel-string instruments in which the top (and often the back) of the instrument are carved, from a solid billet, into a curved, rather than a flat, shape. This violin-like construction is usually credited to the American Orville Gibson. Lloyd Loar of the Gibson Mandolin-Guitar Mfg. Co introduced the violin-inspired "F"-shaped hole design now usually associated with archtop guitars, after designing a style of mandolin of the same type. The typical archtop guitar has a large, deep, hollow body whose form is much like that of a mandolin or a violin-family instrument. Nowadays, most archtops are equipped with magnetic pickups, and they are therefore both acoustic and electric. F-hole archtop guitars were immediately adopted, upon their release, by both jazz and country musicians, and have remained particularly popular in jazz music, usually with flatwound strings.
Resonator, resophonic or Dobros All three principal types of resonator guitars were invented by the Slovak-American John Dopyera (1893–1988) for the National and Dobro (Dopyera Brothers) companies. Similar to the flat top guitar in appearance, but with a body that may be made of brass, nickel-silver, or steel as well as wood, the sound of the resonator guitar is produced by one or more aluminum resonator cones mounted in the middle of the top. The physical principle of the guitar is therefore similar to the loudspeaker. The original purpose of the resonator was to produce a very loud sound; this purpose has been largely superseded by electrical amplification, but the resonator guitar is still played because of its distinctive tone. Resonator guitars may have either one or three resonator cones. The method of transmitting sound resonance to the cone is either a "biscuit" bridge, made of a small piece of hardwood at the vertex of the cone (Nationals), or a "spider" bridge, made of metal and mounted around the rim of the (inverted) cone (Dobros). Three-cone resonators always use a specialized metal bridge. The type of resonator guitar with a neck with a square cross-section—called "square neck" or "Hawaiian"—is usually played face up, on the lap of the seated player, and often with a metal or glass slide. The round neck resonator guitars are normally played in the same fashion as other guitars, although slides are also often used, especially in blues. Steel guitar A steel guitar is any guitar played while moving a polished steel bar or similar hard object against plucked strings. The bar itself is called a "steel" and is the source of the name "steel guitar". The instrument differs from a conventional guitar in that it does not use frets; conceptually, it is somewhat akin to playing a guitar with one finger (the bar). 
Known for its portamento capabilities, gliding smoothly over every pitch between notes, the instrument can produce a sinuous crying sound and deep vibrato emulating the human singing voice. Typically, the strings are plucked (not strummed) by the fingers of the dominant hand, while the steel tone bar is pressed lightly against the strings and moved by the opposite hand. The instrument is played while sitting, placed horizontally across the player's knees or otherwise supported. The horizontal playing style is called "Hawaiian style". Twelve-string The twelve-string guitar usually has steel strings, and it is widely used in folk music, blues, and rock and roll. Rather than having only six strings, the 12-string guitar has six courses made up of two strings each, like a mandolin or lute. The highest two courses are tuned in unison, while the others are tuned in octaves. The 12-string guitar is also made in electric forms. The chime-like sound of the 12-string electric guitar was the basis of jangle pop. Acoustic bass The acoustic bass guitar is a bass instrument with a hollow wooden body similar to, though usually somewhat larger than, that of a 6-string acoustic guitar. Like the traditional electric bass guitar and the double bass, the acoustic bass guitar commonly has four strings, which are normally tuned E-A-D-G, an octave below the lowest four strings of the 6-string guitar, which is the same tuning pitch as an electric bass guitar. It can, more rarely, be found with 5 or 6 strings, which provides a wider range of notes to be played with less movement up and down the neck. Electric Electric guitars can have solid, semi-hollow, or hollow bodies; solid bodies produce little sound without amplification. 
In contrast to a standard acoustic guitar, electric guitars instead rely on electromagnetic pickups, and sometimes piezoelectric pickups, that convert the vibration of the steel strings into signals, which are fed to an amplifier through a patch cable or radio transmitter. The sound is frequently modified by other electronic devices (effects units) or the natural distortion of valves (vacuum tubes) or the pre-amp in the amplifier. There are two main types of magnetic pickups, single- and double-coil (or humbucker), each of which can be passive or active. The electric guitar is used extensively in jazz, blues, R & B, and rock and roll. The first successful magnetic pickup for a guitar was invented by George Beauchamp, and incorporated into the 1931 Ro-Pat-In (later Rickenbacker) "Frying Pan" lap steel; other manufacturers, notably Gibson, soon began to install pickups in archtop models. After World War II the completely solid-body electric was popularized by Gibson in collaboration with Les Paul, and independently by Leo Fender of Fender Music. The lower fretboard action (the height of the strings from the fingerboard), lighter (thinner) strings, and its electrical amplification lend the electric guitar to techniques less frequently used on acoustic guitars. These include tapping, extensive use of legato through pull-offs and hammer-ons (also known as slurs), pinch harmonics, volume swells, and use of a tremolo arm or effects pedals. Some electric guitar models feature piezoelectric pickups, which function as transducers to provide a sound closer to that of an acoustic guitar with the flip of a switch or knob, rather than switching guitars. Those that combine piezoelectric pickups and magnetic pickups are sometimes known as hybrid guitars. Hybrids of acoustic and electric guitars are also common. 
There are also more exotic varieties, such as guitars with two, three, or rarely four necks, all manner of alternate string arrangements, fretless fingerboards (used almost exclusively on bass guitars, meant to emulate the sound of a stand-up bass), 5.1 surround guitar, and such. Seven-string and eight-string Solid-body seven-string guitars were popularized in the 1980s and 1990s. Other artists go a step further, by using an eight-string guitar with two extra low strings. Although the most common seven-string has a low B string, Roger McGuinn (of The Byrds and Rickenbacker) uses an octave G string paired with the regular G string as on a 12-string guitar, allowing him to incorporate chiming 12-string elements in standard six-string playing. In 1982 Uli Jon Roth developed the "Sky Guitar", with a vastly extended number of frets, which was the first guitar to venture into the upper registers of the violin. Roth's seven-string and "Mighty Wing" guitar features a wider octave range. Electric bass The bass guitar (also called an "electric bass", or simply a "bass") is similar in appearance and construction to an electric guitar, but with a longer neck and scale length, and four to six strings. The four-string bass, by far the most common, is usually tuned the same as the double bass, which corresponds to pitches one octave lower than the four lowest pitched strings of a guitar (E, A, D, and G). The bass guitar is a transposing instrument, as it is notated in bass clef an octave higher than it sounds (as is the double bass) to avoid excessive ledger lines being required below the staff. Like the electric guitar, the bass guitar has pickups and it is plugged into an amplifier and speaker for live performances. Construction Handedness Modern guitars can be constructed to suit both left- and right-handed players. Typically the dominant hand is used to pluck or strum the strings. This is similar to the violin family of instruments where the dominant hand controls the bow. 
Left-handed players usually play a mirror-image instrument manufactured especially for left-handed players. There are other options, some unorthodox, including learning to play a right-handed guitar as if the player were right-handed, or playing an unmodified right-handed guitar reversed. Guitarist Jimi Hendrix played a right-handed guitar strung in reverse (the treble strings and bass strings reversed). The problem with doing this is that it reverses the guitar's saddle angle. The saddle is the strip of material on top of the bridge where the strings rest. It is normally slanted slightly, making the bass strings longer than the treble strings. In part, the reason for this is the difference in the thickness of the strings. The physical properties of the thicker bass strings require them to be slightly longer than the treble strings to correct intonation. Reversing the strings, therefore, reverses the orientation of the saddle, adversely affecting intonation. Components Head The headstock is located at the end of the guitar neck farthest from the body. It is fitted with machine heads that adjust the tension of the strings, which in turn affects the pitch. The traditional tuner layout is "3+3", in which each side of the headstock has three tuners (such as on Gibson Les Pauls). In this layout, the headstocks are commonly symmetrical. Many guitars feature other layouts, including six-in-line tuners (featured on Fender Stratocasters) or even "4+2" (e.g. Ernie Ball Music Man). Some guitars (such as Steinbergers) do not have headstocks at all, in which case the tuning machines are located elsewhere, either on the body or the bridge. The nut is a small strip of bone, plastic, brass, corian, graphite, stainless steel, or other medium-hard material, at the joint where the headstock meets the fretboard. Its grooves guide the strings onto the fretboard, giving consistent lateral string placement. It is one of the endpoints of the strings' vibrating length.
It must be accurately cut, or it can contribute to tuning problems due to string slippage or string buzz. To reduce string friction in the nut, which can adversely affect tuning stability, some guitarists fit a roller nut. Some instruments use a zero fret just in front of the nut. In this case the nut is used only for lateral alignment of the strings, the string height and length being dictated by the zero fret. Neck A guitar's frets, fretboard, tuners, headstock, and truss rod, all attached to a long wooden extension, collectively constitute its neck. The wood used to make the fretboard usually differs from the wood in the rest of the neck. The bending stress on the neck is considerable, particularly when heavier gauge strings are used (see Tuning), and the ability of the neck to resist bending (see Truss rod) is important to the guitar's ability to hold a constant pitch during tuning or when strings are fretted. The rigidity of the neck with respect to the body of the guitar is one determinant of a good instrument versus a poor-quality one. The cross-section of the neck can also vary, from a gentle "C" curve to a more pronounced "V" curve. There are many different types of neck profiles available, giving the guitarist many options. Some aspects to consider in a guitar neck may be the overall width of the fretboard, scale (distance between the frets), the neck wood, the type of neck construction (for example, the neck may be glued in or bolted on), and the shape (profile) of the back of the neck. Other types of material used to make guitar necks are graphite (Steinberger guitars), aluminum (Kramer Guitars, Travis Bean and Veleno guitars), or carbon fiber (Modulus Guitars and ThreeGuitars). Double neck electric guitars have two necks, allowing the musician to quickly switch between guitar sounds. The neck joint or heel is the point at which the neck is either bolted or glued to the body of the guitar. 
Almost all acoustic steel-string guitars, with the primary exception of Taylors, have glued (otherwise known as set) necks, while electric guitars are constructed using both types. Most classical guitars have a neck and headblock carved from one piece of wood, known as a "Spanish heel". Commonly used set neck joints include mortise and tenon joints (such as those used by C. F. Martin & Co.), dovetail joints (also used by C. F. Martin on the D-28 and similar models) and Spanish heel neck joints, which are named after the shoe they resemble and commonly found in classical guitars. All three types offer stability. Bolt-on necks, though they are historically associated with cheaper instruments, do offer greater flexibility in the guitar's set-up, and allow easier access for neck joint maintenance and repairs. Another type of neck, only available for solid-body electric guitars, is the neck-through-body construction. These are designed so that everything from the machine heads down to the bridge is located on the same piece of wood. The sides (also known as wings) of the guitar are then glued to this central piece. Some luthiers prefer this method of construction as they claim it allows better sustain of each note. Some instruments may not have a neck joint at all, having the neck and sides built as one piece and the body built around it. The fingerboard, also called the fretboard, is a piece of wood embedded with metal frets that comprises the top of the neck. It is flat on classical guitars and slightly curved crosswise on acoustic and electric guitars. The curvature of the fretboard is measured by the fretboard radius, which is the radius of a hypothetical circle of which the fretboard's surface constitutes a segment. The smaller the fretboard radius, the more noticeably curved the fretboard is. Most modern guitars feature a 12" neck radius, while older guitars from the 1960s and 1970s usually feature a 6-8" neck radius. 
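The relationship between fretboard radius and curvature can be made concrete with a little geometry: the crosswise crown of the fretboard is the sagitta of the circular arc its radius defines. A minimal sketch in Python (the helper name and the 2-inch fretboard width are illustrative assumptions, not standard figures):

```python
import math

def fretboard_crown(radius_in, width_in):
    """Height (sagitta) of the fretboard's crosswise arc at its center,
    for a fretboard surface that is a segment of a cylinder of the
    given radius. All dimensions in inches."""
    half = width_in / 2.0
    return radius_in - math.sqrt(radius_in ** 2 - half ** 2)

modern = fretboard_crown(12.0, 2.0)   # typical modern 12" radius
vintage = fretboard_crown(7.25, 2.0)  # a common vintage-era radius
# The smaller radius yields a taller crown, i.e. a more curved board.
```

The smaller the radius, the larger the crown, matching the rule of thumb in the text.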
Pressing a string against a fret on the fretboard effectively shortens the vibrating length of the string, producing a higher pitch. Fretboards are most commonly made of rosewood, ebony, maple, and sometimes manufactured using composite materials such as HPL or resin. See the section "Neck" above for the importance of the length of the fretboard in connection to other dimensions of the guitar. The fingerboard plays an essential role in the treble tone for acoustic guitars. The quality of vibration of the fingerboard is the principal characteristic for generating the best treble tone. For that reason, ebony wood is better, but because of high use, ebony has become rare and extremely expensive. Most guitar manufacturers have adopted rosewood instead of ebony. Frets Almost all guitars have frets, which are metal strips (usually nickel alloy or stainless steel) embedded along the fretboard and located at exact points that divide the scale length in accordance with a specific mathematical formula. The exceptions include fretless bass guitars and very rare fretless guitars. Pressing a string against a fret determines the string's vibrating length and therefore its resultant pitch. The pitch of each consecutive fret is defined at a half-step interval on the chromatic scale. Standard classical guitars have 19 frets and electric guitars between 21 and 24 frets, although guitars have been made with as many as 27 frets. Frets are laid out to accomplish an equal tempered division of the octave. Each set of twelve frets represents an octave. The twelfth fret divides the scale length exactly into two halves, and the 24th fret position divides one of those halves in half again. The ratio of the spacing of two consecutive frets is the twelfth root of two, 2^(1/12) ≈ 1.0595. In practice, luthiers determine fret positions using the constant 17.817, an approximation to 1/(1 − 2^(−1/12)). If the nth fret is a distance x from the bridge, then the distance from the (n+1)th fret to the bridge is x − (x/17.817). 
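The 17.817 recurrence can be turned directly into a fret-slotting calculation. A minimal sketch in Python (the function name and the 650 mm classical scale length are chosen for illustration):

```python
def fret_positions(scale_length, n_frets=19):
    """Distance of each fret from the nut, computed with the luthier's
    constant 17.817: each step toward the bridge removes 1/17.817 of
    the remaining string length."""
    positions = []
    remaining = scale_length      # current fret-to-bridge distance
    for _ in range(n_frets):
        remaining -= remaining / 17.817
        positions.append(scale_length - remaining)
    return positions

frets = fret_positions(650.0)     # 650 mm classical scale
# frets[11] lands within a fraction of a millimetre of 325 mm:
# the twelfth fret halves the scale length, as the text states.
```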
Frets are available in several different gauges and can be fitted according to player preference. Among these are "jumbo" frets, which have a much thicker gauge, allowing for use of a slight vibrato technique from pushing the string down harder and softer. "Scalloped" fretboards, where the wood of the fretboard itself is "scooped out" between the frets, allow a dramatic vibrato effect. Fine frets, much flatter, allow a very low string-action, but require that other conditions, such as curvature of the neck, be well-maintained to prevent buzz. Truss rod The truss rod is a thin, strong metal rod that runs along the inside of the neck. It is used to correct changes to the neck's curvature caused by aging of the neck timbers, changes in humidity, or to compensate for changes in the tension of strings. The tension of the rod and neck assembly is adjusted by a hex nut or an allen-key bolt on the rod, usually located either
at the headstock, sometimes under a cover, or just inside the body of the guitar underneath the fretboard and accessible through the sound hole. Some truss rods can only be accessed by removing the neck. The truss rod counteracts the immense amount of tension the strings place on the neck, bringing the neck back to a straighter position. Turning the truss rod clockwise tightens it, counteracting the tension of the strings and straightening the neck or creating a backward bow. Turning the truss rod counter-clockwise loosens it, allowing string tension to act on the neck and creating a forward bow. Adjusting the truss rod affects the intonation of a guitar as well as the height of the strings from the fingerboard, called the action. Some truss rod systems, called double action truss systems, tighten both ways, pushing the neck both forward and backward (standard truss rods can only release to a point beyond which the neck is no longer compressed and pulled backward). The artist and luthier Irving Sloane pointed out, in his book Steel-String Guitar Construction, that truss rods are intended primarily to remedy concave bowing of the neck, but cannot correct a neck with "back bow" or one that has become twisted. Classical guitars do not require truss rods, as their nylon strings exert a lower tensile force with lesser potential to cause structural problems. 
However, their necks are often reinforced with a strip of harder wood, such as an ebony strip that runs down the back of a cedar neck. There is no tension adjustment on this form of reinforcement. Inlays Inlays are visual elements set into the exterior surface of a guitar, both for decoration and artistic purposes and, in the case of the markings on the 3rd, 5th, 7th and 12th fret (and in higher octaves), to provide guidance to the performer about the location of frets on the instrument. The typical locations for inlay are on the fretboard, headstock, and on acoustic guitars around the soundhole, known as the rosette. Inlays range from simple plastic dots on the fretboard to intricate works of art covering the entire exterior surface of a guitar (front and back). Some guitar players have used LEDs in the fretboard to produce unique lighting effects onstage. Fretboard inlays are most commonly shaped like dots, diamond shapes, parallelograms, or large blocks in between the frets. Dots are usually inlaid into the upper edge of the fretboard in the same positions, small enough to be visible only to the player. These usually appear on the odd-numbered frets, but also on the 12th fret (the one-octave mark) instead of the 11th and 13th frets. Some older or high-end instruments have inlays made of mother of pearl, abalone, ivory, colored wood or other exotic materials and designs. Simpler inlays are often made of plastic or painted. High-end classical guitars seldom have fretboard inlays as a well-trained player is expected to know his or her way around the instrument. In addition to fretboard inlay, the headstock and soundhole surround are also frequently inlaid. The manufacturer's logo or a small design is often inlaid into the headstock. Rosette designs vary from simple concentric circles to delicate fretwork mimicking the historic rosette of lutes. Bindings that edge the finger and soundboards are sometimes inlaid. 
Some instruments have a filler strip running down the length and behind the neck, used for strength or to fill the cavity through which the truss rod was installed in the neck. Body In acoustic guitars, string vibration is transmitted through the bridge and saddle to the body via the soundboard. The soundboard is typically made of tonewoods such as spruce or cedar. Tonewood timbers are chosen for both strength and their ability to transfer mechanical energy from the strings to the air within the guitar body. Sound is further shaped by the characteristics of the guitar body's resonant cavity. In expensive instruments, the entire body is made of wood. In inexpensive instruments, the back may be made of plastic. In an acoustic instrument, the body of the guitar is a major determinant of the overall sound quality. The guitar top, or soundboard, is a finely crafted and engineered element made of tonewoods such as spruce and red cedar. This thin piece of wood, often only 2 or 3 mm thick, is strengthened by differing types of internal bracing. Many luthiers consider the top the dominant factor in determining the sound quality. The majority of the instrument's sound is heard through the vibration of the guitar top as the energy of the vibrating strings is transferred to it. The body of an acoustic guitar has a sound hole through which sound projects. The sound hole is usually a round hole in the top of the guitar under the strings. The air inside the body vibrates as the guitar top and body are vibrated by the strings, and the response of the air cavity at different frequencies is characterized, like the rest of the guitar body, by a number of resonance modes at which it responds more strongly. The top, back and ribs of an acoustic guitar body are very thin (1–2 mm), so a flexible piece of wood called lining is glued into the corners where the rib meets the top and back. This interior reinforcement provides 5 to 20 mm of solid gluing area for these corner joints. 
Solid linings are often used in classical guitars, while kerfed lining is most often found in steel-string acoustics. Kerfed lining is also called kerfing because it is scored, or "kerfed" (incompletely sawn through), to allow it to bend with the shape of the rib. During final construction, a small section of the outside corners is carved or routed out and filled with binding material on the outside corners, with decorative strips of material next to the binding called purfling. This binding serves to seal off the end grain of the top and back. Purfling can also appear on the back of an acoustic guitar, marking the edge joints of the two or three sections of the back. Binding and purfling materials are generally made of either wood or plastic. Body size, shape and style have changed over time. 19th-century guitars, now known as salon guitars, were smaller than modern instruments. Differing patterns of internal bracing have been used over time by luthiers. Torres, Hauser, Ramirez, Fleta, and C. F. Martin were among the most influential designers of their time. Bracing not only strengthens the top against potential collapse due to the stress exerted by the tensioned strings but also affects the resonance characteristics of the top. The back and sides are made out of a variety of timbers such as mahogany, Indian rosewood and highly regarded Brazilian rosewood (Dalbergia nigra), each primarily chosen for its aesthetic effect; they can be decorated with inlays and purfling. Instruments with larger areas for the guitar top were introduced by Martin in an attempt to create greater volume levels. The popularity of the larger "dreadnought" body size amongst acoustic performers is related to the greater sound volume produced. Most electric guitar bodies are made of wood and include a plastic pickguard. 
Boards wide enough to use as a solid body are very expensive due to the worldwide depletion of hardwood stock since the 1970s, so the wood is rarely one solid piece. Most bodies are made from two pieces of wood with some of them including a seam running down the center line of the body. The most common woods used for electric guitar body construction include maple, basswood, ash, poplar, alder, and mahogany. Many bodies consist of good-sounding, but inexpensive woods, like ash, with a "top", or thin layer of another, more attractive wood (such as maple with a natural "flame" pattern) glued to the top of the basic wood. Guitars constructed like this are often called "flame tops". The body is usually carved or routed to accept the other elements, such as the bridge, pickup, neck, and other electronic components. Most electrics have a polyurethane or nitrocellulose lacquer finish. Other alternative materials to wood are used in guitar body construction. Some of these include carbon composites, plastic material, such as polycarbonate, and aluminum alloys. Bridge The main purpose of the bridge on an acoustic guitar is to transfer the vibration from the strings to the soundboard, which vibrates the air inside of the guitar, thereby amplifying the sound produced by the strings. On electric and acoustic guitars alike, the bridge holds the strings in place on the body. There are many varied bridge designs. There may be some mechanism for raising or lowering the bridge saddles to adjust the distance between the strings and the fretboard (action), or fine-tuning the intonation of the instrument. Some are spring-loaded and feature a "whammy bar", a removable arm that lets the player modulate the pitch by changing the tension on the strings. The whammy bar is sometimes also called a "tremolo bar". (The effect of rapidly changing pitch is properly called "vibrato". See Tremolo for further discussion of this term.) 
Some bridges also allow for alternate tunings at the touch of a button. On almost all modern electric guitars, the bridge has saddles that are adjustable for each string so that intonation stays correct up and down the neck. If the open string is in tune, but sharp or flat when frets are pressed, the bridge saddle position can be adjusted with a screwdriver or hex key to remedy the problem. In general, flat notes are corrected by moving the saddle forward and sharp notes by moving it backward. On an instrument correctly adjusted for intonation, the actual length of each string from the nut to the bridge saddle is slightly, but measurably longer than the scale length of the instrument. This additional length is called compensation, which flattens all notes a bit to compensate for the sharping of all fretted notes caused by stretching the string during fretting. Saddle The saddle of a guitar is the part of the bridge that physically supports the strings. It may be one piece (typically on acoustic guitars) or separate pieces, one for each string (electric guitars and basses). The saddle's basic purpose is to provide the endpoint for the string's vibration at the correct location for proper intonation, and on acoustic guitars to transfer the vibrations through the bridge into the top wood of the guitar. Saddles are typically made of plastic or bone for acoustic guitars, though synthetics and some exotic animal tooth variations (e.g. fossilized tooth, ivory, etc.) have become popular with some players. Electric guitar saddles are typically metal, though some synthetic saddles are available. Pickguard The pickguard, also known as the scratchplate, is usually a piece of laminated plastic or other material that protects the finish of the top of the guitar from damage due to the use of a plectrum ("pick") or fingernails. Electric guitars sometimes mount pickups and electronics on the pickguard. It is a common feature on steel-string acoustic guitars. 
Some performance styles that use the guitar as a percussion instrument (tapping the top or sides between notes, etc.), such as flamenco, require that a scratchplate or pickguard be fitted to nylon-string instruments. Strings The standard guitar has six strings, but four-, seven-, eight-, nine-, ten-, eleven-, twelve-, thirteen- and eighteen-string guitars are also available. Classical and flamenco guitars historically used gut strings, but these have been superseded by polymer materials, such as nylon and fluorocarbon. Modern guitar strings are constructed from metal, polymers, or animal or plant product materials. Instruments utilizing "steel" strings may have strings made from alloys incorporating steel, nickel or phosphor bronze. Bass strings for both instruments are wound rather than monofilament. Pickups and electronics Pickups are transducers attached to a guitar that detect (or "pick up") string vibrations and convert the mechanical energy of the string into electrical energy. The resultant electrical signal can then be electronically amplified. The most common type of pickup is electromagnetic in design. These contain magnets that are within a coil, or coils, of copper wire. Such pickups are usually placed directly underneath the guitar strings. Electromagnetic pickups work on the same principles and in a similar manner to an electric generator. The vibration of the strings creates a small electric current in the coils surrounding the magnets. This signal current is carried to a guitar amplifier that drives a loudspeaker. Traditional electromagnetic pickups are either single-coil or double-coil. Single-coil pickups are susceptible to noise induced by stray electromagnetic fields, usually mains-frequency (60 or 50 hertz) hum. The introduction of the double-coil humbucker in the mid-1950s solved this problem through the use of two coils, one of which is wired in opposite polarity to cancel or "buck" stray fields. 
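The hum-cancelling idea behind the humbucker can be illustrated with a toy signal model (this sketch shows only the phase arithmetic, not real pickup physics; the function names, frequencies, and the 0.3 hum amplitude are invented for illustration):

```python
import math

def string_signal(t, f=196.0):
    """Vibration of an open G string (196 Hz) as seen by a pickup coil."""
    return math.sin(2 * math.pi * f * t)

def hum(t, f=60.0):
    """Mains-frequency interference induced in a coil by stray fields."""
    return 0.3 * math.sin(2 * math.pi * f * t)

def humbucker_output(t):
    """Two coils wired in opposite polarity with reversed magnets: the
    string signal keeps its sign in both coils, while the induced hum
    arrives in opposite phase and cancels when the coils are summed."""
    coil_a = string_signal(t) + hum(t)
    coil_b = string_signal(t) - hum(t)
    return coil_a + coil_b   # = 2 * string_signal(t), hum cancelled
```

The double flip (winding direction and magnet polarity) is why the wanted signal adds while the unwanted hum is "bucked".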
The types and models of pickups used can greatly affect the tone of the guitar. Typically, humbuckers, which are two magnet-coil assemblies attached to each other, are traditionally associated with a heavier sound. Single-coil pickups, one magnet wrapped in
amount of traffic routed through the Gnutella network, making it significantly more scalable. If the user decides to download the file, they negotiate the file transfer. If the node which has the requested file is not firewalled, the querying node can connect to it directly. However, if the node is firewalled, stopping the source node from receiving incoming connections, the client wanting to download a file sends a so-called push request to it, asking the remote node to initiate the connection itself (to "push" the file). At first, push requests were routed back along the same chain of nodes that had carried the query. This was rather unreliable because routes would often break and routed packets are always subject to flow control. Push proxies were introduced to address this problem. These are usually the ultrapeers of a leaf node and they are announced in search results. The client connects to one of these push proxies using an HTTP request and the proxy sends a push request to a leaf on behalf of the client. Normally, it is also possible to send a push request over UDP to the push proxy, which is more efficient than using TCP. Push proxies have two advantages: First, ultrapeer-leaf connections are more stable than routes. This makes push requests much more reliable. Second, it reduces the amount of traffic routed through the Gnutella network. Finally, when a user disconnects, the client software saves a list of known nodes. This contains the nodes to which the client was connected and the nodes learned from pong packets. The client uses that as its seed list, when it next starts, thus becoming independent of bootstrap services. In practice, this method of searching on the Gnutella network was often unreliable. Each node is a regular computer user; as such, they are constantly connecting and disconnecting, so the network is never completely stable. 
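The download decision described above, direct connection first, then push via proxy, then a push routed along the query path, can be sketched like this (the `Node` fields and return strings are hypothetical, for illustration only):

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    address: str
    firewalled: bool = False
    push_proxies: list = field(default_factory=list)  # ultrapeer addresses from search results

def download_strategy(remote: Node) -> str:
    """Pick how to fetch a file from `remote`, in order of preference."""
    if not remote.firewalled:
        # The source accepts incoming connections: fetch directly.
        return f"connect directly to {remote.address}"
    if remote.push_proxies:
        # Ask one of the remote node's ultrapeers (its push proxy) to
        # relay a push request; the remote then connects back to us.
        return f"send push request via proxy {remote.push_proxies[0]}"
    # Legacy fallback: route the push back along the original query chain.
    return "route push request along the query chain"
```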
Also, the bandwidth cost of searching on Gnutella grew exponentially with the number of connected users, often saturating connections and rendering slower nodes useless. Therefore, search requests would often be dropped, and most queries reached only a very small part of the network. This observation identified the Gnutella network as an unscalable distributed system, and inspired the development of distributed hash tables, which are much more scalable but support only exact-match, rather than keyword, search. To address the problems of bottlenecks, Gnutella developers implemented a tiered system of ultrapeers and leaves. Instead of all nodes being considered equal, nodes entering the network were kept at the 'edge' of the network, as a leaf. Leaves don't provide routing. Nodes which are capable of routing messages are promoted to ultrapeers. Ultrapeers accept leaf connections and route searches and network maintenance messages. This allows searches to propagate further through the network and allows for numerous alterations in topology. This greatly improved efficiency and scalability. Additionally, Gnutella adopted a number of other techniques to reduce traffic overhead and make searches more efficient. Most notable are Query Routing Protocol (QRP) and Dynamic Querying (DQ). With QRP, a search reaches only those clients which are likely to have the files, so searches for rare files become far more efficient. With DQ, the search stops as soon as the program has acquired enough search results. This vastly reduces the amount of traffic caused by popular searches. One of the benefits of having Gnutella so decentralized is to make it very difficult to shut the network down and to make it a network in which the users are the only ones who can decide which content will be available. Unlike Napster, where the entire network relied on the central server, Gnutella cannot be shut down by shutting down any one node. 
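The effect of QRP can be sketched with a keyword-to-slot table: a leaf advertises which keyword hashes it can possibly answer, and an ultrapeer forwards a query only when every keyword hits. The sketch below uses Python's built-in `hash` as a stand-in for Gnutella's actual QRP hash function, with a 65,536-slot table:

```python
def keyword_slots(text, table_size=65536):
    """Map each keyword of a file name (or query) to a table slot.
    Python's hash() is a stand-in for the real QRP hash function."""
    return {hash(word.lower()) % table_size for word in text.split()}

def build_qrt(shared_files, table_size=65536):
    """A leaf's Query Routing Table: one flag per slot, set for every
    keyword of every shared file."""
    table = [False] * table_size
    for name in shared_files:
        for slot in keyword_slots(name, table_size):
            table[slot] = True
    return table

def may_have_match(qrt, query, table_size=65536):
    """An ultrapeer forwards a query to a leaf only if *all* of the
    query's hashed keywords hit set slots in the leaf's QRT."""
    return all(qrt[s] for s in keyword_slots(query, table_size))

qrt = build_qrt(["free jazz album", "holiday photos"])
```

A query for "jazz album" would be forwarded to this leaf, while "jazz video" would not (barring hash collisions), which is why searches for rare files touch far fewer nodes.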
A decentralized network prevents bad actors from taking control of the contents of the network and/or manipulating data by controlling the central server. Protocol features and extensions Gnutella once operated on a purely query flooding-based
the modus operandi of Gnutella development today. Among the first independent Gnutella pioneers were Gene Kan and Spencer Kimball, who launched the first portal aimed to assemble the open-source community to work on Gnutella and also developed "GNUbile", one of the first open-source (GNU-GPL) programs to implement the Gnutella protocol. The Gnutella network is a fully distributed alternative to such semi-centralized systems as FastTrack (KaZaA) and the original Napster. The initial popularity of the network was spurred on by Napster's threatened legal demise in early 2001. This growing surge in popularity revealed the limits of the initial protocol's scalability. In early 2001, variations on the protocol (first implemented in proprietary and closed source clients) allowed an improvement in scalability. Instead of treating every user as client and server, some users were now treated as ultrapeers, routing search requests and responses for users connected to them. This allowed the network to grow in popularity. In late 2001, the Gnutella client LimeWire Basic became free and open source. In February 2002, Morpheus, a commercial file sharing group, abandoned its FastTrack-based peer-to-peer software and released a new client based on the free and open source Gnutella client Gnucleus. The word Gnutella today refers not to any one project or piece of software, but to the open protocol used by the various clients. The name is a portmanteau of GNU and Nutella, the brand name of an Italian hazelnut flavored spread: supposedly, Frankel and Pepper ate a lot of Nutella working on the original project, and intended to license their finished program under the GNU General Public License. Gnutella is not associated with the GNU project or GNU's own peer-to-peer network, GNUnet. 
On October 26, 2010, the popular Gnutella client LimeWire was ordered shut down by Judge Kimba Wood of the United States District Court for the Southern District of New York when she signed a Consent Decree to which recording industry plaintiffs and LimeWire had agreed. This event was the likely cause of a notable drop in the size of the network, because, while negotiating the injunction, LimeWire staff had inserted remote-disabling code into the software. As the injunction came into force, users who had installed affected versions (newer than 5.5.10) were cut off from the P2P network. Since LimeWire was free software, nothing had prevented the creation of forks that omitted the disabling code, as long as LimeWire trademarks were not used. The shutdown did not affect, for example, FrostWire, a fork of LimeWire created in 2004 that carries neither the remote-disabling code nor adware. On November 9, 2010, LimeWire was resurrected by a secret team of developers and named LimeWire Pirate Edition. It was based on LimeWire 5.6 BETA. This version had its server dependencies removed and all the PRO features enabled for free. Design To envision how Gnutella originally worked, imagine a large circle of users (called nodes), each of whom has Gnutella client software. On initial startup, the client software must bootstrap and find at least one other node. Various methods have been used for this, including a pre-existing address list of possibly working nodes shipped with the software, using updated web caches of known nodes (called Gnutella Web Caches), UDP host caches and, rarely, even IRC. Once connected, the client requests a list of working addresses. The client tries to connect to the nodes it was shipped with, as well as nodes it receives from other clients until it reaches a certain quota. It connects to only that many nodes, locally caching the addresses which it has not yet tried and discarding the addresses which it tried and found to be invalid. 
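The bootstrap behaviour just described, a seed list, a connection quota, and local caching of untried addresses, might look like this in outline (the function names and the quota of 3 are illustrative; `try_connect` stands in for a real handshake attempt):

```python
def bootstrap(shipped_nodes, cached_nodes, try_connect, quota=3):
    """Connect to up to `quota` peers. Previously learned nodes are
    tried first; addresses beyond the quota are cached untried, and
    addresses that fail the handshake are discarded."""
    candidates = list(dict.fromkeys(cached_nodes + shipped_nodes))  # dedupe, keep order
    connected, untried = [], []
    for addr in candidates:
        if len(connected) >= quota:
            untried.append(addr)          # keep for the local cache
        elif try_connect(addr):
            connected.append(addr)
        # else: invalid address, drop it
    return connected, untried
```

On shutdown the client would persist both lists, giving it the seed list that makes the next start independent of bootstrap services.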
When the user wants to do a search, the client sends the request to each actively connected node. In version 0.4 of the protocol, the number of actively connected nodes for a client was quite small (around 5). In that version of the protocol, each node forwards the request to all its actively connected nodes, who, in turn, forward the request. This continues until the packet has reached a predetermined number of hops from the sender (maximum 7). Since version 0.6 (2002), Gnutella is a composite network made of leaf nodes and ultra nodes (also called ultrapeers). The leaf nodes are connected to a small number of ultrapeers (typically 3) while each ultrapeer is connected to more than 32 other ultrapeers. With this higher outdegree, the maximum number of hops a query can travel was lowered to 4. Leaves and ultrapeers use the Query Routing Protocol to exchange a Query Routing Table (QRT), a table of 64 Ki-slots and up to 2 Mi-slots consisting of hashed keywords. A leaf node sends its QRT to each of the ultrapeers to which it is connected, and ultrapeers merge the QRT of all their leaves (downsized to 128 Ki-slots) plus their own QRT (if they share files) and exchange that with their own neighbors. Query routing is then done by hashing the words of the query and seeing whether all of them match in the QRT. Ultrapeers do that check before forwarding a query to a leaf node, and also before forwarding the query to a peer ultra node provided
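The Query Routing Protocol check described above can be sketched as a bitmap lookup: a leaf marks one slot per hashed keyword of its shared files, and an ultrapeer forwards a query only when every keyword of the query hits a marked slot. This is a toy illustration, not the real protocol: actual QRP defines its own hash function and wire format, so the SHA-1-based `qrp_slot` below is a stand-in.

```python
# Toy Query Routing Table (QRT): mark one slot per shared keyword,
# then forward a query only if ALL of its keywords hit marked slots.
import hashlib

SLOTS = 64 * 1024  # 64 Ki-slots, as in a typical leaf QRT

def qrp_slot(keyword, slots=SLOTS):
    """Map a keyword to a slot index (SHA-1 here; not the real QRP hash)."""
    digest = hashlib.sha1(keyword.lower().encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % slots

def build_qrt(shared_names, slots=SLOTS):
    """A leaf marks the slot of every keyword in its shared file names."""
    table = bytearray(slots)  # 1 = some shared file contains this keyword
    for name in shared_names:
        for word in name.lower().split():
            table[qrp_slot(word, slots)] = 1
    return table

def may_have_match(qrt, query, slots=SLOTS):
    """An ultrapeer forwards a query only if all its keywords hit set slots."""
    return all(qrt[qrp_slot(word, slots)] for word in query.lower().split())
```

An ultrapeer would additionally merge the QRTs of all its leaves (OR-ing the bitmaps, downsized as described above) before exchanging the result with neighboring ultrapeers.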
By 1993, it was announced, in Variety among other sources, that Lucas would be making the prequels. He began penning more of the story, indicating that the series would be a tragic one, examining Anakin Skywalker's fall to the dark side. Lucas also began to change the status of the prequels relative to the originals; at first, they were supposed to be a "filling-in" of history tangential to the originals, but now he saw that they could form the beginning of one long story that started with Anakin's childhood and ended with his death. This was the final step towards turning the film series into a "Saga". In 1994, Lucas began work on the screenplay of the first prequel, tentatively titled Episode I: The Beginning. In 1997, to celebrate the 20th anniversary of Star Wars, Lucas returned to the original trilogy and made numerous modifications using newly available digital technology, releasing them in theaters as the Star Wars Special Edition. For DVD releases in 2004 and Blu-ray releases in 2011, the trilogy received further revisions to make them congruent with the prequel trilogy. Besides the additions to the Star Wars franchise, Lucas released a Director's Cut of THX 1138 in 2004, with the film re-cut and containing a number of CGI revisions. The first Star Wars prequel was finished and released in 1999 as Episode I – The Phantom Menace, which would be the first film Lucas had directed in over two decades. Following the release of the first prequel, Lucas announced that he would also be directing the next two, and began working on Episode II. The first draft of Episode II was completed just weeks before principal photography, and Lucas hired Jonathan Hales, a writer from The Young Indiana Jones Chronicles, to polish it. It was completed and released in 2002 as Attack of the Clones. The final prequel, Episode III – Revenge of the Sith, began production in 2002 and was released in 2005. Numerous fans and critics considered the prequels inferior to the original trilogy, though they were box office successes. 
In 2004, Lucas reflected that his transition from independent to corporate filmmaker mirrored the story of Star Wars character Darth Vader in some ways. Lucas collaborated with Jeff Nathanson as a writer of the 2008 film Indiana Jones and the Kingdom of the Crystal Skull, directed by Steven Spielberg. Like the Star Wars prequels, the reception was mixed, with numerous fans and critics once again considering it inferior to its predecessors. From 2008 to 2014, Lucas also served as the creator and executive producer of a second Star Wars animated series on Cartoon Network, Star Wars: The Clone Wars, which premiered with a feature film of the same name before airing its first episode. The supervising director for this series was Dave Filoni, who was chosen by Lucas and closely collaborated with him on its development. The series bridged the events between Attack of the Clones and Revenge of the Sith, and it featured the last Star Wars stories in which Lucas was heavily involved. In 2012, Lucas served as executive producer for Red Tails, a war film based on the exploits of the Tuskegee Airmen during World War II. He also took over direction of reshoots while director Anthony Hemingway worked on other projects. 2012–present: Semi-retirement In January 2012, Lucas announced his retirement from producing large blockbuster films, refocusing his career instead on smaller, independently budgeted features. In June 2012, it was announced that producer Kathleen Kennedy, a long-term collaborator with Steven Spielberg and a producer of the Indiana Jones films, had been appointed as co-chair of Lucasfilm Ltd. It was reported that Kennedy would work alongside Lucas, who would remain chief executive and serve as co-chairman for at least one year, after which she would succeed him as the company's sole leader. With the sale of Lucasfilm to Disney, Lucas is currently Disney's second-largest single shareholder after the estate of Steve Jobs. 
Lucas worked as a creative consultant on the Star Wars sequel trilogy's first film, The Force Awakens. As creative consultant on the film, Lucas's involvement included attending early story meetings; according to Lucas, "I mostly say, 'You can't do this. You can do that.' You know, 'The cars don't have wheels. They fly with antigravity.' There's a million little pieces ... I know all that stuff." Lucas's son Jett told The Guardian that his father was "very torn" about having sold the rights to the franchise, despite having hand-picked Abrams to direct, and that his father was "there to guide" but that "he wants to let it go and become its new generation." Among the materials turned over to the production team were rough story treatments Lucas developed when he considered creating episodes VII–IX himself years earlier; in January 2015, Lucas stated that Disney had discarded his story ideas. The Force Awakens, directed by J. J. Abrams, was released on December 18, 2015. Kathleen Kennedy executive produced the film and its sequels. The new sequel trilogy was jointly produced by Lucasfilm and The Walt Disney Company, which had acquired Lucasfilm in 2012. During an interview with talk show host and journalist Charlie Rose that aired on December 24, 2015, Lucas likened his decision to sell Lucasfilm to Disney to a divorce and outlined the creative differences between him and the producers of The Force Awakens. Lucas described the previous six Star Wars films as his "children" and defended his vision for them, while criticizing The Force Awakens for having a "retro feel", saying, "I worked very hard to make them completely different, with different planets, with different spaceships – you know, to make it new." Lucas also drew some criticism and subsequently apologized for his remark likening Disney to "white slavers". In 2015, Lucas wrote the CGI film Strange Magic, his first musical. The film was produced at Skywalker Ranch. Gary Rydstrom directed the movie. 
At the same time the sequel trilogy was announced, a fifth installment of the Indiana Jones series entered its pre-development phase, with Harrison Ford and Steven Spielberg set to return. Lucas originally did not specify whether the sale of Lucasfilm would affect his involvement with the film. In October 2016, Lucas announced that he would not be involved in the story of the film but would remain an executive producer. In 2016, Rogue One: A Star Wars Story, the first film of a Star Wars anthology series, was released. It told the story of the rebels who stole the plans for the Death Star featured in the original Star Wars film, and it was reported that Lucas liked it more than The Force Awakens. The Last Jedi, the second film in the sequel trilogy, was released in 2017; Lucas described the film as "beautifully made". Lucas has had cursory involvement with Solo: A Star Wars Story (2018), the Star Wars streaming series The Mandalorian, and the premiere of the eighth season of Game of Thrones. Lucas met with J. J. Abrams before the latter began writing the script to the sequel trilogy's final film, The Rise of Skywalker, which was released in 2019. Filmmaking Collaboration with John Williams Lucas was also heavily involved and invested in the scoring process for the original Star Wars soundtrack, which was composed by John Williams on the recommendation of his friend and colleague Steven Spielberg. Lucas initially wanted to use pre-existing tracks and film music in a similar manner to 2001: A Space Odyssey, which served as the inspiration for the film, but Williams advised against this and instead proposed a system of recurring themes (or leitmotifs) to enhance the story in the style of classical composers Gustav Holst, William Walton, and Igor Stravinsky, whose works Lucas had used as "temp tracks" for Williams to draw inspiration from. 
The film, and subsequent sequels and prequels, make repeated use of the Main Title Theme, the Force Theme (less commonly referred to as Obi-Wan Kenobi's Theme), the Rebel Alliance Theme, and Princess Leia's Theme (all introduced in this film). Subsequent films also added to the catalogue of themes for different characters, factions, and locations. The score was released to critical acclaim and won Williams his third Academy Award for Best Original Score. The score was listed by the American Film Institute in 2005 as the greatest film score of all time. The professional relationship formed by Lucas and Williams extended through Williams working on all of Lucas's blockbuster franchise movies: the remaining two films of the Star Wars original trilogy; all three films of the prequel trilogy, developed over fifteen years later; and the four (to be five) films of the Indiana Jones franchise, in which Williams reunited with his long-time collaborator Spielberg. In his collaborations with Lucas, Williams received six of his fifty-two Academy Award nominations (Star Wars, The Empire Strikes Back, Return of the Jedi, Raiders of the Lost Ark, Indiana Jones and the Temple of Doom, and Indiana Jones and the Last Crusade). After Lucas sold Lucasfilm to Disney, Williams stayed on board with the franchise and continued to score the remaining three films of the "Skywalker saga" (The Force Awakens, The Last Jedi, and The Rise of Skywalker, for which he received a further three Oscar nominations), after which he announced his "retirement" from the series. Lucas was in attendance at a ceremony honouring Williams as the 44th recipient of the AFI Life Achievement Award, the first composer to receive the honour, and gave a speech in praise of their relationship and his work. 
In interviews, and most famously at the 40th Anniversary Star Wars Celebration convention, Lucas has repeatedly reaffirmed the importance of Williams to the Star Wars saga, affectionately referring to him as the "secret sauce" of his movies. Philanthropy Lucas is one of the wealthiest celebrities in the world. He has pledged to give half of his fortune to charity as part of The Giving Pledge, an effort led by Bill Gates and Warren Buffett to persuade America's richest individuals to donate their financial wealth to charities. George Lucas Educational Foundation In 1991, the George Lucas Educational Foundation was founded as a nonprofit operating foundation to celebrate and encourage innovation in schools. The Foundation's content is available under the brand Edutopia via an award-winning website, social media, and documentary films. Lucas, through his foundation, was one of the leading proponents of the E-rate program in the universal service fund, which was enacted as part of the Telecommunications Act of 1996. On June 24, 2008, Lucas testified before the United States House of Representatives subcommittee on Telecommunications and the Internet as the head of his Foundation to advocate for a free wireless broadband educational network. Proceeds from the sale of Lucasfilm to Disney In 2012, Lucas sold Lucasfilm to The Walt Disney Company for a reported sum of $4.05 billion. It was widely reported at the time that Lucas intended to give the majority of the proceeds from the sale to charity. A spokesperson for Lucasfilm said, "George Lucas has expressed his intention, in the event the deal closes, to donate the majority of the proceeds to his philanthropic endeavors." Lucas also spoke on the matter: "For 41 years, the majority of my time and money has been put into the company. As I start a new chapter in my life, it is gratifying that I have the opportunity to devote more time and resources to philanthropy." 
Lucas Museum of Narrative Art By June 2013, Lucas was considering establishing a museum, the Lucas Cultural Arts Museum, to be built on Crissy Field near the Golden Gate Bridge in San Francisco, which would display his collection of illustrations and pop art, with an estimated value of more than $1 billion. Lucas offered to pay the estimated $300 million cost of constructing the museum, and would endow it with $400 million when it opened, eventually adding an additional $400 million to its endowment. After being unable to reach an agreement with The Presidio Trust, Lucas turned to Chicago. A potential lakefront site on Museum Campus in Chicago was proposed in May 2014. By June 2014, Chicago had been selected, pending approval of the Chicago Plan Commission, which was granted. The museum project was renamed the Lucas Museum of Narrative Art. On June 24, 2016, Lucas announced that he was abandoning his plans to locate the museum in Chicago, due to a lawsuit by a local preservation group, Friends of the Parks, and would instead build the museum in California. On January 17, 2017, Lucas announced that the museum will be constructed in Exposition Park, Los Angeles, California. Other initiatives In 2005, Lucas gave US$1 million to help build the Martin Luther King Jr. Memorial on the National Mall in Washington D.C. to commemorate American civil rights leader Martin Luther King Jr. On September 19, 2006, the University of Southern California announced that Lucas had donated $175–180 million to his alma mater to expand the film school. It is the largest single donation to USC and the largest gift to a film school anywhere. Previous donations led to the already-existing George Lucas Instructional Building and Marcia Lucas Post-Production building. In 2013, Lucas and his wife Mellody Hobson donated $25 million to the Chicago-based not-for-profit After School Matters, of which Hobson is the chair. 
On April 15, 2016, it was reported that Lucas had donated between $501,000 and $1 million through the Lucas Family Foundation to the Obama Foundation, which is charged with overseeing the construction of the Barack Obama Presidential Center on Chicago's South Side. Personal life In 1969, Lucas married film editor Marcia Lou Griffin, who went on to win an Academy Award for her editing work on the original Star Wars film. They adopted a daughter, Amanda Lucas, in 1981, and divorced in 1983. Lucas subsequently adopted two more children as a single parent: daughter Katie Lucas, born in 1988, and son Jett Lucas, born in 1993. His three eldest children all appeared in the three Star Wars prequels, as did Lucas himself. Following his divorce, Lucas was in a relationship with singer Linda Ronstadt in the 1980s. Lucas began dating Mellody Hobson, president of Ariel Investments and chair of DreamWorks Animation, in 2006. Lucas and Hobson announced their engagement in January 2013, and married on June 22, 2013, at Lucas's Skywalker Ranch in Marin County, California. They have one daughter together, born via gestational carrier in August 2013. Lucas was born and raised in a Methodist family. The religious and mythical themes in Star Wars were inspired by Lucas's interest in the writings of mythologist Joseph Campbell, and he would eventually come to identify strongly with the Eastern religious philosophies he studied and incorporated into his films, which were a major inspiration for "the Force". Lucas has come to state that his religion is "Buddhist Methodist". He resides in Marin County. Lucas is a major collector of the American illustrator and painter Norman Rockwell. A collection of 57 Rockwell paintings and drawings owned by Lucas and fellow Rockwell collector and film director Steven Spielberg were displayed at the Smithsonian American Art Museum from July 2, 2010, to January 2, 2011, in an exhibition titled Telling Stories. 
Lucas has said that he is a fan of Seth MacFarlane's hit TV show Family Guy. MacFarlane has said that Lucasfilm was extremely helpful when the Family Guy crew wanted to parody
After the first Star Wars film, Lucas worked extensively as a writer and producer, including on the many Star Wars spinoffs made for film, television, and other media. Lucas acted as executive producer for the next two Star Wars films, commissioning Irvin Kershner to direct The Empire Strikes Back, and Richard Marquand to direct Return of the Jedi, while receiving a story credit on the former and sharing a screenwriting credit with Lawrence Kasdan on the latter. He also acted as story writer and executive producer on all four of the Indiana Jones films, which his colleague and good friend Steven Spielberg directed. Other successful projects on which Lucas was credited as executive producer, and sometimes story writer, in this period include Kurosawa's Kagemusha (1980), Twice Upon A Time (1983), Ewoks: Caravan of Courage (1984), Ewoks: Battle for Endor (1985), Mishima: A Life in Four Chapters (1985), Jim Henson's Labyrinth (1986), Don Bluth's The Land Before Time (1988), and the Indiana Jones television spinoff The Young Indiana Jones Chronicles (1992–96). There were unsuccessful projects, however, including More American Graffiti (1979), Willard Huyck's Howard the Duck (1986), which was the biggest flop of Lucas's career, Ron Howard's Willow (1988), Coppola's Tucker: The Man and His Dream (1988), and Mel Smith's Radioland Murders (1994). The animation studio Pixar was founded in 1979 as the Graphics Group, one third of the Computer Division of Lucasfilm. Pixar's early computer graphics research resulted in groundbreaking effects in films such as Star Trek II: The Wrath of Khan and Young Sherlock Holmes, and the group was purchased in 1986 by Steve Jobs shortly after he left Apple Computer. Jobs paid Lucas US$5 million and put US$5 million as capital into the company. 
The sale reflected Lucas's desire to stop the cash-flow losses from his seven-year research projects associated with new entertainment technology tools, as well as his company's new focus on creating entertainment products rather than tools. As of June 1983, Lucas was worth US$60 million, but he met cash-flow difficulties following his divorce that year, concurrent with the sudden dropoff in revenues from Star Wars licenses following the theatrical run of Return of the Jedi. At this point, Lucas had no desire to return to Star Wars, and had unofficially canceled the sequel trilogy. Also in 1983, Lucas and Tomlinson Holman founded the audio company THX Ltd. The company, formerly owned by Lucasfilm, develops equipment and standards for stereo, digital, and theatrical sound for film and music. Skywalker Sound and Industrial Light & Magic are the sound and visual effects subdivisions of Lucasfilm, while Lucasfilm Games, later renamed LucasArts, produces products for the gaming industry. 1993–2012: Return to directing, Star Wars and Indiana Jones Having lost much of his fortune in the divorce settlement, Lucas was reluctant to return to Star Wars. However, the prequels, which were still only a series of basic ideas partially pulled from his original drafts of "The Star Wars", continued to tantalize him with technical possibilities that would make it worthwhile to revisit his older material. When Star Wars became popular once again, in the wake of Dark Horse's comic book line and Timothy Zahn's trilogy of spin-off novels, Lucas realized that there was still a large audience. His children were older, and with the explosion of CGI technology he began to consider directing once again. 
Gothenburg was the only city on the west coast that, along with Marstrand, was granted the rights to trade with merchants from other countries. In the 18th century, fishing was the most important industry. However, in 1731, the Swedish East India Company was founded, and the city flourished due to its foreign trade with highly profitable commercial expeditions to China. The harbour developed into Sweden's main harbour for trade towards the west, and when Swedish emigration to the United States increased, Gothenburg became Sweden's main point of departure for these travellers. The impact of Gothenburg as a main port of embarkation for Swedish emigrants is reflected by Gothenburg, Nebraska, a small Swedish settlement in the United States. During the 19th century, Gothenburg evolved into a modern industrial city, a development that continued into the 20th century. The population increased tenfold in the century, from 13,000 (1800) to 130,000 (1900). In the 20th century, major companies that developed included SKF (1907) and Volvo (1927). Geography Gothenburg is located on the west coast, in southwestern Sweden, about halfway between the capitals Copenhagen, Denmark, and Oslo, Norway. The location at the mouth of the Göta älv, which feeds into the Kattegatt, an arm of the North Sea, has helped the city grow in significance as a trading city. The archipelago of Gothenburg consists of rough, barren rocks and cliffs, which is also typical of the coast of Bohuslän. Due to the Gulf Stream, the city has a mild climate and moderately heavy precipitation. Gothenburg is the second-largest city in Sweden after the capital Stockholm. The Gothenburg Metropolitan Area (Stor-Göteborg) has 982,360 inhabitants and extends to the municipalities of Ale, Alingsås, Göteborg, Härryda, Kungälv, Lerum, Lilla Edet, Mölndal, Partille, Stenungsund, Tjörn, and Öckerö within Västra Götaland County, and Kungsbacka within Halland County. 
Angered, a suburb outside Gothenburg, consists of Hjällbo, Eriksbo, Rannebergen, Hammarkullen, Gårdsten, and Lövgärdet. It is a Million Programme part of Gothenburg, like Rosengård in Malmö and Botkyrka in Stockholm. Angered had about 50,000 inhabitants in 2015. It lies north of Gothenburg and is isolated from the rest of the city. Bergsjön is another Million Programme suburb north of Gothenburg; it has 14,000 inhabitants. Biskopsgården is the biggest multicultural suburb on the island of Hisingen, which is a part of Gothenburg but separated from the city by the river. Climate Gothenburg has an oceanic climate (Cfb according to the Köppen climate classification). Despite its northerly latitude, temperatures are quite mild throughout the year and warmer than in places at a similar latitude, such as Stockholm; this is mainly because of the moderating influence of the Gulf Stream. During the summer, daylight lasts 18 hours and 5 minutes, but only 6 hours and 32 minutes in late December. The climate has become significantly milder in recent decades, particularly in summer and winter; July temperatures used to be below Stockholm's 1961–1990 averages, but have since been warmer than that benchmark. Summers are warm and pleasant. Winters are cold and windy, though severe lows are rare. Precipitation is regular but generally moderate throughout the year. Snow mainly occurs from December to March, but is not unusual in November and April and can sometimes occur even in October and May. Parks and nature Gothenburg has several parks and nature reserves ranging in size from tens of square metres to hundreds of hectares. It also has many green areas that are not designated as parks or reserves. A selection of parks: Kungsparken, built between 1839 and 1861, surrounds the canal that circles the city centre. 
Garden Society of Gothenburg, a park and horticultural garden, is located next to Kungsportsavenyen. Founded in 1842 by the Swedish king Carl XIV Johan on the initiative of the amateur botanist Henric Elof von Normann, the park has a noted rose garden with some 4,000 roses of 1,900 cultivars. Slottsskogen, , was created in 1874 by August Kobb. It has a free "open" zoo that includes harbour seals, penguins, horses, pigs, deer, moose, goats, and many birds. The Natural History Museum (Naturhistoriska Museet) and the city's oldest observatory are located in the park. The annual Way Out West festival is held in the park. Änggårdsbergens naturreservat, , was bought in 1840 by the pharmacist Arvid Gren and donated to the city in 1963 by Sven and Carl Gren Broberg, who stipulated that the area must remain a nature and bird reserve. It lies partly in Mölndal. Delsjöområdets naturreservat, about , has been in use since the 17th century as a farming area; significant forest management was carried out in the late 19th century. Skatås gym and motionscentrum is situated here. Rya Skogs Naturreservat, , became a protected area in 1928. It contains remnants of a defensive wall built in the mid- to late 17th century. Keillers park was donated by James Keiller in 1906. He was the son of the Scottish-born Alexander Keiller, who founded the Götaverken shipbuilding company. S A Hedlunds park: Sven Adolf Hedlund, newspaper publisher and politician, bought the Bjurslätt farm in 1857, and in 1928 it was given to the city. Hisingsparken is Gothenburg's largest park. Flunsåsparken, built in 1950, hosts many free activities during the summer, such as concerts and theatre. Gothenburg Botanical Garden, , opened in 1923. It won an award in 2003, and in 2006 came third in the "The most beautiful garden in Europe" competition. It has around 16,000 species of plants and trees. The greenhouses contain around 4,500 species, including 1,600 orchids.
It is considered to be one of the most important botanical gardens in Europe, with three stars in the French Guide Rouge. Architecture Very few houses are left from the 17th century, when the city was founded, since all but the military and royal houses were built of wood. A rare exception is the Skansen Kronan. The first architecturally significant period is the 18th century, when the East India Company made Gothenburg an important trade city. Imposing stone houses in Neo-Classical style were erected around the canals. One example from this period is the East India House, which today houses the Göteborg City Museum. In the 19th century, the wealthy bourgeoisie began to move outside the city walls which had protected the city. The prevailing style became eclectic, academic, and somewhat overdecorated, as favoured by the middle class. The working class lived in wooden houses in the overcrowded city district of Haga. In the 19th century, the first comprehensive town plan after the founding of the city was created, which led to the construction of the main street, Kungsportsavenyen. Perhaps the most distinctive type of houses in the city, Landshövdingehusen, were built at the end of the 19th century – three-storey houses with the first floor in stone and the other two in wood. The early 20th century, characterized by the National Romantic style, was rich in architectural achievements. Masthugg Church is a noted example of the style of this period. In the early 1920s, on the city's 300th anniversary, the Götaplatsen square with its Neoclassical look was built. After this, the predominant style in Gothenburg and the rest of Sweden was Functionalism, which especially dominated suburbs such as Västra Frölunda and Bergsjön. The Swedish functionalist architect Uno Åhrén served as city planner from 1932 through 1943. In the 1950s, the big stadium Ullevi was built when Sweden hosted the 1958 FIFA World Cup.
The modern architecture of the city has been shaped by such architects as Gert Wingårdh, who started as a Post-modernist in the 1980s. Gustaf Adolf Square is a town square located in central Gothenburg. Noted buildings on the square include Gothenburg City Hall (formerly the stock exchange, opened in 1849) and the Nordic Classicism law court. The main canal of Gothenburg also flanks the square. Characteristic buildings The Gothenburg Central Station is in the centre of the city, next to Nordstan and Drottningtorget. The building has been renovated and expanded numerous times since the grand opening in October 1858. In 2003, a major reconstruction was finished which brought the 19th-century building into the 21st century, expanding the capacity for trains, travellers, and shopping. Not far from the central station is the Skanskaskrapan, more commonly known as "The Lipstick". It is high with 22 floors and coloured in red-white stripes. The skyscraper was designed by Ralph Erskine and built by Skanska in the late 1980s as the headquarters for the company. By the shore of the Göta Älv at Lilla Bommen is The Göteborg Opera. It was completed in 1994. The architect Jan Izikowitz was inspired by the landscape and described his vision as "Something that makes your mind float over the squiggling landscape like the wings of a seagull." Feskekörka, or Fiskhallen, is an indoor fish market by the Rosenlundskanalen in central Gothenburg. Feskekörkan opened on 1 November 1874 and takes its name from the building's resemblance to a Gothic church. The Gothenburg city hall is in the Beaux-Arts architectural style. The Gothenburg Synagogue at Stora Nygatan, near Drottningtorget, was built in 1855 according to the designs of the German architect August Krüger. The Gunnebo House is a country house located to the south of Gothenburg, in Mölndal. It was built in neoclassical style towards the end of the 18th century. The Vasa Church was created in the early 1900s.
It is located in Vasastan and is built of granite in a neo-Romanesque style. Another noted construction is the Brudaremossen TV Tower, one of the few partially guyed towers in the world. Culture The sea, trade, and industrial history of the city are evident in the cultural life of Gothenburg. It is also a popular destination for tourists on the Swedish west coast. Museums Many of the cultural institutions, as well as hospitals and the university, were created by donations from rich merchants and industrialists, for example the Röhsska Museum. On 29 December 2004, the Museum of World Culture opened near Korsvägen. Museums include the Gothenburg Museum of Art, and several museums of sea and navigation history, natural history, the sciences, and the East India trade. Aeroseum, close to the Göteborg City Airport, is an aircraft museum in a former underground military air base. The Volvo Museum has exhibits on the history of Volvo and its development from 1927 until today. Products shown include cars, trucks, marine engines, and buses. Universeum is a public science centre that opened in 2001, the largest of its kind in Scandinavia. It is divided into six sections, each containing experimental workshops and a collection of reptiles, fish, and insects. Universeum occasionally hosts debates between Swedish secondary-school students and Nobel Prize laureates or other scholars. Leisure and entertainment The most noted attraction is the amusement park Liseberg, located in the central part of the city. It is the largest amusement park in Scandinavia by number of rides, and was chosen as one of the top ten amusement parks in the world (2005) by Forbes. It is the most popular attraction in Sweden by number of visitors per year (more than 3 million). There are a number of independent theatre ensembles in the city, besides institutions such as Gothenburg City Theatre, Backa Theatre (youth theatre), and Folkteatern.
The main boulevard is called Kungsportsavenyn (commonly known as Avenyn, "The Avenue"). It is about long and starts at Götaplatsen – the location of the Gothenburg Museum of Art, the city's theatre, and the city library, as well as the concert hall – and stretches all the way to Kungsportsplatsen in the old city centre of Gothenburg, crossing a canal and a small park. The Avenyn was created in the 1860s and 1870s as the result of an international architecture contest, and is the product of a period of extensive town planning and remodelling. Avenyn has Gothenburg's highest concentration of pubs and clubs. Gothenburg's largest shopping centre (the 8th largest in Sweden), Nordstan, is located in central Gothenburg. Gothenburg's Haga district is known for its picturesque wooden houses and its cafés serving the well-known Haga bulle – a large cinnamon roll similar to the kanelbulle. A number of Gothenburg restaurants have a star in the 2008 Michelin Guide: 28+, Basement, Fond, Kock & Vin, Fiskekrogen, and Sjömagasinet. The city has a number of star chefs – over the past decade, seven of the Swedish Chef of the Year awards have been won by people from Gothenburg. The Gustavus Adolphus pastry, eaten every 6 November in Sweden on Gustavus Adolphus Day, is especially connected to, and appreciated in, Gothenburg because the city was founded by King Gustavus Adolphus. One of Gothenburg's most popular natural tourist attractions is the southern Gothenburg archipelago, a set of several islands that can be reached by ferry boats mainly operating from Saltholmen. Within the archipelago are the Älvsborg fortress and the islands of Vinga and Styrsö. Festivals and fairs The annual Gothenburg Film Festival is the largest film festival in Scandinavia. The Gothenburg Book Fair is held each year in September.
The International Science Festival in Gothenburg is an annual festival, held since April 1997 in central Gothenburg, with thought-provoking science activities for the public. The festival is visited by about people each year. This makes it the largest popular-science event in Sweden and one of the leading popular-science events in Europe. Citing the financial crisis, the International Federation of Library Associations and Institutions moved the 2010 World Library and Information Congress, previously planned for Brisbane, Australia, to Gothenburg. The event took place on 10–15 August 2010. Music Gothenburg has a diverse music community; the Gothenburg Symphony Orchestra is the best known in classical music. Gothenburg was also the birthplace of the Swedish composer Kurt Atterberg. The first internationally successful Swedish group, the instrumental rock band The Spotnicks, came from Gothenburg. Bands such as The Soundtrack of Our Lives and Ace of Base are well-known pop representatives of the city. During the 1970s, Gothenburg had strong roots in the Swedish progressive movement (progg), with such groups as Nationalteatern, Nynningen, and Motvind. The record company Nacksving and the editorial office of the magazine Musikens Makt, both part of the progg movement, were also located in Gothenburg during this time. There is also an active indie scene in Gothenburg. For example, the musician Jens Lekman was born in the suburb of Angered and named his 2007 release Night Falls Over Kortedala after another suburb, Kortedala. Other internationally acclaimed indie artists include the electro pop duos Studio, The Knife, Air France, and The Tough Alliance, the indie rock band Love is All, songwriter José González, and pop singer El Perro del Mar, as well as the genre-bending quartet Little Dragon, fronted by vocalist Yukimi Nagano. Another son of the city is one of Sweden's most popular singers, Håkan Hellström, who often includes many places from the city in his songs.
The glam rock group Supergroupies comes from Gothenburg. Gothenburg's own commercially successful At the Gates, In Flames, and Dark Tranquillity are credited with pioneering melodic death metal. Other well-known bands of the Gothenburg scene are the thrash metal band The Haunted, the progressive power metal band Evergrey, and the power metal bands HammerFall and Dream Evil. Many music festivals take place in the city every year. The Metaltown Festival is a two-day festival featuring heavy metal bands, held in Gothenburg. It has been arranged annually since 2004, taking place at the Frihamnen venue. In June 2012, the festival included bands such as In Flames, Marilyn Manson, Slayer, Lamb of God, and Mastodon. Another popular festival, Way Out West, focuses more on rock, electronic, and hip-hop genres. Sports As in all of Sweden, a variety of sports are followed, including football, ice hockey, basketball, handball, floorball, baseball, and figure skating. The city has a varied scene of amateur and professional sports clubs. Gothenburg is the birthplace of football in Sweden, as the first football match in the country was played there in 1892. The city's three major football clubs, IFK Göteborg, Örgryte IS, and GAIS, share a total of 34 Swedish championships between them. IFK has also won the UEFA Cup twice. Other notable clubs include BK Häcken (football), Göteborg HC (women's ice hockey), Pixbo Wallenstam IBK (floorball), multiple national handball champion Redbergslids IK, and four-time national ice hockey champion Frölunda HC. Gothenburg had a professional basketball team, Gothia Basket, until it folded in 2010. The bandy department of GAIS, GAIS Bandy, played its first season in the highest division, Elitserien, last season. The group-stage match between the main rivals Sweden and Russia in the 2013 Bandy World Championship was played at Arena Heden in central Gothenburg.
The city's most notable sports venues are Scandinavium and Ullevi (multisport) and the newly built Gamla Ullevi (football). The 2003 World Allround Speed Skating Championships were held in Rudhallen, Sweden's only indoor speed-skating arena. It is a part of Ruddalens IP, which also has a bandy field and several football fields. The only Swedish heavyweight boxing champion of the world, Ingemar Johansson, who took the title from Floyd Patterson in 1959, was from Gothenburg. Gothenburg has hosted a number of international sporting events, including the 1958 FIFA World Cup, the 1983 European Cup Winners' Cup Final, an NFL preseason game on 14 August 1988 between the Chicago Bears and the Minnesota Vikings, the 1992 European Football Championship, the 1993 and 2002 World Men's Handball Championships, the 1995 World Championships in Athletics, the 1997 World Championships in Swimming (short track), the 2002 Ice Hockey World Championships, the 2004 UEFA Cup final, the 2006 European Championships in Athletics, and the 2008 World Figure Skating Championships. Annual events held in the city are the Gothia Cup and the Göteborgsvarvet. The annual Gothia Cup is the world's largest football tournament in terms of the number of participants: in 2011, a total of 35,200 players from 1,567 teams and 72 nations took part. Gothenburg hosted the XIII FINA World Masters Championships in 2010.
Diving, swimming, synchronized swimming, and open-water competitions were held from 28 July to 7 August. The water polo events were played in the neighbouring city of Borås. Gothenburg is also home to the Gothenburg Sharks, a professional baseball team in the Elitserien division of baseball in Sweden. With around 25,000 sailboats and yachts scattered about the city, sailing is a popular sporting activity in the region, particularly because of the nearby Gothenburg archipelago. In June 2015, the Volvo Ocean Race, professional sailing's leading crewed offshore race, concluded in Gothenburg, which also hosted an event in the 2015–2016 America's Cup World Series in August 2015. The Gothenburg Amateur Diving Club (Göteborgs amatördykarklubb) has been operating since October 1938. Economy Due to Gothenburg's advantageous location in the centre of Scandinavia, trade and shipping have always played a major role in the city's economic history, and they continue to do so. The Port of Gothenburg has become the largest harbour in Scandinavia. Apart from trade, the second pillar of Gothenburg's economy has traditionally been manufacturing and industry, which contribute significantly to the city's wealth. Major companies operating plants in the area include SKF, Volvo (both cars and trucks), and Ericsson. Volvo Cars is the largest employer in Gothenburg, not including jobs in supply companies. The blue-collar industries that long dominated the city are still important factors in its economy, but they are gradually being replaced by high-tech industries. Banking and finance are also important, as is the events and tourism industry. Gothenburg is the terminus of the Valdemar–Göteborg gas pipeline, which brings natural gas from the North Sea fields to Sweden through Denmark. From the 18th century, Gothenburg was the home base of the Swedish East India Company.
From its founding until the late 1970s, the city was a world leader in shipbuilding, with such shipyards as Eriksbergs Mekaniska Verkstad, Götaverken, Arendalsvarvet, and Lindholmens varv. Gothenburg is classified as a global city by GaWC, with a ranking of Gamma. The city has been ranked as the 12th-most inventive city in the world by Forbes. Government Gothenburg became a city municipality with an elected city council when the first Swedish local government acts were implemented in 1863. The municipality has an assembly consisting of 81 members, elected every four years. Political decisions depend on citizens
only county in Sweden that is not governed by a county council. The municipality handles the tasks that are otherwise handled by the county council, mainly health care and public transport. Like other counties, Gotland has a County Administrative Board that oversees the implementation of decisions by the Swedish state government. Both the County Administrative Board and the municipality have their seat in Visby, the largest city, with over 22,000 inhabitants. Province The provinces of Sweden are no longer officially administrative units, but are used when reporting population size, politics, etc. In this case the province, the county, and the municipality all have identical borders and cover an area of 3,151 km². Administration Gotland is the only Swedish county that is not administered by a county council. Instead, the municipality is tasked with the responsibilities of a county, including public health care and public transport. The main aim of the County Administrative Board is to fulfil the goals set in national politics by the Riksdag and the Government, to coordinate the interests and promote the development of the county, to establish regional goals, and to safeguard due process of law in the handling of each case. The County Administrative Board is a Government agency headed by a Governor. Mats Löfving is the regional police chief for both Stockholm and Gotland Counties. Politics During a trial period, the county council provisions for Gotland have evolved into provisions for a regional council, meaning that
became the first worldwide radio navigation system. Limitations of these systems drove the need for a more universal navigation solution with greater accuracy. Although there were wide needs for accurate navigation in the military and civilian sectors, almost none of those needs was seen as justification for the billions of dollars it would cost in research, development, deployment, and operation of a constellation of navigation satellites. During the Cold War arms race, the nuclear threat to the existence of the United States was the one need that did justify this cost in the view of the United States Congress. This deterrent effect is why GPS was funded. It is also the reason for the ultra-secrecy at that time. The nuclear triad consisted of the United States Navy's submarine-launched ballistic missiles (SLBMs) along with United States Air Force (USAF) strategic bombers and intercontinental ballistic missiles (ICBMs). Considered vital to the nuclear deterrence posture, accurate determination of the SLBM launch position was a force multiplier. Precise navigation would enable United States ballistic missile submarines to get an accurate fix of their positions before they launched their SLBMs. The USAF, with two thirds of the nuclear triad, also had requirements for a more accurate and reliable navigation system. The U.S. Navy and U.S. Air Force were developing their own technologies in parallel to solve what was essentially the same problem. To increase the survivability of ICBMs, there was a proposal to use mobile launch platforms (comparable to the Soviet SS-24 and SS-25), so the need to fix the launch position was similar to the SLBM situation. In 1960, the Air Force proposed a radio-navigation system called MOSAIC (MObile System for Accurate ICBM Control), essentially a 3-D LORAN. A follow-on study, Project 57, was performed in 1963, and it was "in this study that the GPS concept was born."
That same year, the concept was pursued as Project 621B, which had "many of the attributes that you now see in GPS" and promised increased accuracy for Air Force bombers as well as ICBMs. Updates from the Navy TRANSIT system were too slow for the high speeds of Air Force operation. The Naval Research Laboratory (NRL) continued making advances with their Timation (Time Navigation) satellites, the first launched in 1967 and the second in 1969, with the third in 1974 carrying the first atomic clock into orbit and the fourth launched in 1977. Another important predecessor to GPS came from a different branch of the United States military. In 1964, the United States Army orbited its first Sequential Collation of Range (SECOR) satellite used for geodetic surveying. The SECOR system included three ground-based transmitters at known locations that would send signals to the satellite transponder in orbit. A fourth ground-based station, at an undetermined position, could then use those signals to fix its location precisely. The last SECOR satellite was launched in 1969. Development With these parallel developments in the 1960s, it was realized that a superior system could be developed by synthesizing the best technologies from 621B, Transit, Timation, and SECOR in a multi-service program. Satellite orbital position errors, induced by variations in the gravity field and radar refraction among others, had to be resolved. A team led by Harold L. Jury of Pan Am Aerospace Division in Florida from 1970 to 1973 used real-time data assimilation and recursive estimation to do so, reducing systematic and residual errors to a manageable level to permit accurate navigation. During Labor Day weekend in 1973, a meeting of about twelve military officers at the Pentagon discussed the creation of a Defense Navigation Satellite System (DNSS). It was at this meeting that the real synthesis that became GPS was created. Later that year, the DNSS program was named Navstar.
Navstar is often erroneously considered an acronym for "NAVigation System Using Timing and Ranging" but was never considered as such by the GPS Joint Program Office (TRW may have once advocated for a different navigational system that used that acronym). With the individual satellites being associated with the name Navstar (as with the predecessors Transit and Timation), a more fully encompassing name was used to identify the constellation of Navstar satellites, Navstar-GPS. Ten "Block I" prototype satellites were launched between 1978 and 1985 (an additional unit was destroyed in a launch failure). The effect of the ionosphere on radio transmission was investigated in a geophysics laboratory of Air Force Cambridge Research Laboratory, renamed the Air Force Geophysical Research Lab (AFGRL) in 1974. AFGRL developed the Klobuchar model for computing ionospheric corrections to GPS location. Of note is work done by Australian space scientist Elizabeth Essex-Cohen at AFGRL in 1974. She was concerned with the curving of the paths of radio waves (atmospheric refraction) traversing the ionosphere from NavSTAR satellites. After Korean Air Lines Flight 007, a Boeing 747 carrying 269 people, was shot down in 1983 after straying into the USSR's prohibited airspace, in the vicinity of Sakhalin and Moneron Islands, President Ronald Reagan issued a directive making GPS freely available for civilian use, once it was sufficiently developed, as a common good. The first Block II satellite was launched on February 14, 1989, and the 24th satellite was launched in 1994. The GPS program cost at this point, not including the cost of the user equipment but including the costs of the satellite launches, has been estimated at US$5 billion. Initially, the highest-quality signal was reserved for military use, and the signal available for civilian use was intentionally degraded, in a policy known as Selective Availability.
This changed with President Bill Clinton signing on May 1, 2000, a policy directive to turn off Selective Availability to provide the same accuracy to civilians that was afforded to the military. The directive was proposed by the U.S. Secretary of Defense, William Perry, in view of the widespread growth of differential GPS services by private industry to improve civilian accuracy. Moreover, the U.S. military was actively developing technologies to deny GPS service to potential adversaries on a regional basis. Since its deployment, the U.S. has implemented several improvements to the GPS service, including new signals for civil use and increased accuracy and integrity for all users, all the while maintaining compatibility with existing GPS equipment. Modernization of the satellite system has been an ongoing initiative by the U.S. Department of Defense through a series of satellite acquisitions to meet the growing needs of the military, civilians, and the commercial market. As of early 2015, high-quality, FAA grade, Standard Positioning Service (SPS) GPS receivers provided horizontal accuracy of better than , although many factors such as receiver and antenna quality and atmospheric issues can affect this accuracy. GPS is owned and operated by the United States government as a national resource. The Department of Defense is the steward of GPS. The Interagency GPS Executive Board (IGEB) oversaw GPS policy matters from 1996 to 2004. After that, the National Space-Based Positioning, Navigation and Timing Executive Committee was established by presidential directive in 2004 to advise and coordinate federal departments and agencies on matters concerning the GPS and related systems. The executive committee is chaired jointly by the Deputy Secretaries of Defense and Transportation. Its membership includes equivalent-level officials from the Departments of State, Commerce, and Homeland Security, the Joint Chiefs of Staff and NASA. 
Components of the executive office of the president participate as observers to the executive committee, and the FCC chairman participates as a liaison. The U.S. Department of Defense is required by law to "maintain a Standard Positioning Service (as defined in the federal radio navigation plan and the standard positioning service signal specification) that will be available on a continuous, worldwide basis," and "develop measures to prevent hostile use of GPS and its augmentations without unduly disrupting or degrading civilian uses." Timeline and modernization In 1972, the USAF Central Inertial Guidance Test Facility (Holloman AFB) conducted developmental flight tests of four prototype GPS receivers in a Y configuration over White Sands Missile Range, using ground-based pseudo-satellites. In 1978, the first experimental Block-I GPS satellite was launched. In 1983, after Soviet interceptor aircraft shot down the civilian airliner KAL 007 that strayed into prohibited airspace because of navigational errors, killing all 269 people on board, U.S. President Ronald Reagan announced that GPS would be made available for civilian uses once it was completed, although it had been previously published [in Navigation magazine], and that the CA code (Coarse/Acquisition code) would be available to civilian users. By 1985, ten more experimental Block-I satellites had been launched to validate the concept. Beginning in 1988, command and control of these satellites was moved from Onizuka AFS, California to the 2nd Satellite Control Squadron (2SCS) located at Falcon Air Force Station in Colorado Springs, Colorado. On February 14, 1989, the first modern Block-II satellite was launched. The Gulf War from 1990 to 1991 was the first conflict in which the military widely used GPS. In 1991, a project to create a miniature GPS receiver successfully ended, replacing the previous military receivers with a handheld receiver. 
In 1992, the 2nd Space Wing, which originally managed the system, was inactivated and replaced by the 50th Space Wing. By December 1993, GPS achieved initial operational capability (IOC), with a full constellation (24 satellites) available and providing the Standard Positioning Service (SPS). Full Operational Capability (FOC) was declared by Air Force Space Command (AFSPC) in April 1995, signifying full availability of the military's secure Precise Positioning Service (PPS). In 1996, recognizing the importance of GPS to civilian users as well as military users, U.S. President Bill Clinton issued a policy directive declaring GPS a dual-use system and establishing an Interagency GPS Executive Board to manage it as a national asset. In 1998, United States Vice President Al Gore announced plans to upgrade GPS with two new civilian signals for enhanced user accuracy and reliability, particularly with respect to aviation safety, and in 2000 the United States Congress authorized the effort, referring to it as GPS III. On May 2, 2000, "Selective Availability" was discontinued as a result of the 1996 executive order, allowing civilian users to receive a non-degraded signal globally. In 2004, the United States government signed an agreement with the European Community establishing cooperation related to GPS and Europe's Galileo system. Also in 2004, United States President George W. Bush updated the national policy and replaced the executive board with the National Executive Committee for Space-Based Positioning, Navigation, and Timing. In November 2004, Qualcomm announced successful tests of assisted GPS for mobile phones. In 2005, the first modernized GPS satellite was launched and began transmitting a second civilian signal (L2C) for enhanced user performance. On September 14, 2007, the aging mainframe-based Ground Segment Control System was transferred to the new Architecture Evolution Plan.
On May 19, 2009, the United States Government Accountability Office issued a report warning that some GPS satellites could fail as soon as 2010. On May 21, 2009, the Air Force Space Command allayed fears of GPS failure, saying "There's only a small risk we will not continue to exceed our performance standard." On January 11, 2010, an update of ground control systems caused a software incompatibility with 8,000 to 10,000 military receivers manufactured by a division of Trimble Navigation Limited of Sunnyvale, Calif. On February 25, 2010, the U.S. Air Force awarded the contract to develop the GPS Next Generation Operational Control System (OCX) to improve accuracy and availability of GPS navigation signals, and serve as a critical part of GPS modernization. Awards On February 10, 1993, the National Aeronautic Association selected the GPS Team as winners of the 1992 Robert J. Collier Trophy, the US's most prestigious aviation award. This team combines researchers from the Naval Research Laboratory, the USAF, the Aerospace Corporation, Rockwell International Corporation, and IBM Federal Systems Company. The citation honors them "for the most significant development for safe and efficient navigation and surveillance of air and spacecraft since the introduction of radio navigation 50 years ago." Two GPS developers received the National Academy of Engineering Charles Stark Draper Prize for 2003: Ivan Getting, emeritus president of The Aerospace Corporation and an engineer at MIT, established the basis for GPS, improving on the World War II land-based radio system called LORAN (Long-range Radio Aid to Navigation). Bradford Parkinson, professor of aeronautics and astronautics at Stanford University, conceived the present satellite-based system in the early 1960s and developed it in conjunction with the U.S. Air Force. Parkinson served twenty-one years in the Air Force, from 1957 to 1978, and retired with the rank of colonel. GPS developer Roger L. 
Easton received the National Medal of Technology on February 13, 2006. Francis X. Kane (Col. USAF, ret.) was inducted into the U.S. Air Force Space and Missile Pioneers Hall of Fame at Lackland A.F.B., San Antonio, Texas, March 2, 2010, for his role in space technology development and the engineering design concept of GPS conducted as part of Project 621B. In 1998, GPS technology was inducted into the Space Foundation Space Technology Hall of Fame. On October 4, 2011, the International Astronautical Federation (IAF) awarded the Global Positioning System (GPS) its 60th Anniversary Award, nominated by IAF member, the American Institute for Aeronautics and Astronautics (AIAA). The IAF Honors and Awards Committee recognized the uniqueness of the GPS program and the exemplary role it has played in building international collaboration for the benefit of humanity. On December 6, 2018, Gladys West was inducted into the Air Force Space and Missile Pioneers Hall of Fame in recognition of her work on an extremely accurate geodetic Earth model, which was ultimately used to determine the orbit of the GPS constellation. On February 12, 2019, four founding members of the project were awarded the Queen Elizabeth Prize for Engineering with the chair of the awarding board stating "Engineering is the foundation of civilisation; there is no other foundation; it makes things happen. And that's exactly what today's Laureates have done - they've made things happen. They've re-written, in a major way, the infrastructure of our world." Basic concept Fundamentals The GPS receiver calculates its own four-dimensional position in spacetime based on data received from multiple GPS satellites. Each satellite carries an accurate record of its position and time, and transmits that data to the receiver. The satellites carry very stable atomic clocks that are synchronized with one another and with ground clocks. Any drift from time maintained on the ground is corrected daily. 
In the same manner, the satellite locations are known with great precision. GPS receivers have clocks as well, but they are less stable and less precise. Since the speed of radio waves is constant and independent of the satellite speed, the time delay between when the satellite transmits a signal and the receiver receives it is proportional to the distance from the satellite to the receiver. At a minimum, four satellites must be in view of the receiver for it to compute four unknown quantities (three position coordinates and the deviation of its own clock from satellite time). More detailed description Each GPS satellite continually broadcasts a signal (carrier wave with modulation) that includes: A pseudorandom code (sequence of ones and zeros) that is known to the receiver. By time-aligning a receiver-generated version and the receiver-measured version of the code, the time of arrival (TOA) of a defined point in the code sequence, called an epoch, can be found in the receiver clock time scale A message that includes the time of transmission (TOT) of the code epoch (in GPS time scale) and the satellite position at that time Conceptually, the receiver measures the TOAs (according to its own clock) of four satellite signals. From the TOAs and the TOTs, the receiver forms four time of flight (TOF) values, which are (given the speed of light) approximately equivalent to receiver-satellite ranges plus time difference between the receiver and GPS satellites multiplied by speed of light, which are called pseudo-ranges. The receiver then computes its three-dimensional position and clock deviation from the four TOFs. In practice the receiver position (in three dimensional Cartesian coordinates with origin at the Earth's center) and the offset of the receiver clock relative to the GPS time are computed simultaneously, using the navigation equations to process the TOFs. 
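The simultaneous solution for position and clock deviation described above can be sketched with a textbook Gauss-Newton iteration. This is an illustrative toy, not receiver firmware: satellite positions and pseudoranges are assumed to be given in ECEF meters, the function name is invented for this sketch, and all error modeling (ionosphere, troposphere, relativity) is omitted.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_positions, pseudoranges, iterations=20):
    """Estimate receiver ECEF position (m) and clock bias (m) from four or
    more satellite positions and pseudoranges via Gauss-Newton iteration.

    Measurement model: rho_i = ||sat_i - p|| + b,
    where b = c * (receiver clock error).
    """
    sats = np.asarray(sat_positions, dtype=float)
    rho = np.asarray(pseudoranges, dtype=float)
    est = np.zeros(4)  # start at Earth's center with zero clock bias
    for _ in range(iterations):
        diff = sats - est[:3]                  # vectors receiver -> satellites
        ranges = np.linalg.norm(diff, axis=1)  # geometric ranges
        residual = rho - (ranges + est[3])     # measured minus predicted
        # Jacobian of the predicted pseudoranges w.r.t. (x, y, z, b)
        J = np.hstack([-diff / ranges[:, None], np.ones((len(rho), 1))])
        est += np.linalg.lstsq(J, residual, rcond=None)[0]
    return est  # [x, y, z, clock_bias_m]; divide the bias by C for seconds
```

With exactly four satellites the system is exactly determined (four equations, four unknowns); additional satellites simply over-determine the least-squares step, which is why extra visible satellites improve the fix.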
The receiver's Earth-centered solution location is usually converted to latitude, longitude and height relative to an ellipsoidal Earth model. The height may then be further converted to height relative to the geoid, which is essentially mean sea level. These coordinates may be displayed, such as on a moving map display, or recorded or used by some other system, such as a vehicle guidance system. User-satellite geometry Although usually not formed explicitly in the receiver processing, the conceptual time differences of arrival (TDOAs) define the measurement geometry. Each TDOA corresponds to a hyperboloid of revolution (see Multilateration). The line connecting the two satellites involved (and its extensions) forms the axis of the hyperboloid. The receiver is located at the point where three hyperboloids intersect. It is sometimes incorrectly said that the user location is at the intersection of three spheres. While simpler to visualize, this is the case only if the receiver has a clock synchronized with the satellite clocks (i.e., the receiver measures true ranges to the satellites rather than range differences). There are marked performance benefits to the user carrying a clock synchronized with the satellites. Foremost is that only three satellites are needed to compute a position solution. If it were an essential part of the GPS concept that all users needed to carry a synchronized clock, a smaller number of satellites could be deployed, but the cost and complexity of the user equipment would increase. Receiver in continuous operation The description above is representative of a receiver start-up situation. Most receivers have a track algorithm, sometimes called a tracker, that combines sets of satellite measurements collected at different times—in effect, taking advantage of the fact that successive receiver positions are usually close to each other. 
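The Earth-centered-to-geodetic conversion mentioned above can be sketched with the standard fixed-point iteration on latitude over the WGS84 ellipsoid. This is a minimal illustration (function names are ours, and degenerate cases near the poles are not handled):

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0            # semi-major axis, meters
F = 1 / 298.257223563    # flattening
E2 = F * (2 - F)         # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Geodetic latitude/longitude (degrees) and height (m) to ECEF (m)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    return ((n + h) * math.cos(lat) * math.cos(lon),
            (n + h) * math.cos(lat) * math.sin(lon),
            (n * (1 - E2) + h) * math.sin(lat))

def ecef_to_geodetic(x, y, z):
    """ECEF (m) to geodetic latitude/longitude (degrees) and height (m),
    by fixed-point iteration on the latitude (converges in a few steps)."""
    lon = math.atan2(y, x)
    p = math.hypot(x, y)
    lat = math.atan2(z, p * (1 - E2))  # spherical-ish initial guess
    for _ in range(10):
        n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
        h = p / math.cos(lat) - n
        lat = math.atan2(z, p * (1 - E2 * n / (n + h)))
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    return math.degrees(lat), math.degrees(lon), p / math.cos(lat) - n
```

Round-tripping a point through both functions recovers the original coordinates to well below a millimeter, which is a convenient self-check for this kind of conversion code.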
After a set of measurements are processed, the tracker predicts the receiver location corresponding to the next set of satellite measurements. When the new measurements are collected, the receiver uses a weighting scheme to combine the new measurements with the tracker prediction. In general, a tracker can (a) improve receiver position and time accuracy, (b) reject bad measurements, and (c) estimate receiver speed and direction. The disadvantage of a tracker is that changes in speed or direction can be computed only with a delay, and that derived direction becomes inaccurate when the distance traveled between two position measurements drops below or near the random error of position measurement. GPS units can use measurements of the Doppler shift of the signals received to compute velocity accurately. More advanced navigation systems use additional sensors like a compass or an inertial navigation system to complement GPS. Non-navigation applications GPS requires four or more satellites to be visible for accurate navigation. The solution of the navigation equations gives the position of the receiver along with the difference between the time kept by the receiver's on-board clock and the true time-of-day, thereby eliminating the need for a more precise and possibly impractical receiver based clock. Applications for GPS such as time transfer, traffic signal timing, and synchronization of cell phone base stations, make use of this cheap and highly accurate timing. Some GPS applications use this time for display, or, other than for the basic position calculations, do not use it at all. Although four satellites are required for normal operation, fewer apply in special cases. If one variable is already known, a receiver can determine its position using only three satellites. For example, a ship on the open ocean usually has a known elevation close to 0m, and the elevation of an aircraft may be known. 
Some GPS receivers may use additional clues or assumptions such as reusing the last known altitude, dead reckoning, inertial navigation, or including information from the vehicle computer, to give a (possibly degraded) position when fewer than four satellites are visible. Structure The current GPS consists of three major segments. These are the space segment, a control segment, and a user segment. The U.S. Space Force develops, maintains, and operates the space and control segments. GPS satellites broadcast signals from space, and each GPS receiver uses these signals to calculate its three-dimensional location (latitude, longitude, and altitude) and the current time. Space segment The space segment (SS) is composed of 24 to 32 satellites, or Space Vehicles (SV), in medium Earth orbit, and also includes the payload adapters to the boosters required to launch them into orbit. The GPS design originally called for 24 SVs, eight each in three approximately circular orbits, but this was modified to six orbital planes with four satellites each. The six orbit planes have approximately 55° inclination (tilt relative to the Earth's equator) and are separated by 60° right ascension of the ascending node (angle along the equator from a reference point to the orbit's intersection). The orbital period is one-half a sidereal day, i.e., 11 hours and 58 minutes so that the satellites pass over the same locations or almost the same locations every day. The orbits are arranged so that at least six satellites are always within line of sight from everywhere on the Earth's surface (see animation at right). The result of this objective is that the four satellites are not evenly spaced (90°) apart within each orbit. In general terms, the angular difference between satellites in each orbit is 30°, 105°, 120°, and 105° apart, which sum to 360°. 
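The half-sidereal-day orbital period quoted above pins down the orbit's size via Kepler's third law; a quick back-of-the-envelope check (standard constants assumed):

```python
import math

MU = 3.986004418e14        # Earth's gravitational parameter, m^3/s^2
SIDEREAL_DAY = 86164.0905  # seconds

T = SIDEREAL_DAY / 2  # GPS orbital period: half a sidereal day (~11 h 58 min)

# Kepler's third law, T = 2*pi*sqrt(a^3/MU), solved for the semi-major axis a
a = (MU * (T / (2 * math.pi)) ** 2) ** (1 / 3)
altitude = a - 6371e3  # subtract the mean Earth radius

# a comes out near 26,560 km, i.e. an altitude of roughly 20,200 km
```

This matches the medium-Earth-orbit regime the article describes and explains why each satellite repeats its ground track daily: two orbits per sidereal day.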
Orbiting at an altitude of approximately ; orbital radius of approximately , each SV makes two complete orbits each sidereal day, repeating the same ground track each day. This was very helpful during development because even with only four satellites, correct alignment means all four are visible from one spot for a few hours each day. For military operations, the ground track repeat can be used to ensure good coverage in combat zones. There are 31 satellites in the GPS constellation, 27 of which are in use at a given time with the rest allocated as stand-bys. A 32nd was launched in 2018, but as of July 2019 is still in evaluation. More decommissioned satellites are in orbit and available as spares. The additional satellites improve the precision of GPS receiver calculations by providing redundant measurements. With the increased number of satellites, the constellation was changed to a nonuniform arrangement. Such an arrangement was shown to improve accuracy, and it also improves reliability and availability of the system, relative to a uniform system, when multiple satellites fail. With the expanded constellation, nine satellites are usually visible at any time from any point on the Earth with a clear horizon, ensuring considerable redundancy over the minimum four satellites needed for a position. Control segment The control segment (CS) is composed of a master control station (MCS), an alternative master control station, four dedicated ground antennas, and six dedicated monitor stations. The MCS can also access Satellite Control Network (SCN) ground antennas (for additional command and control capability) and NGA (National Geospatial-Intelligence Agency) monitor stations. The flight paths of the satellites are tracked by dedicated U.S.
Space Force monitoring stations in Hawaii, Kwajalein Atoll, Ascension Island, Diego Garcia, Colorado Springs, Colorado and Cape Canaveral, along with shared NGA monitor stations operated in England, Argentina, Ecuador, Bahrain, Australia and Washington DC. The tracking information is sent to the MCS at Schriever Space Force Base ESE of Colorado Springs, which is operated by the 2nd Space Operations Squadron (2 SOPS) of the U.S. Space Force. Then 2 SOPS contacts each GPS satellite regularly with a navigational update using dedicated or shared (AFSCN) ground antennas (GPS dedicated ground antennas are located at Kwajalein, Ascension Island, Diego Garcia, and Cape Canaveral). These updates synchronize the atomic clocks on board the satellites to within a few nanoseconds of each other, and adjust the ephemeris of each satellite's internal orbital model. The updates are created by a Kalman filter that uses inputs from the ground monitoring stations, space weather information, and various other inputs. Satellite maneuvers are not precise by GPS standards—so to change a satellite's orbit, the satellite must be marked unhealthy, so receivers don't use it. After the satellite maneuver, engineers track the new orbit from the ground, upload the new ephemeris, and mark the satellite healthy again. The operation control segment (OCS) currently serves as the control segment of record. It provides the operational capability that supports GPS users and keeps the GPS operational and performing within specification. OCS successfully replaced the legacy 1970s-era mainframe computer at Schriever Air Force Base in September 2007. After installation, the system helped enable upgrades and provide a foundation for a new security architecture that supported U.S. armed forces. OCS will continue to be the ground control system of record until the new segment, Next Generation GPS Operation Control System (OCX), is fully developed and functional. 
The new capabilities provided by OCX will be the cornerstone for revolutionizing GPS's mission capabilities, enabling U.S. Space Force to greatly enhance GPS operational services to U.S. combat forces, civil partners and myriad domestic and international users. The GPS OCX program also will reduce cost, schedule and technical risk. It is designed to provide 50% sustainment cost savings through efficient software architecture and Performance-Based Logistics. In addition, GPS OCX is expected to cost millions less than the cost to upgrade OCS while providing four times the capability. The GPS OCX program represents a critical part of GPS modernization and provides significant information assurance improvements over the current GPS OCS program. OCX will have the ability to control and manage GPS legacy satellites as well as the next generation of GPS III satellites, while enabling the full array of military signals. Built on a flexible architecture that can rapidly
launched in the late 1990s. The U.S. Federal Communications Commission (FCC) mandated the feature in either the handset or in the towers (for use in triangulation) in 2002 so emergency services could locate 911 callers. Third-party software developers later gained access to GPS APIs from Nextel upon launch, followed by Sprint in 2006, and Verizon soon thereafter. Clock synchronization: the accuracy of GPS time signals (±10 ns) is second only to the atomic clocks they are based on, and is used in applications such as GPS disciplined oscillators. Disaster relief/emergency services: many emergency services depend upon GPS for location and timing capabilities. GPS-equipped radiosondes and dropsondes: measure and calculate the atmospheric pressure, wind speed and direction up to from the Earth's surface. Radio occultation for weather and atmospheric science applications. Fleet tracking: used to identify, locate and maintain contact reports with one or more fleet vehicles in real time. Geodesy: determination of Earth orientation parameters including the daily and sub-daily polar motion, and length-of-day variabilities, Earth's center-of-mass (geocenter) motion, and low-degree gravity field parameters. Geofencing: vehicle tracking systems, person tracking systems, and pet tracking systems use GPS to locate devices that are attached to or carried by a person, vehicle, or pet. The application can provide continuous tracking and send notifications if the target leaves a designated (or "fenced-in") area. Geotagging: applies location coordinates to digital objects such as photographs (in Exif data) and other documents for purposes such as creating map overlays with devices like the Nikon GP-1. GPS aircraft tracking. GPS for mining: the use of RTK GPS has significantly improved several mining operations such as drilling, shoveling, vehicle tracking, and surveying. RTK GPS provides centimeter-level positioning accuracy.
GPS data mining: It is possible to aggregate GPS data from multiple users to understand movement patterns, common trajectories and interesting locations. GPS tours: location determines what content to display; for instance, information about an approaching point of interest. Navigation: navigators value digitally precise velocity and orientation measurements, as well as precise positions in real-time with a support of orbit and clock corrections. Orbit determination of low-orbiting satellites with GPS receiver installed on board, such as GOCE, GRACE, Jason-1, Jason-2, TerraSAR-X, TanDEM-X, CHAMP, Sentinel-3, and some cubesats, e.g., CubETH. Phasor measurements: GPS enables highly accurate timestamping of power system measurements, making it possible to compute phasors. Recreation: for example, Geocaching, Geodashing, GPS drawing, waymarking, and other kinds of location based mobile games such as Pokémon Go. Reference frames: realization and densification of the terrestrial reference frames in the framework of Global Geodetic Observing System. Co-location in space between Satellite laser ranging and microwave observations for deriving global geodetic parameters. Robotics: self-navigating, autonomous robots using GPS sensors, which calculate latitude, longitude, time, speed, and heading. Sport: used in football and rugby for the control and analysis of the training load. Surveying: surveyors use absolute locations to make maps and determine property boundaries. Tectonics: GPS enables direct fault motion measurement of earthquakes. Between earthquakes GPS can be used to measure crustal motion and deformation to estimate seismic strain buildup for creating seismic hazard maps. Telematics: GPS technology integrated with computers and mobile communications technology in automotive navigation systems. Restrictions on civilian use The U.S. government controls the export of some civilian receivers. 
All GPS receivers capable of functioning above a certain altitude above sea level and beyond a certain speed, or designed or modified for use with unmanned missiles and aircraft, are classified as munitions (weapons), which means they require State Department export licenses. This rule applies even to otherwise purely civilian units that only receive the L1 frequency and the C/A (Coarse/Acquisition) code. Disabling operation above these limits exempts the receiver from classification as a munition. Vendor interpretations differ. The rule refers to operation at both the target altitude and speed, but some receivers stop operating even when stationary. This has caused problems with some amateur radio balloon launches that regularly reach such altitudes. These limits only apply to units or components exported from the United States. A growing trade in various components exists, including GPS units from other countries. These are expressly sold as ITAR-free. Military As of 2009, military GPS applications include: Navigation: Soldiers use GPS to find objectives, even in the dark or in unfamiliar territory, and to coordinate troop and supply movement. In the United States armed forces, commanders use the Commander's Digital Assistant and lower ranks use the Soldier Digital Assistant. Target tracking: Various military weapons systems use GPS to track potential ground and air targets before flagging them as hostile. These weapon systems pass target coordinates to precision-guided munitions to allow them to engage targets accurately. Military aircraft, particularly in air-to-ground roles, use GPS to find targets. Missile and projectile guidance: GPS allows accurate targeting of various military weapons including ICBMs, cruise missiles, precision-guided munitions and artillery shells. Embedded GPS receivers able to withstand accelerations of 12,000 g have been developed for use in howitzer shells. Search and rescue. Reconnaissance: Patrol movement can be managed more closely.
GPS satellites carry a set of nuclear detonation detectors consisting of an optical sensor called a bhangmeter, an X-ray sensor, a dosimeter, and an electromagnetic pulse (EMP) sensor (W-sensor) that form a major portion of the United States Nuclear Detonation Detection System. General William Shelton has stated that future satellites may drop this feature to save money. GPS-type navigation was first used in war in the 1991 Persian Gulf War, before GPS was fully developed in 1995, to help Coalition forces navigate and perform maneuvers in the war. The war also demonstrated the vulnerability of GPS to being jammed, when Iraqi forces installed jamming devices that emitted radio noise on likely targets, disrupting reception of the weak GPS signal. GPS's vulnerability to jamming is a threat that continues to grow as jamming equipment and experience grow. GPS signals have been reported to have been jammed many times over the years for military purposes. Russia seems to have several objectives for this behavior, such as intimidating neighbors while undermining confidence in their reliance on American systems, promoting their GLONASS alternative, disrupting Western military exercises, and protecting assets from drones. China uses jamming to discourage US surveillance aircraft near the contested Spratly Islands. North Korea has mounted several major jamming operations near its border with South Korea and offshore, disrupting flights, shipping and fishing operations. The Iranian Armed Forces disrupted the GPS of civilian airliner Flight PS752 when they shot down the aircraft. Timekeeping Leap seconds While most clocks derive their time from Coordinated Universal Time (UTC), the atomic clocks on the satellites are set to "GPS time". The difference is that GPS time is not corrected to match the rotation of the Earth, so it does not contain leap seconds or other corrections that are periodically added to UTC. GPS time was set to match UTC in 1980, but has since diverged.
The lack of corrections means that GPS time remains at a constant offset from International Atomic Time (TAI − GPS = 19 seconds). Periodic corrections are performed to the on-board clocks to keep them synchronized with ground clocks. The GPS navigation message includes the difference between GPS time and UTC. GPS time is 18 seconds ahead of UTC because of the leap second added to UTC on December 31, 2016. Receivers subtract this offset from GPS time to calculate UTC and specific time zone values. New GPS units may not show the correct UTC time until after receiving the UTC offset message. The GPS-UTC offset field can accommodate 255 leap seconds (eight bits). Accuracy GPS time is theoretically accurate to about 14 nanoseconds, due to the clock drift relative to International Atomic Time that the atomic clocks in GPS transmitters experience. Most receivers lose some accuracy in their interpretation of the signals and are only accurate to about 100 nanoseconds. Format As opposed to the year, month, and day format of the Gregorian calendar, the GPS date is expressed as a week number and a seconds-into-week number. The week number is transmitted as a ten-bit field in the C/A and P(Y) navigation messages, and so it becomes zero again every 1,024 weeks (19.6 years). GPS week zero started at 00:00:00 UTC (00:00:19 TAI) on January 6, 1980, and the week number became zero again for the first time at 23:59:47 UTC on August 21, 1999 (00:00:19 TAI on August 22, 1999). It happened for the second time at 23:59:42 UTC on April 6, 2019. To determine the current Gregorian date, a GPS receiver must be provided with the approximate date (to within 3,584 days) to correctly translate the GPS date signal. To address this concern in the future, the modernized GPS civil navigation (CNAV) message will use a 13-bit field that only repeats every 8,192 weeks (157 years), thus lasting until 2137 (157 years after GPS week zero).
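The week-number arithmetic above can be made concrete. The sketch below (the function name and reference-date logic are illustrative, not from any GPS library) converts a truncated 10-bit week number and a time-of-week into UTC: it resolves the 1,024-week rollover against a reference date assumed to lie within 512 weeks of the true date, then subtracts the 18-second GPS–UTC offset in effect since the December 31, 2016 leap second.

```python
from datetime import datetime, timedelta, timezone

GPS_EPOCH = datetime(1980, 1, 6, tzinfo=timezone.utc)
GPS_UTC_OFFSET = 18  # seconds; valid since the December 31, 2016 leap second

def gps_to_utc(week, tow_seconds, reference=None):
    """Convert a truncated 10-bit GPS week number and time-of-week to UTC.

    The legacy navigation message carries the week number modulo 1024, so
    the rollover is resolved against a reference date assumed to be within
    512 weeks (~9.8 years) of the true date.
    """
    if reference is None:
        reference = datetime.now(timezone.utc)
    # Full weeks elapsed from the GPS epoch to the reference date
    ref_week = (reference - GPS_EPOCH).days // 7
    # Pick the number of 1,024-week wraps that best matches the reference
    rollovers = round((ref_week - week) / 1024)
    full_week = week + 1024 * rollovers
    gps_time = GPS_EPOCH + timedelta(weeks=full_week, seconds=tow_seconds)
    return gps_time - timedelta(seconds=GPS_UTC_OFFSET)
```

With a mid-2019 reference date, week 0 resolves to the second rollover: `gps_to_utc(0, 0)` then yields 23:59:42 UTC on April 6, 2019, matching the rollover instant given in the text. (Applying the constant 18-second offset to historical dates before the 2016 leap second would of course be anachronistic.)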
Communication The navigational signals transmitted by GPS satellites encode a variety of information including satellite positions, the state of the internal clocks, and the health of the network. These signals are transmitted on two separate carrier frequencies that are common to all satellites in the network. Two different encodings are used: a public encoding that enables lower resolution navigation, and an encrypted encoding used by the U.S. military. Message format
{|class="wikitable" style="float:right; margin:0 0 0.5em 1em;" border="1"
|+
! Subframes !! Description
|-
| 1 || Satellite clock, GPS time relationship
|-
| 2–3 || Ephemeris (precise satellite orbit)
|-
| 4–5 || Almanac component (satellite network synopsis, error correction)
|}
Each GPS satellite continuously broadcasts a navigation message on L1 (C/A and P/Y) and L2 (P/Y) frequencies at a rate of 50 bits per second (see bitrate). Each complete message takes 750 seconds (12.5 minutes) to transmit. The message structure has a basic format of a 1500-bit-long frame made up of five subframes, each subframe being 300 bits (6 seconds) long. Subframes 4 and 5 are subcommutated 25 times each, so that a complete data message requires the transmission of 25 full frames. Each subframe consists of ten words, each 30 bits long. Thus, with 300 bits in a subframe times 5 subframes in a frame times 25 frames in a message, each message is 37,500 bits long. At a transmission rate of 50 bit/s, this gives 750 seconds to transmit an entire almanac message. Each 30-second frame begins precisely on the minute or half-minute as indicated by the atomic clock on each satellite. The first subframe of each frame encodes the week number and the time within the week, as well as the data about the health of the satellite. The second and the third subframes contain the ephemeris – the precise orbit for the satellite.
The fourth and fifth subframes contain the almanac, which contains coarse orbit and status information for up to 32 satellites in the constellation as well as data related to error correction. Thus, to obtain an accurate satellite location from this transmitted message, the receiver must demodulate the message from each satellite it includes in its solution for 18 to 30 seconds. To collect all transmitted almanacs, the receiver must demodulate the message for 732 to 750 seconds (12.2 to 12.5 minutes). All satellites broadcast at the same frequencies, encoding signals using unique code-division multiple access (CDMA) so receivers can distinguish individual satellites from each other. The system uses two distinct CDMA encoding types: the coarse/acquisition (C/A) code, which is accessible by the general public, and the precise (P(Y)) code, which is encrypted so that only the U.S. military and other NATO nations who have been given access to the encryption code can access it. The ephemeris is updated every 2 hours and is sufficiently stable for 4 hours, with provisions for updates every 6 hours or longer in non-nominal conditions. The almanac is typically updated every 24 hours. Additionally, data for a few weeks ahead is uploaded in case of transmission updates that delay the data upload. Satellite frequencies
{|class="wikitable" style="float:right; width:30em; margin:0 0 0.5em 1em;" border="1"
|+
! Band !! Frequency !! Description
|-
| L1 || 1575.42 MHz || Coarse-acquisition (C/A) and encrypted precision (P(Y)) codes, plus the L1 civilian (L1C) and military (M) codes on Block III and newer satellites.
|-
| L2 || 1227.60 MHz || P(Y) code, plus the L2C and military codes on the Block IIR-M and newer satellites.
|-
| L3 || 1381.05 MHz || Used for nuclear detonation (NUDET) detection.
|-
| L4 || 1379.913 MHz || Being studied for additional ionospheric correction.
|-
| L5 || 1176.45 MHz || Used as a civilian safety-of-life (SoL) signal on Block IIF and newer satellites.
|}
All satellites broadcast at the same two frequencies, 1.57542 GHz (L1 signal) and 1.2276 GHz (L2 signal). The satellite network uses a CDMA spread-spectrum technique where the low-bitrate message data is encoded with a high-rate pseudo-random (PRN) sequence that is different for each satellite. The receiver must be aware of the PRN codes for each satellite to reconstruct the actual message data. The C/A code, for civilian use, transmits data at 1.023 million chips per second, whereas the P code, for U.S. military use, transmits at 10.23 million chips per second. The actual internal reference of the satellites is 10.22999999543 MHz to compensate for relativistic effects that make observers on the Earth perceive a different time reference with respect to the transmitters in orbit. The L1 carrier is modulated by both the C/A and P codes, while the L2 carrier is only modulated by the P code. The P code can be encrypted as a so-called P(Y) code that is only available to military equipment with a proper decryption key. Both the C/A and P(Y) codes impart the precise time-of-day to the user. The L3 signal at a frequency of 1.38105 GHz is used to transmit data from the satellites to ground stations. This data is used by the United States Nuclear Detonation (NUDET) Detection System (USNDS) to detect, locate, and report nuclear detonations (NUDETs) in the Earth's atmosphere and near space. One usage is the enforcement of nuclear test ban treaties. The L4 band at 1.379913 GHz is being studied for additional ionospheric correction. The L5 frequency band at 1.17645 GHz was added in the process of GPS modernization. This frequency falls into an internationally protected range for aeronautical navigation, promising little or no interference under all circumstances. The first Block IIF satellite that provides this signal was launched in May 2010. On February 5, 2016, the 12th and final Block IIF satellite was launched.
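The per-satellite PRN sequences mentioned above are 1,023-chip Gold codes. The sketch below generates the civilian C/A code from two 10-stage shift registers (G1 and G2) with the standard feedback and phase-selection taps; it is written from the published description of the generator, so treat the function name and structure as illustrative rather than as any receiver's actual implementation.

```python
# G2 output tap pairs (1-indexed register stages) selecting each PRN's phase
G2_TAPS = {
    1: (2, 6),   2: (3, 7),   3: (4, 8),   4: (5, 9),   5: (1, 9),
    6: (2, 10),  7: (1, 8),   8: (2, 9),   9: (3, 10), 10: (2, 3),
    11: (3, 4), 12: (5, 6),  13: (6, 7),  14: (7, 8),  15: (8, 9),
    16: (9, 10), 17: (1, 4), 18: (2, 5),  19: (3, 6),  20: (4, 7),
    21: (5, 8), 22: (6, 9),  23: (1, 3),  24: (4, 6),  25: (5, 7),
    26: (6, 8), 27: (7, 9),  28: (8, 10), 29: (1, 6),  30: (2, 7),
    31: (3, 8), 32: (4, 9),
}

def ca_code(prn):
    """Generate the 1,023-chip C/A Gold code for a given PRN (1-32)."""
    t1, t2 = G2_TAPS[prn]
    g1 = [1] * 10  # both registers are initialized to all ones
    g2 = [1] * 10
    chips = []
    for _ in range(1023):
        # C/A chip = G1 output XOR the PRN-specific pair of G2 stages
        chips.append(g1[9] ^ g2[t1 - 1] ^ g2[t2 - 1])
        # G1 feedback taps: stages 3 and 10; G2: stages 2, 3, 6, 8, 9, 10
        f1 = g1[2] ^ g1[9]
        f2 = g2[1] ^ g2[2] ^ g2[5] ^ g2[7] ^ g2[8] ^ g2[9]
        g1 = [f1] + g1[:9]
        g2 = [f2] + g2[:9]
    return chips
```

A common sanity check: the first ten chips of PRN 1 are 1100100000 (octal 1440).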
The L5 signal consists of two carrier components that are in phase quadrature with each other. Each carrier component is bi-phase shift key (BPSK) modulated by a separate bit train. "L5, the third civil GPS signal, will eventually support safety-of-life applications for aviation and provide improved availability and accuracy." In 2011, a conditional waiver was granted to LightSquared to operate a terrestrial broadband service near the L1 band. Although LightSquared had applied for a license to operate in the 1525 to 1559 band as early as 2003 and it was put out for public comment, the FCC asked LightSquared to form a study group with the GPS community to test GPS receivers and identify issues that might arise due to the larger signal power from the LightSquared terrestrial network. The GPS community had not objected to the LightSquared (formerly MSV and SkyTerra) applications until November 2010, when LightSquared applied for a modification to its Ancillary Terrestrial Component (ATC) authorization. This filing (SAT-MOD-20101118-00239) amounted to a request to run several orders of magnitude more power in the same frequency band for terrestrial base stations, essentially repurposing what was supposed to be a "quiet neighborhood" for signals from space as the equivalent of a cellular network. Testing in the first half of 2011 demonstrated that the impact of the lower 10 MHz of spectrum is minimal to GPS devices (less than 1% of the total GPS devices are affected). The upper 10 MHz intended for use by LightSquared may have some impact on GPS devices. There is some concern that this may seriously degrade the GPS signal for many consumer uses. Aviation Week magazine reports that the latest testing (June 2011) confirms "significant jamming" of GPS by LightSquared's system. Demodulation and decoding Because all of the satellite signals are modulated onto the same L1 carrier frequency, the signals must be separated after demodulation.
This is done by assigning each satellite a unique binary sequence known as a Gold code. The signals are decoded after demodulation using addition of the Gold codes corresponding to the satellites monitored by the receiver. If the almanac information has previously been acquired, the receiver picks the satellites to listen for by their PRNs, unique numbers in the range 1 through 32. If the almanac information is not in memory, the receiver enters a search mode until a lock is obtained on one of the satellites. To obtain a lock, it is necessary that there be an unobstructed line of sight from the receiver to the satellite. The receiver can then acquire the almanac and determine the satellites it should listen for. As it detects each satellite's signal, it identifies it by its distinct C/A code pattern. There can be a delay of up to 30 seconds before the first estimate of position because of the need to read the ephemeris data. Processing of the navigation message enables the determination of the time of transmission and the satellite position at this time. For more information see Demodulation and Decoding, Advanced. Navigation equations Problem description The receiver uses messages received from satellites to determine the satellite positions and time sent. The x, y, and z components of satellite position and the time sent (s) are designated as [xi, yi, zi, si] where the subscript i denotes the satellite and has the value 1, 2, ..., n, where n ≥ 4. When the time of message reception indicated by the on-board receiver clock is t̃i, the true reception time is ti = t̃i − b, where b is the receiver's clock bias from the much more accurate GPS clocks employed by the satellites. The receiver clock bias is the same for all received satellite signals (assuming the satellite clocks are all perfectly synchronized). The message's transit time is t̃i − b − si, where si is the satellite time. Assuming the message traveled at the speed of light, c, the distance traveled is (t̃i − b − si)c.
For n satellites, the equations to satisfy are:

di = (t̃i − b − si)c,  i = 1, 2, ..., n

where di is the geometric distance or range between receiver and satellite i (the values without subscripts are the x, y, and z components of receiver position):

di = √((x − xi)² + (y − yi)² + (z − zi)²)

Defining pseudoranges as pi = (t̃i − si)c, we see they are biased versions of the true range: pi = di + bc, i = 1, 2, ..., n. Since the equations have four unknowns [x, y, z, b] (the three components of GPS receiver position and the clock bias), signals from at least four satellites are necessary to attempt solving these equations. They can be solved by algebraic or numerical methods. Existence and uniqueness of GPS solutions are discussed by Abell and Chaffee. When n is greater than four, this system is overdetermined and a fitting method must be used. The amount of error in the results varies with the received satellites' locations in the sky, since certain configurations (when the received satellites are close together in the sky) cause larger errors. Receivers usually calculate a running estimate of the error in the calculated position. This is done by multiplying the basic resolution of the receiver by quantities called the geometric dilution of precision (GDOP) factors, calculated from the relative sky directions of the satellites used. The receiver location is expressed in a specific coordinate system, such as latitude and longitude using the WGS 84 geodetic datum or a country-specific system. Geometric interpretation The GPS equations can be solved by numerical and analytical methods. Geometrical interpretations can enhance the understanding of these solution methods. Spheres The measured ranges, called pseudoranges, contain clock errors. In a simplified idealization in which the ranges are synchronized, these true ranges represent the radii of spheres, each centered on one of the transmitting satellites. The solution for the position of the receiver is then at the intersection of the surfaces of these spheres; see trilateration (more generally, true-range multilateration).
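The pseudorange equations above are commonly solved by an iterative least-squares (Gauss-Newton) method. The sketch below is a minimal illustration under simplified assumptions (no atmospheric or measurement errors; the function name and satellite geometry are invented for the example, not taken from any receiver): it linearizes pi = di + bc around the current estimate and updates until convergence.

```python
import numpy as np

def solve_position(sat_pos, pseudoranges, iters=20):
    """Gauss-Newton least-squares solution of the pseudorange equations.

    sat_pos: (n, 3) array of satellite positions in metres (ECEF);
    pseudoranges: (n,) array of measured pseudoranges p_i in metres.
    Returns the receiver position (x, y, z) and the clock-bias range b*c.
    """
    x = np.zeros(4)  # unknowns [x, y, z, b*c], starting at Earth's centre
    for _ in range(iters):
        los = sat_pos - x[:3]                  # line-of-sight vectors
        d = np.linalg.norm(los, axis=1)        # geometric ranges d_i
        residual = pseudoranges - (d + x[3])   # p_i - (d_i + b*c)
        # Jacobian of the model: negative unit line-of-sight vectors,
        # plus a column of ones for the clock-bias term
        H = np.hstack([-los / d[:, None], np.ones((len(d), 1))])
        x += np.linalg.lstsq(H, residual, rcond=None)[0]
    return x[:3], x[3]
```

With five or more satellites the system is overdetermined, and the same least-squares step simply fits all measurements, as described in the text.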
Signals from at least three satellites are required, and their three spheres would typically intersect at two points. One of the points is the location of the receiver, and the other moves rapidly in successive measurements and would not usually be on Earth's surface. In practice, there are many sources of inaccuracy besides clock bias, including random errors as well as the potential for precision loss
to the east, Austria to the southeast, and Switzerland to the south-southwest. France, Luxembourg and Belgium are situated to the west, with the Netherlands to the northwest. Germany is also bordered by the North Sea and, at the north-northeast, by the Baltic Sea. German territory covers , consisting of of land and of water. Elevation ranges from the mountains of the Alps (highest point: the Zugspitze at ) in the south to the shores of the North Sea () in the northwest and the Baltic Sea () in the northeast. The forested uplands of central Germany and the lowlands of northern Germany (lowest point: in the municipality Neuendorf-Sachsenbande, Wilstermarsch at below sea level) are traversed by such major rivers as the Rhine, Danube and Elbe. Significant natural resources include iron ore, coal, potash, timber, lignite, uranium, copper, natural gas, salt, and nickel. Climate Most of Germany has a temperate climate, ranging from oceanic in the north to continental in the east and southeast. Winters range from cold in the southern Alps to mild elsewhere, and are generally overcast with limited precipitation, while summers can vary from hot and dry to cool and rainy. The northern regions have prevailing westerly winds that bring in moist air from the North Sea, moderating the temperature and increasing precipitation. Conversely, the southeast regions have more extreme temperatures. From February 2019 – 2020, average monthly temperatures in Germany ranged from a low of in January 2020 to a high of in June 2019. Average monthly precipitation ranged from 30 litres per square metre in February and April 2019 to 125 litres per square metre in February 2020. Average monthly hours of sunshine ranged from 45 in November 2019 to 300 in June 2019. The highest temperature ever recorded in Germany was 42.6 °C on 25 July 2019 in Lingen and the lowest was −37.8 °C on 12 February 1929 in Wolnzach.
Biodiversity The territory of Germany can be divided into five terrestrial ecoregions: Atlantic mixed forests, Baltic mixed forests, Central European mixed forests, Western European broadleaf forests, and Alps conifer and mixed forests. 51% of Germany's land area is devoted to agriculture, while 30% is forested and 14% is covered by settlements or infrastructure. Plants and animals include those generally common to Central Europe. According to the National Forest Inventory, beeches, oaks, and other deciduous trees constitute just over 40% of the forests; roughly 60% are conifers, particularly spruce and pine. There are many species of ferns, flowers, fungi, and mosses. Wild animals include roe deer, wild boar, mouflon (a subspecies of wild sheep), fox, badger, hare, and small numbers of the Eurasian beaver. The blue cornflower was once a German national symbol. The 16 national parks in Germany include the Jasmund National Park, the Vorpommern Lagoon Area National Park, the Müritz National Park, the Wadden Sea National Parks, the Harz National Park, the Hainich National Park, the Black Forest National Park, the Saxon Switzerland National Park, the Bavarian Forest National Park and the Berchtesgaden National Park. In addition, there are 17 Biosphere Reserves, and 105 nature parks. More than 400 zoos and animal parks operate in Germany. The Berlin Zoo, which opened in 1844, is the oldest in Germany, and claims the most comprehensive collection of species in the world. Politics Germany is a federal, parliamentary, representative democratic republic. Federal legislative power is vested in the parliament consisting of the (Federal Diet) and (Federal Council), which together form the legislative body. The is elected through direct elections using the mixed-member proportional representation system. The members of the represent and are appointed by the governments of the sixteen federated states. 
The German political system operates under a framework laid out in the 1949 constitution known as the (Basic Law). Amendments generally require a two-thirds majority of both the and the ; the fundamental principles of the constitution, as expressed in the articles guaranteeing human dignity, the separation of powers, the federal structure, and the rule of law, are valid in perpetuity. The president, currently Frank-Walter Steinmeier, is the head of state and invested primarily with representative responsibilities and powers. He is elected by the (federal convention), an institution consisting of the members of the and an equal number of state delegates. The second-highest official in the German order of precedence is the (President of the Bundestag), who is elected by the and responsible for overseeing the daily sessions of the body. The third-highest official and the head of government is the chancellor, who is appointed by the after being elected by the party or coalition with the most seats in the . The chancellor, currently Olaf Scholz, is the head of government and exercises executive power through his Cabinet. Since 1949, the party system has been dominated by the Christian Democratic Union and the Social Democratic Party of Germany. So far every chancellor has been a member of one of these parties. However, the smaller liberal Free Democratic Party and the Alliance '90/The Greens have also been junior partners in coalition governments. Since 2007, the left-wing populist party The Left has been a staple in the German , though they have never been part of the federal government. In the 2017 German federal election, the right-wing populist Alternative for Germany gained enough votes to attain representation in the parliament for the first time. Constituent states Germany is a federal state and comprises sixteen constituent states which are collectively referred to as . 
Each state () has its own constitution, and is largely autonomous in regard to its internal organisation. Germany is divided into 401 districts () at a municipal level; these consist of 294 rural districts and 107 urban districts. Law Germany has a civil law system based on Roman law with some references to Germanic law. The (Federal Constitutional Court) is the German Supreme Court responsible for constitutional matters, with power of judicial review. Germany's supreme court system is specialised: for civil and criminal cases, the highest court of appeal is the inquisitorial Federal Court of Justice, and for other affairs the courts are the Federal Labour Court, the Federal Social Court, the Federal Finance Court and the Federal Administrative Court. Criminal and private laws are codified on the national level in the and the respectively. The German penal system seeks the rehabilitation of the criminal and the protection of the public. Except for petty crimes, which are tried before a single professional judge, and serious political crimes, all charges are tried before mixed tribunals on which lay judges () sit side by side with professional judges. Germany has a low murder rate with 1.18 murders per 100,000 . In 2018, the overall crime rate fell to its lowest since 1992. Foreign relations Germany has a network of 227 diplomatic missions abroad and maintains relations with more than 190 countries. Germany is a member of NATO, the OECD, the G8, the G20, the World Bank and the IMF. It has played an influential role in the European Union since its inception and has maintained a strong alliance with France and all neighbouring countries since 1990. Germany promotes the creation of a more unified European political, economic and security apparatus. The governments of Germany and the United States are close political allies. Cultural ties and economic interests have crafted a bond between the two countries resulting in Atlanticism. 
The development policy of Germany is an independent area of foreign policy. It is formulated by the Federal Ministry for Economic Cooperation and Development and carried out by the implementing organisations. The German government sees development policy as a joint responsibility of the international community. It was the world's second-biggest aid donor in 2019 after the United States. Military Germany's military, the , is organised into the (Army and special forces ), (Navy), (Air Force), (Joint Medical Service) and (Joint Support Service) branches. In absolute terms, German military expenditure is the eighth-highest in the world. In 2018, military spending was at $49.5 billion, about 1.2% of the country's GDP, well below the NATO target of 2%. , the has a strength of 184,001 active soldiers and 80,947 civilians. Reservists are available to the armed forces and participate in defence exercises and deployments abroad. Until 2011, military service was compulsory for men at age 18, but this has been officially suspended and replaced with a voluntary service. Since 2001 women may serve in all functions of service without restriction. According to the Stockholm International Peace Research Institute, Germany was the fourth-largest exporter of major arms in the world from 2014 to 2018. In peacetime, the is commanded by the Minister of Defence. In a state of defence, the Chancellor would become commander-in-chief of the . The role of the is described in the Constitution of Germany as defensive only, but after a ruling of the Federal Constitutional Court in 1994, the term "defence" has been defined to include not only protection of the borders of Germany but also crisis reaction and conflict prevention, or more broadly guarding the security of Germany anywhere in the world.
the German military has about 3,600 troops stationed in foreign countries as part of international peacekeeping forces, including about 1,200 supporting operations against Daesh, 980 in the NATO-led Resolute Support Mission in Afghanistan, and 800 in Kosovo. Economy Germany has a social market economy with a highly skilled labour force, a low level of corruption, and a high level of innovation. It is the world's third-largest exporter and third-largest importer of goods, and has the largest economy in Europe, which is also the world's fourth-largest economy by nominal GDP, and the fifth-largest by PPP. Its GDP per capita measured in purchasing power standards amounts to 121% of the EU27 average (100%). The service sector contributes approximately 69% of the total GDP, industry 31%, and agriculture 1% . The unemployment rate published by Eurostat amounts to 3.2% , which is the fourth-lowest in the EU. Germany is part of the European single market which represents more than 450 million consumers. In 2017, the country accounted for 28% of the Eurozone economy according to the International Monetary Fund. Germany introduced the common European currency, the Euro, in 2002. Its monetary policy is set by the European Central Bank, which is headquartered in Frankfurt. Being home to the modern car, the automotive industry in Germany is regarded as one of the most competitive and innovative in the world, and is the fourth-largest by production. The top ten exports of Germany are vehicles, machinery, chemical goods, electronic products, electrical equipment, pharmaceuticals, transport equipment, basic metals, food products, and rubber and plastics. Of the world's 500 largest stock-market-listed companies measured by revenue in 2019, the Fortune Global 500, 29 are headquartered in Germany. 30 major Germany-based companies are included in the DAX, the German stock market index which is operated by Frankfurt Stock Exchange.
Well-known international brands include Mercedes-Benz, BMW, Volkswagen, Audi, Siemens, Allianz, Adidas, Porsche, Bosch and Deutsche Telekom. Berlin is a hub for startup companies and has become the leading location for venture capital funded firms in the European Union. Germany is recognised for its large portion of specialised small and medium enterprises, known as the model. These companies represent 48% of global market leaders in their segments and are labelled hidden champions. Research and development efforts form an integral part of the German economy. In 2018 Germany ranked fourth globally in terms of number of science and engineering research papers published. Germany was ranked 9th in the Global Innovation Index in 2019 and 2020. Research institutions in Germany include the Max Planck Society, the Helmholtz Association, the Fraunhofer Society, and the Leibniz Association. Germany is the largest contributor to the European Space Agency. Infrastructure With its central position in Europe, Germany is a transport hub for the continent. Its road network is among the densest in Europe. The motorway (Autobahn) is widely known for having no general federally mandated speed limit for some classes of vehicles. The Intercity Express or ICE train network serves major German cities as well as destinations in neighbouring countries with speeds up to . The largest German airports are Frankfurt Airport and Munich Airport. The Port of Hamburg is one of the top twenty largest container ports in the world. , Germany was the world's seventh-largest consumer of energy. The government and the nuclear power industry agreed to phase out all nuclear power plants by 2021. It meets the country's power demands using 40% renewable sources. Germany is committed to the Paris Agreement and several other treaties promoting biodiversity, low emission standards, and water management. The country's household recycling rate is among the highest in the world, at around 65%.
The country's greenhouse gas emissions per capita were the ninth-highest in the EU . The German energy transition () is the recognised move to a sustainable economy by means of energy efficiency and renewable energy. Tourism Germany is the ninth most visited country in the world , with 37.4 million visits. Berlin has become the third most visited city destination in Europe. Domestic and international travel and tourism combined directly contribute over €105.3 billion to German GDP. Including indirect and induced impacts, the industry supports 4.2 million jobs. Germany's most visited and popular landmarks include Cologne Cathedral, the Brandenburg Gate, the Reichstag, the Dresden Frauenkirche, Neuschwanstein Castle, Heidelberg Castle, the Wartburg, and Sanssouci Palace. The Europa-Park near Freiburg is Europe's second most popular theme park resort. Demographics With a population of 80.2 million according to the 2011 German Census, rising to 83.1 million , Germany is the most populous country in the European Union, the second most populous country in Europe after Russia, and the nineteenth most populous country in the world. Its population density stands at 227 inhabitants per square kilometre (588 per square mile). The overall life expectancy in Germany at birth is 80.19 years (77.93 years for males and 82.58 years for females). The fertility rate of 1.41 children born per woman (2011 estimates) is below the replacement rate of 2.1 and is one of the lowest fertility rates in the world. Since the 1970s, Germany's death rate has exceeded its birth rate. However, Germany is witnessing increased birth rates and migration rates since the beginning of the 2010s. Germany has the third oldest population in the world, with an average age of 47.4 years. 
and water management. The country's household recycling rate is among the highest in the world, at around 65%. The country's greenhouse gas emissions per capita were the ninth-highest in the EU. The German energy transition is the recognised move to a sustainable economy by means of energy efficiency and renewable energy. Tourism Germany is the ninth most visited country in the world, with 37.4 million visits. Berlin has become the third most visited city destination in Europe. Domestic and international travel and tourism combined directly contribute over €105.3 billion to German GDP. Including indirect and induced impacts, the industry supports 4.2 million jobs. Germany's most visited and popular landmarks include Cologne Cathedral, the Brandenburg Gate, the Reichstag, the Dresden Frauenkirche, Neuschwanstein Castle, Heidelberg Castle, the Wartburg, and Sanssouci Palace. The Europa-Park near Freiburg is Europe's second most popular theme park resort. Demographics With a population of 80.2 million according to the 2011 German Census, rising to 83.1 million, Germany is the most populous country in the European Union, the second most populous country in Europe after Russia, and the nineteenth most populous country in the world. Its population density stands at 227 inhabitants per square kilometre (588 per square mile). The overall life expectancy in Germany at birth is 80.19 years (77.93 years for males and 82.58 years for females). The fertility rate of 1.41 children born per woman (2011 estimates) is below the replacement rate of 2.1 and is one of the lowest fertility rates in the world. Since the 1970s, Germany's death rate has exceeded its birth rate. However, Germany has seen increased birth and migration rates since the beginning of the 2010s. Germany has the third oldest population in the world, with an average age of 47.4 years. 
Four sizeable groups of people are referred to as "national minorities" because their ancestors have lived in their respective regions for centuries: there is a Danish minority in the northernmost state of Schleswig-Holstein; the Sorbs, a Slavic population, live in the Lusatia region of Saxony and Brandenburg; the Roma and Sinti live throughout the country; and the Frisians are concentrated on Schleswig-Holstein's western coast and in the north-western part of Lower Saxony. After the United States, Germany is the second most popular immigration destination in the world. The majority of migrants live in western Germany, particularly in urban areas. Of the country's residents, 18.6 million people (22.5%) were of immigrant or partially immigrant descent in 2016 (including persons descending or partially descending from ethnic German repatriates). In 2015, the Population Division of the United Nations Department of Economic and Social Affairs listed Germany as host to the second-highest number of international migrants worldwide, about 5% (12 million) of all 244 million migrants. Germany ranks seventh amongst EU countries in terms of the percentage of migrants in its population, at 13.1%. Germany has a number of large cities. There are 11 officially recognised metropolitan regions. The country's largest city is Berlin, while its largest urban area is the Ruhr. Religion According to the 2011 census, Christianity was the largest religion in Germany, with 66.8% of respondents identifying as Christian, of whom 3.8% were not church members. 31.7% declared themselves Protestants, including members of the Evangelical Church in Germany (which encompasses Lutheran, Reformed, and administrative or confessional unions of both traditions) and the free churches; 31.2% declared themselves Roman Catholics, and Orthodox believers constituted 1.3%. 
According to data from 2016, the Catholic Church and the Evangelical Church claimed 28.5% and 27.5% of the population, respectively. Islam is the second-largest religion in the country. In the 2011 census, 1.9% of respondents (1.52 million people) gave their religion as Islam, but this figure is deemed unreliable because a disproportionate number of adherents of this faith (and of other religions, such as Judaism) are likely to have made use of their right not to answer the question. Most Muslims are Sunnis and Alevites from Turkey, but there are small numbers of Shia Muslims, Ahmadis and members of other denominations. Other religions comprise less than one percent of Germany's population. A 2018 study estimated that 38% of the population are not members of any religious organization or denomination, though up to a third may still consider themselves religious. Irreligion in Germany is strongest in the former East Germany, which used to be predominantly Protestant before the enforcement of state atheism, and in major metropolitan areas. Languages German is the official and predominant spoken language in Germany. It is one of the 24 official and working languages of the European Union, and one of the three procedural languages of the European Commission. German is the most widely spoken first language in the European Union, with around 100 million native speakers. Recognised native minority languages in Germany are Danish, Low German, Low Rhenish, Sorbian, Romany, North Frisian and Saterland Frisian; they are officially protected by the European Charter for Regional or Minority Languages. The most used immigrant languages are Turkish, Arabic, Kurdish, Polish, the Balkan languages and Russian. Germans are typically multilingual: 67% of German citizens claim to be able to communicate in at least one foreign language and 27% in at least two. Education Responsibility for educational supervision in Germany is primarily organised within the individual states. 
Optional kindergarten education is provided for all children between three and six years old, after which school attendance is compulsory for at least nine years. Primary education usually lasts for four to six years. Secondary schooling is divided into tracks based on whether students pursue academic or vocational education. A system of apprenticeship leads to a skilled qualification that is almost comparable to an academic degree; it allows students in vocational training to learn in a company as well as in a state-run trade school. This model is well regarded and has been reproduced around the world. Most German universities are public institutions, and students traditionally study without tuition fees. The general requirement for university admission is a higher-education entrance qualification. According to an OECD report in 2014, Germany is the world's third leading destination for international study. The established universities in Germany include some of the oldest in the world, with Heidelberg University (established in 1386) being the oldest. The Humboldt University of Berlin, founded in 1810 by the liberal educational reformer Wilhelm von Humboldt, became the academic model for many Western universities. In the contemporary era Germany has developed eleven Universities of Excellence. Health Germany's system of hospitals dates from medieval times, and today Germany has the world's oldest universal health care system, dating from Bismarck's social legislation of the 1880s. Since the 1880s, reforms and provisions have ensured a balanced health care system. The population is covered by a health insurance plan provided by statute, with criteria allowing some groups to opt for a private health insurance contract. According to the World Health Organization, Germany's health care system was 77% government-funded and 23% privately funded. In 2014, Germany spent 11.3% of its GDP on health care. 
In 2013 Germany ranked 20th in the world in life expectancy, with 77 years for men and 82 years for women, and it had a very low infant mortality rate (4 per 1,000 live births). The principal cause of death was cardiovascular disease, at 37%. Obesity in Germany has been increasingly cited as a major health issue; a 2014 study showed that 52 percent of the adult German population was overweight or obese. Culture Culture in German states has been shaped by major intellectual and popular currents in Europe, both religious and secular. Historically, Germany has been called 'the land of poets and thinkers' because of the major role its scientists, writers and philosophers have played in the development of Western thought. A global opinion poll for the BBC found that Germany was recognised for having the most positive influence in the world in 2013 and 2014. Germany is well known for such folk festival traditions as Oktoberfest and for Christmas customs, which include Advent wreaths, Christmas pageants, Christmas trees, Stollen cakes, and other practices. UNESCO has inscribed 41 properties in Germany on the World Heritage List. There are a number of public holidays in Germany determined by each state; 3 October has been a national day of Germany since 1990, celebrated as German Unity Day. Music German classical music includes works by some of the world's most well-known composers. Dieterich Buxtehude, Johann Sebastian Bach and Georg Friedrich Händel were influential composers of the Baroque period. Ludwig van Beethoven was a crucial figure in the transition between the Classical and Romantic eras. Carl Maria von Weber, Felix Mendelssohn, Robert Schumann and Johannes Brahms were significant Romantic composers. Richard Wagner was known for his operas. Richard Strauss was a leading composer of the late Romantic and early modern eras. Karlheinz Stockhausen and Wolfgang Rihm are important composers of the 20th and early 21st centuries. 
As of 2013, Germany was the second-largest music market in Europe, and the fourth-largest in the world. German popular music of the 20th and 21st centuries includes the movements of Neue Deutsche Welle, pop, Ostrock, heavy metal/rock, punk, pop rock, indie, Volksmusik (folk music), schlager pop and German hip hop. German electronic music gained global influence, with Kraftwerk and Tangerine Dream pioneering in this genre. DJs and artists of the techno and house music scenes of Germany have become well known (e.g. Paul van Dyk, Felix Jaehn, Paul Kalkbrenner, Robin Schulz and Scooter). Art and design German painters have influenced Western art. Albrecht Dürer, Hans Holbein the Younger, Matthias Grünewald and Lucas Cranach the Elder were important German artists of the Renaissance, Johann Baptist Zimmermann of the Baroque, Caspar David Friedrich and Carl Spitzweg of Romanticism, Max Liebermann of Impressionism and Max Ernst of Surrealism. Several German art groups formed in the 20th century; The Bridge and The Blue Rider influenced the development of expressionism in Munich and Berlin. The New Objectivity arose in response to expressionism during the Weimar Republic. After World War II, broad trends in German art include neo-expressionism and the New Leipzig School. Architectural contributions from Germany include the Carolingian and Ottonian styles, which were precursors of Romanesque. Brick Gothic is a distinctive medieval style that evolved in Germany. Regional and typically German elements also evolved in Renaissance and Baroque art (e.g. the Weser Renaissance). Vernacular architecture in Germany is often identified by its timber-framing traditions and varies across regions and among carpentry styles. When industrialisation spread across Europe, classicism and a distinctive style of historicism developed in Germany. Expressionist architecture developed in the 1910s in Germany and influenced Art Deco and other modern styles. 
Germany was particularly important in the early modernist movement: it is the home of the Werkbund, initiated by Hermann Muthesius (New Objectivity), and of the Bauhaus movement founded by Walter Gropius. Ludwig Mies van der Rohe became one of the world's most renowned architects in the second half of the 20th century; he conceived of the glass-façade skyscraper. Renowned contemporary architects and offices include Pritzker Prize winners Gottfried Böhm and Frei Otto. German designers became early leaders of modern product design. The Berlin Fashion Week and the fashion trade fair Bread & Butter are held twice a year. Literature and philosophy German literature can be traced back to the Middle Ages and the works of writers such as Walther von der Vogelweide and Wolfram von Eschenbach. Well-known German authors include Johann Wolfgang von Goethe, Friedrich Schiller, Gotthold Ephraim Lessing and Theodor Fontane. The collections of folk tales published by the Brothers Grimm popularised German folklore on an international level. The Grimms also gathered and codified regional variants of the German language, grounding their work in historical principles; their German Dictionary, sometimes called the Grimm dictionary, was begun in 1838, with the first volumes published in 1854. Influential authors of the 20th century include Gerhart Hauptmann, Thomas Mann, Hermann Hesse, Heinrich Böll and Günter Grass. The German book market is the third-largest in the world, after the United States and China. The Frankfurt Book Fair is the most important in the world for international deals and trading, with a tradition spanning over 500 years. The Leipzig Book Fair also retains a major position in Europe. 
German philosophy is historically significant: Gottfried Leibniz's contributions to rationalism; the Enlightenment philosophy of Immanuel Kant; the establishment of classical German idealism by Johann Gottlieb Fichte, Georg Wilhelm Friedrich Hegel and Friedrich Wilhelm Joseph Schelling; Arthur Schopenhauer's composition of metaphysical pessimism; the formulation of communist theory by Karl Marx and Friedrich Engels; Friedrich Nietzsche's development of perspectivism; Gottlob Frege's contributions to the dawn of analytic philosophy; Martin Heidegger's works on Being; Oswald Spengler's historical philosophy; and the development of the Frankfurt School have all been particularly influential. Media The largest internationally operating media companies in Germany are the Bertelsmann enterprise, Axel Springer SE and ProSiebenSat.1 Media. Germany's television market is the largest in Europe, with some 38 million TV households. Around 90% of German households have cable or satellite TV, with a variety of free-to-view public and commercial channels. There are more than 300 public and private radio stations in Germany; Germany's national radio network is the Deutschlandradio and the public Deutsche Welle
Casino activity is considerable, with several casinos located in different parts of the Zona Viva. The area around the East market is being redeveloped. Within the financial district are the tallest buildings in the country, including Club Premier, Tinttorento, the Atlantis building, Atrium, Tikal Futura, the Building of Finances, Towers Building Batteries, Torres Botticelli, Tadeus, the INTECAP building, Royal Towers, Towers Geminis, the Industrial Bank towers, the Holiday Inn Hotel and Premier of the Americas, among many others used for offices, apartments and other purposes. Also included are projects such as Zona Pradera and Interamerica's World Financial Center. One of the most outstanding mayors was the engineer Martin Prado Vélez, who took office in 1949 and governed the city under the reformist presidents Juan José Arévalo and Jacobo Arbenz Guzman, although he was not a member of the ruling party at the time and was elected due to his well-known capabilities. Of cobanero origin and married to Marta Cobos, he studied at the University of San Carlos; under his tenure, among other modernist works, infrastructure projects included the El Incienso bridge, the construction of Roosevelt Avenue (the main east-west road axis of the city), the town hall building, and numerous road works that widened the colonial city, ordered it along the cardinal points and created a ring road with the first cloverleaf interchange in the city. In an attempt to control the rapid growth of the city, the municipal government (Municipalidad de Guatemala), headed by longtime mayor Álvaro Arzú, has implemented a plan to focus growth along important arterial roads and to apply transit-oriented development (TOD) characteristics. This plan, denominated POT (Plan de Ordenamiento Territorial), aims to allow taller mixed-use buildings to be built next to large arterial roads, with height and density gradually declining farther away from them. 
Because the airport lies in the south of the city, height limits based on aeronautical considerations have also been applied to the construction code; these cap the maximum height for a building, with different limits applying in Zone 10 and in Zone 1. Climate Despite its location in the tropics, Guatemala City's relatively high altitude moderates average temperatures. The city has a tropical savanna climate (Köppen Aw) bordering on a subtropical highland climate (Cwb). Guatemala City is generally very warm, almost springlike, throughout the course of the year. It occasionally gets hot during the dry season, but not as hot and humid as in Central American cities at sea level. The hottest month is April. The rainy season extends from May to October, coinciding with the tropical storm and hurricane season in the western Atlantic Ocean and Caribbean Sea, while the dry season extends from November to April. The city can at times be windy, which also leads to lower ambient temperatures. The city's average relative humidity is 82% in the morning and 58% in the evening. Volcanic activity Four stratovolcanoes are visible from the city, two of them active. The nearest and most active is Pacaya, which at times erupts a considerable amount of ash. These volcanoes lie to the south of the Valle de la Ermita, providing a natural barrier between Guatemala City and the Pacific lowlands that define the southern regions of Guatemala. Agua, Fuego, Pacaya and Acatenango comprise a line of 33 stratovolcanoes that stretches across the breadth of Guatemala, from the Salvadoran border to the Mexican border. Earthquakes Lying on the Ring of Fire, the Guatemalan highlands and the Valle de la Ermita are frequently shaken by large earthquakes. 
The last large tremor to hit the Guatemala City region occurred in 1976, on the Motagua Fault, a left-lateral strike-slip fault that forms the boundary between the Caribbean Plate and the North American Plate. The 1976 event registered 7.5 on the moment magnitude scale. Smaller, less severe tremors are frequently felt in Guatemala City and environs. Mudslides Torrential downpours, similar to the more famous monsoons, occur frequently in the Valle de la Ermita during the rainy season, leading to flash floods that sometimes inundate the city. Because of these heavy rainfalls, some of the slums perched on the steep edges of the canyons that criss-cross the Valle de la Ermita are washed away and buried under mudslides, as in October 2005. Tropical waves, tropical storms and hurricanes sometimes strike the Guatemalan highlands, bringing torrential rains to the Guatemala City region and triggering these deadly mudslides. Piping pseudokarst In February 2007, a very large, deep circular hole with vertical walls opened in northeastern Guatemala City, killing five people. This sinkhole, classified by geologists as either a "piping feature" or "piping pseudokarst", was apparently created by fluid from a sewer eroding the loose volcanic ash, limestone, and other pyroclastic deposits that underlie Guatemala City. As a result, one thousand people were evacuated from the area. City Hall has since mitigated this piping feature by providing proper maintenance to the sewerage collection system, and plans to develop the site have been proposed. However, critics believe municipal authorities have neglected needed maintenance on the city's aging sewerage system and have speculated that more dangerous piping features are likely to develop unless action is taken. Three years later, the 2010 Guatemala City sinkhole appeared. Demographics It is estimated that the population of Guatemala City proper is about 1 million, while its urban area holds almost 3 million. 
The growth of the city's population has been robust, abetted by the mass migration of Guatemalans from the rural hinterlands to the largest and most vibrant regional economy in Guatemala. The inhabitants of Guatemala City are remarkably diverse given the size of the city, with those of Spanish and Mestizo descent being the most numerous. Guatemala City also has sizable indigenous populations, divided among the 23 distinct Mayan groups present in Guatemala. Numerous Mayan languages are now spoken in certain quarters of the city, making it a linguistically rich area. Foreigners and foreign immigrants comprise the final distinct group of inhabitants, representing a very small minority among the city's denizens. Due to mass migration from impoverished rural districts wracked by political instability, Guatemala City's population has exploded since the 1970s, severely straining the existing bureaucratic and physical infrastructure of the city. As a result, chronic traffic congestion, shortages of safe potable water in some areas of the city, and a sudden and prolonged surge in crime have become perennial problems. The infrastructure, although continuing to grow and improve in some areas, is lagging in relation to the increasing population of rural migrants, who tend to be poorer. Communications Guatemala City is headquarters to many communications and telecom companies, among them Tigo, Claro-Telgua, and Movistar-Telefónica. These companies also offer cable television, internet services and telephone access. Due to Guatemala City's large and concentrated consumer base in comparison to the rest of the country, these telecom and communications companies provide most of their services and offerings within the confines of the city. There are also seven local television channels, in addition to numerous international channels. 
The international channels range from children's programming, like Nickelodeon and the Disney Channel, to more adult offerings, such as E! and HBO. While international programming is dominated by entertainment from the United States, domestic programming is dominated by shows from Mexico. Due to its small and relatively income-restricted domestic market, Guatemala City produces very little of its own programming outside of local news and sports. Economy and finance Guatemala City, as the capital, is home to Guatemala's central bank, from which Guatemala's monetary and fiscal policies are formulated and promulgated. Guatemala City is also headquarters to numerous regional private banks, among them Citibank, Banco Agromercantil, Banco Promerica, Banco Industrial, Banco GyT Continental, Banco de Antigua, Banco Reformador, Banrural, Grupo Financiero de Occidente, BAC Credomatic, and Banco Internacional. By far the richest and most powerful regional economy within Guatemala, Guatemala City is the largest market for goods and services, which provides the greatest number of investment opportunities for public and private investors in all of Guatemala. Financing for these investments is provided by the regional private banks, as well as through foreign direct investment, mostly coming from the United States. Guatemala City's ample consumer base and service sector are represented by the large department store chains present in the city, among them Siman, Hiper Paiz & Paiz (Walmart), Price Smart, ClubCo, Cemaco, Sears and Office Depot. Places of interest by zones Guatemala City is divided into 22 zones in accordance with the urban layout plan designed by Raúl Aguilar Batres. Each zone has its own streets and avenues, facilitating navigation within the city. Zones are numbered 1 through 25; however, numbers 20, 22 and 23 have not been designated, so those zones do not exist within the city proper. 
Transportation Renovated and expanded, La Aurora International Airport lies to the south of the city center. La Aurora serves as Guatemala's principal air hub. Public transport is provided by buses and supplemented by a BRT system. The three main highways that serve Guatemala start in the city: CA9, the Transoceanic Highway, from Puerto San Jose to Puerto Santo Tomas de Castilla; CA1, the Panamerican Highway, from the Mexican border to the Salvadoran border; and the highway to Peten. Construction of freeways and underpasses by the municipal government, the implementation of reversible lanes during peak rush-hour traffic, and the establishment of the Department of Metropolitan Transit Police (PMT) have helped improve traffic flow in the city. Despite these municipal efforts, the Guatemala City metropolitan area still faces growing traffic congestion. A BRT (bus rapid transit) system called Transmetro, consisting of special-purpose lanes for high-capacity buses, began operating in 2007 and aims to improve traffic flow in the city through an efficient mass transit system. The system consists of five lines and is expected to be expanded to around 10 lines, with some lines expected to exceed capacity being considered for conversion to light or heavy metro. Traditional buses are now required to discharge passengers at transfer stations at the city's edge, where riders board the Transmetro; this is being implemented as new Transmetro lines become established. In conjunction with the new mass transit implementation in
example, the INGUAT Office on "7a Av. 1-17, Zona 4" is a building located on Avenida 7, 17 meters from the intersection with Calle 1, toward Calle 2, in Zone 4. 7a Av. 1-17, Zona 4 and 7a Av. 1-17, Zona 10 are two radically different addresses. Short streets and avenues do not get a new sequence number; for example, 6A Calle is a short street between 6a and 7a Calle. Some avenidas or calles also have a name in addition to their number if they are very wide; for example, Avenida la Reforma is the avenue that separates Zones 9 and 10, and Calle Montúfar is Calle 12 in Zone 9. Calle 1, Avenida 1, Zona 1 is the center of every city in Guatemala. Zone One is the Historic Center (Centro Histórico), lying in the very heart of the city, the location of many important historic buildings including the Palacio Nacional de la Cultura (National Palace of Culture), the Metropolitan Cathedral, the National Congress, the Casa Presidencial (Presidential House), the National Library and the Plaza de la Constitución (Constitution Plaza, the old Central Park). Efforts to revitalize this important part of the city have been undertaken by the municipal government. Besides the parks, the city offers a portfolio of entertainment in the region, focused on the so-called Zona Viva and the Calzada Roosevelt, as well as four degrees North. 
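The addressing scheme described above is regular enough to parse mechanically: the street number and type come first, then the nearest numbered cross street and the distance in meters past that corner, then the zone. As a minimal sketch (the function name, field names, and regular expression below are illustrative assumptions, not an official format specification):

```python
import re

# Hypothetical parser for addresses like "7a Av. 1-17, Zona 4":
# street 7a Avenida, 17 meters past the corner with Calle 1, in Zone 4.
PATTERN = re.compile(
    r"(?P<street_num>\d+)a?\s+(?P<kind>Av\.|Calle)\s+"
    r"(?P<cross>\d+)-(?P<meters>\d+),\s*Zona\s+(?P<zone>\d+)"
)

def parse_address(text):
    """Split an address into street, cross street, distance, and zone."""
    m = PATTERN.match(text)
    if m is None:
        raise ValueError(f"unrecognised address: {text!r}")
    return {
        "street": f"{m['kind']} {m['street_num']}",
        "cross_street": int(m["cross"]),         # nearest numbered cross street
        "meters_from_corner": int(m["meters"]),  # metres past that corner
        "zone": int(m["zone"]),
    }

print(parse_address("7a Av. 1-17, Zona 4"))
```

Note that the zone is an essential part of the result: as the text explains, the same street-and-number pair recurs in every zone, so an address is ambiguous without it.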
One of the most outstanding mayors was the engineer Martín Prado Vélez, who took office in 1949 and governed the city under the reformist presidents Juan José Arévalo and Jacobo Árbenz Guzmán, although he was not a member of the ruling party at the time and was elected for his well-known capabilities. A native of Cobán, married to Marta Cobos, he studied at the University of San Carlos; under his tenure, infrastructure projects included the El Incienso bridge, the construction of Roosevelt Avenue (the city's main east-west road axis), the town hall building, and numerous road works that widened the colonial city, organized it along the cardinal points, and created a ring road with the first cloverleaf interchange in the city. In an attempt to control the rapid growth of the city, the municipal government (Municipalidad de Guatemala), headed by longtime mayor Álvaro Arzú, has implemented a plan to focus growth along important arterial roads and apply transit-oriented development (TOD) principles. This plan, known as POT (Plan de Ordenamiento Territorial), aims to allow taller mixed-use buildings to be built next to large arterial roads, with permitted height and density gradually declining with distance from them. In addition, because the airport lies in the south of the city, height limits based on aeronautical considerations have been applied in the construction code; these limit the maximum height for a building, at in Zone 10, up to in Zone 1. Climate Despite its location in the tropics, Guatemala City's relatively high altitude moderates average temperatures. The city has a tropical savanna climate (Köppen Aw) bordering on a subtropical highland climate (Cwb). Guatemala City is generally mild, almost springlike, throughout the course of the year.
It occasionally gets hot during the dry season, but not as hot and humid as in Central American cities at sea level. The hottest month is April. The rainy season extends from May to October, coinciding with the tropical storm and hurricane season in the western Atlantic Ocean and Caribbean Sea, while the dry season extends from November to April. The city can at times be windy, which also leads to lower ambient temperatures. The city's average annual temperature ranges are during the day and at night; its average relative humidity is 82% in the morning and 58% in the evening; and its average dew point is . Volcanic activity Four stratovolcanoes are visible from the city, two of them active. The nearest and most active is Pacaya, which at times erupts a considerable amount of ash. These volcanoes lie to the south of the Valle de la Ermita, providing a natural barrier between Guatemala City and the Pacific lowlands that define the southern regions of Guatemala. Agua, Fuego, Pacaya and Acatenango belong to a chain of 33 stratovolcanoes that stretches across the breadth of Guatemala, from the Salvadoran border to the Mexican border. Earthquakes Lying on the Ring of Fire, the Guatemalan highlands and the Valle de la Ermita are frequently shaken by large earthquakes. The last large tremor to hit the Guatemala City region occurred in 1976, on the Motagua Fault, a left-lateral strike-slip fault that forms the boundary between the Caribbean Plate and the North American Plate. The 1976 event registered 7.5 on the moment magnitude scale. Smaller, less severe tremors are frequently felt in Guatemala City and environs. Mudslides Torrential downpours, similar to the more famous monsoons, occur frequently in the Valle de la Ermita during the rainy season, leading to flash floods that sometimes inundate the city. Due to these heavy rainfalls, some of the slums perched on the steep edges of the canyons that criss-cross the Valle de la Ermita are
an extensive collection of free software (383 packages as of January 2022), which can be used as an operating system or can be used in parts with other operating systems. The use of the completed GNU tools led to the family of operating systems popularly known as Linux. Most of GNU is licensed under the GNU Project's own General Public License (GPL). GNU is also the project within which the free software concept originated. Richard Stallman, the founder of the project, views GNU as a "technical means to a social end". Relatedly, Lawrence Lessig states in his introduction to the second edition of Stallman's book Free Software, Free Society that in it Stallman has written about "the social aspects of software and how Free Software can create community and social justice". Name GNU is a recursive acronym for "GNU's Not Unix!", chosen because GNU's design is Unix-like, but differs from Unix by being free software and containing no Unix code. Stallman chose the name by using various plays on words, including the song The Gnu. History Development of the GNU operating system was initiated by Richard Stallman while he worked at the MIT Artificial Intelligence Laboratory. It was called the GNU Project, and was publicly announced on September 27, 1983, on the net.unix-wizards and net.usoft newsgroups by Stallman. Software development began on January 5, 1984, when Stallman quit his job at the lab so that it could not claim ownership of GNU components or interfere with their distribution as free software. The goal was to bring a completely free software operating system into existence. Stallman wanted computer users to be free to study the source code of the software they use, share software with other people, modify the behavior of software, and publish their modified versions of the software. This philosophy was published as the GNU Manifesto in March 1985.
Richard Stallman's experience with the Incompatible Timesharing System (ITS), an early operating system written in assembly language that became obsolete with the discontinuation of the PDP-10, the computer architecture for which ITS was written, led to a decision that a portable system was necessary. It was thus decided that development would start using C and Lisp as system programming languages, and that GNU would be compatible with Unix. At the time, Unix was already a popular proprietary operating system. The design of Unix was modular, so it could be reimplemented piece by piece. Much of the needed software had to be written from scratch, but existing compatible third-party free software components were also used, such as the TeX typesetting system, the X Window System, and the Mach microkernel that forms the basis of the GNU Mach core of GNU Hurd (the official kernel of GNU). With the exception of the aforementioned third-party components, most of GNU has been written by volunteers; some in their spare time, some paid by companies, educational institutions, and other non-profit organizations. In October 1985, Stallman set up the Free Software Foundation (FSF). In the late 1980s and 1990s, the FSF hired software developers to write the software needed for GNU. As GNU gained prominence, interested businesses began contributing to development or selling GNU software and technical support. The most prominent and successful of these was Cygnus Solutions, now part of Red Hat. Components The system's basic components include the GNU Compiler Collection (GCC), the GNU C library (glibc), and GNU Core Utilities (coreutils), as well as the GNU Debugger (GDB), GNU Binary Utilities (binutils), and the GNU Bash shell. GNU developers have contributed to Linux ports of GNU applications and utilities, which are now also widely used on other operating systems such as BSD variants, Solaris and macOS.
Many GNU programs have been ported to other operating systems, including proprietary platforms such as Microsoft Windows and macOS. GNU programs have been shown to be
The GNU Project has endorsed Linux-libre distributions, such as gNewSense, Trisquel and Parabola GNU/Linux-libre. With non-GNU kernels Because of the development status of Hurd, GNU is usually paired with other kernels such as Linux or FreeBSD. Whether the combination of GNU libraries with external kernels constitutes a GNU operating system with a kernel (e.g. GNU with Linux) or a kernel with a GNU layer on top (i.e. Linux with GNU) is a matter of ongoing debate: the first view holds that the GNU collection renders the kernel into a usable operating system as understood in modern software development, while the second holds that the kernel can operate a machine without GNU. The FSF maintains that an operating system built using the Linux kernel and GNU tools and utilities should be considered a variant of GNU, and promotes the term GNU/Linux for such systems (leading to the GNU/Linux naming controversy). This view is not exclusive to the FSF. Notably, Debian, one of the biggest and oldest Linux distributions, refers to itself as Debian GNU/Linux. Copyright, GNU licenses, and stewardship The GNU Project recommends that contributors assign the copyright for GNU packages to the Free Software Foundation, though the Free Software Foundation considers it acceptable to release small changes to an existing project to the public domain. However, this is not required; package maintainers may retain copyright to the GNU packages they maintain, though since only the copyright holder may enforce the license used (such as the GNU GPL), the copyright holder in this case enforces it rather than the Free Software Foundation. For the development of needed software, Stallman wrote a license called the GNU General Public License (first called the Emacs General Public License), with the goal of guaranteeing users the freedom to share and change free software.
Stallman wrote this license after his experience with James Gosling and a company called UniPress, over a controversy around software code use in the GNU Emacs program. For most of the 1980s, each GNU package had its own license: the Emacs General Public License, the GCC General Public License, etc. In 1989, the FSF published a single license that could be used for all its software, and which could also be used by non-GNU projects: the GNU General Public License (GPL). This license is now used by most GNU software, as well as a large number of free software programs that are not part of the GNU Project; it also historically has been the most commonly used free software license (though recently challenged by the MIT license). It gives all recipients of a program the right to run, copy, modify and distribute it, while forbidding them from imposing further restrictions on any copies they distribute. This idea is often referred to as copyleft. In 1991, the GNU Lesser General Public License (LGPL), then known as the Library General Public License, was written for the GNU C Library to allow it to be linked with proprietary software. 1991 also saw the release of version 2 of the GNU GPL. The GNU Free Documentation License (FDL), for documentation, followed in 2000. The GPL and LGPL were revised to version 3 in 2007, adding clauses to protect users against hardware restrictions that prevent them from running modified software on their own devices. Besides GNU's packages, the GNU Project's licenses are used by many unrelated projects, such as the Linux kernel, which is often used with GNU software. A minority of the software used by most Linux distributions, such as the X Window System, is licensed under permissive free software licenses. Logo The logo for GNU is a gnu head, originally drawn by Etienne Suvasa; a bolder and simpler version designed by Aurelio Heckert is now preferred.
It appears in GNU software and in printed and electronic documentation for the GNU Project, and is also used in Free Software Foundation materials. There was also a modified version of the official logo. It was created by the Free Software Foundation in September 2013 in order to commemorate the 30th anniversary of
is usually by the steady transformation of a whole species into a new one (through a process called anagenesis). In this view no clear line of demarcation exists between an ancestral species and a descendant species, unless splitting occurs. Punctuated gradualism is a microevolutionary hypothesis that refers to a species that has "relative stasis over a considerable part of its total duration [and] underwent periodic, relatively rapid, morphologic change that did not lead to lineage branching". It is one of the three common models of evolution. While the traditional model of palaeontology, the phylogenetic model, states that features evolved slowly without any direct association with speciation, the relatively newer and more controversial idea of punctuated equilibrium claims that major evolutionary changes don't happen over a gradual period but in localized, rare, rapid events of branching speciation. Punctuated gradualism is considered to be a variation of these models, lying somewhere in between the phyletic gradualism model and the punctuated equilibrium model. It states that speciation is not needed for a lineage to rapidly evolve from one equilibrium to another but may show rapid transitions between long-stable states. Contradictorial gradualism is the paraconsistent treatment of fuzziness developed by Lorenzo Peña which regards true contradictions as situations wherein a state of affairs enjoys only partial existence. Gradualism in social change implemented through reformist means is a moral principle to which the Fabian Society is committed. In a more general way, reformism is the assumption that gradual changes through
from reformism, with the former insisting that short-term goals need to be formulated and implemented in such a way that they inevitably lead into long-term goals. It is most commonly associated with the libertarian socialist concept of dual power and is seen as a middle way between reformism and revolutionism. Martin Luther King Jr. was opposed to the idea of gradualism as a method of eliminating segregation. The United States government wanted to try to integrate African-Americans and European-Americans slowly into the same society, but many believed it was a way for the government to put off actually doing anything about racial segregation. Linguistics and language change In linguistics, language change is seen as gradual, the product of chain reactions and subject to cyclic drift. The view that creole languages are the product of catastrophism is heavily disputed. Morality Christianity Buddhism, Theravada and Yoga Gradualism is the approach of certain schools of Buddhism and other Eastern philosophies (e.g. Theravada or Yoga), that enlightenment can be achieved step by step, through an arduous practice. The opposite approach, that insight is attained all at once, is called subitism. The debate on the issue was very important to the history of the development of Zen, which rejected gradualism, and to the establishment of the opposite approach within Tibetan Buddhism, after the Debate of Samye. It was continued in other schools of Indian and Chinese philosophy. Types Phyletic gradualism is a model of evolution which theorizes that most speciation is slow, uniform and gradual.
write the Greek language Greek Orthodox Church, several Churches of the Eastern Orthodox Church Ancient Greece, the ancient civilization before the end of Antiquity Other uses Greek (play), 1980 play by Steven Berkoff Greek (opera), 1988 opera by Mark-Antony Turnage, based on Steven Berkoff's play Greek (TV series) (also stylized GRΣΣK), 2007 ABC Family channel's comedy-drama television series set at a fictitious college's fictional Greek system Greeks (finance), quantities representing the sensitivity of the price of derivatives Greeking, a style of displaying or rendering text or symbols in a computer display or typographic layout Greek-letter organizations (GLOs), social organizations for undergraduate students at North American colleges Greek Theatre (Los Angeles), a theatre located at Griffith Park in Los Angeles, California Greek Revival, an architectural movement of the late 18th and early 19th centuries Greek love, a term referring variously to male bonding, homosexuality, pederasty and anal sex The Greek, a fictional character on the HBO drama The Wire The Greeks (book), a 1951 non-fiction book on classical Greece
by HDF Kitto Greeks, a group of scholars in 16th-century England who were part of the Grammarians' War See also Greeks (disambiguation) Greek dialects (disambiguation) Hellenic (disambiguation) Names of the Greeks, terms for the Greek people Name of Greece, names for the country Greek to me, an idiom for something not understandable Language and nationality
of the above characteristics were not present in Proto-Germanic but developed later as areal features that spread from language to language: Germanic umlaut only affected the North and West Germanic languages (which represent all modern Germanic languages) but not the now-extinct East Germanic languages, such as Gothic, nor Proto-Germanic, the common ancestor of all Germanic languages. The large inventory of vowel qualities is a later development, due to a combination of Germanic umlaut and the tendency in many Germanic languages for pairs of long/short vowels of originally identical quality to develop distinct qualities, with the length distinction sometimes eventually lost. Proto-Germanic had only five distinct vowel qualities, although there were more actual vowel phonemes because length and possibly nasality were phonemic. In modern German, long-short vowel pairs still exist but are also distinct in quality. Proto-Germanic probably had a more general S-O-V-I word order. However, the tendency toward V2 order may have already been present in latent form and may be related to Wackernagel's Law, an Indo-European law dictating that sentence clitics must be placed second. Roughly speaking, Germanic languages differ in how conservative or how progressive each language is with respect to an overall trend toward analyticity. Some, such as Icelandic and, to a lesser extent, German, have preserved much of the complex inflectional morphology inherited from Proto-Germanic (and in turn from Proto-Indo-European). Others, such as English, Swedish, and Afrikaans, have moved toward a largely analytic type. Linguistic developments The subgroupings of the Germanic languages are defined by shared innovations. It is important to distinguish innovations from cases of linguistic conservatism. 
That is, if two languages in a family share a characteristic that is not observed in a third language, that is evidence of common ancestry of the two languages only if the characteristic is an innovation compared to the family's proto-language. The following innovations are common to the Northwest Germanic languages (all but Gothic): The lowering of /u/ to /o/ in initial syllables before /a/ in the following syllable: → bode, Icelandic "messages" ("a-Umlaut", traditionally called Brechung) "Labial umlaut" in unstressed medial syllables (the conversion of /a/ to /u/ and /ō/ to /ū/ before /m/, or /u/ in the following syllable) The conversion of /ē1/ into /ā/ (vs. Gothic /ē/) in stressed syllables. In unstressed syllables, West Germanic also has this change, but North Germanic has shortened the vowel to /e/, then raised it to /i/. This suggests it was an areal change. The raising of final /ō/ to /u/ (Gothic lowers it to /a/). It is kept distinct from the nasal /ǭ/, which is not raised. The monophthongization of /ai/ and /au/ to /ē/ and /ō/ in non-initial syllables (however, evidence for the development of /au/ in medial syllables is lacking). The development of an intensified demonstrative ending in /s/ (reflected in English "this" compared to "the") Introduction of a distinct ablaut grade in Class VII strong verbs, while Gothic uses reduplication (e.g. Gothic haihait; ON, OE hēt, preterite of the Gmc verb *haitan "to be called") as part of a comprehensive reformation of the Gmc Class VII from a reduplicating to a new ablaut pattern, which presumably started in verbs beginning with vowel or /h/ (a development which continues the general trend of de-reduplication in Gmc); there are forms (such as OE dial. heht instead of hēt) which retain traces of reduplication even in West and North Germanic The following innovations are also common to the Northwest Germanic languages but represent areal changes: Proto-Germanic /z/ > /r/ (e.g. 
Gothic dius; ON dȳr, OHG tior, OE dēor, "wild animal"); note that this is not present in Proto-Norse and must be ordered after West Germanic loss of final /z/ Germanic umlaut The following innovations are common to the West Germanic languages: Loss of final /z/. In single-syllable words, Old High German retains it (as /r/), while it disappears in the other West Germanic languages. Change of [ð] (fricative allophone of /d/) to stop [d] in all environments. Change of /lþ/ to stop /ld/ (except word-finally). West Germanic gemination of consonants, except r, before /j/. This only occurred in short-stemmed words due to Sievers' law. Gemination of /p/, /t/, /k/ and /h/ is also observed before liquids. Labiovelar consonants become plain velar when non-initial. A particular type of umlaut /e-u-i/ > /i-u-i/. Changes to the 2nd person singular past-tense: Replacement of the past-singular stem vowel with the past-plural stem vowel, and substitution of the ending -t with -ī. Short forms (*stān, stēn, *gān, gēn) of the verbs for "stand" and "go"; but note that Crimean Gothic also has gēn. The development of a gerund. The following innovations are common to the Ingvaeonic subgroup of the West Germanic languages, which includes English, Frisian, and in a few cases Dutch and Low German, but not High German: The so-called Ingvaeonic nasal spirant law, with loss of /n/ before voiceless fricatives: e.g. *munþ, *gans > Old English mūþ, gōs > "mouth, goose", but German Mund, Gans. The loss of the Germanic reflexive pronoun . Dutch has reclaimed the reflexive pronoun from Middle High German . The reduction of the three Germanic verbal plural forms into one form ending in -þ. The development of Class III weak verbs into a relic class consisting of four verbs (*sagjan "to say", *hugjan "to think", *habjan "to have", *libjan "to live"; cf. the numerous Old High German verbs in -ēn). The split of the Class II weak verb ending *-ō- into *-ō-/-ōja- (cf. 
Old English -ian < -ōjan, but Old High German -ōn). Development of a plural ending *-ōs in a-stem nouns (note, Gothic also has -ōs, but this is an independent development, caused by terminal devoicing of *-ōz; Old Frisian has -ar, which is thought to be a late borrowing from Danish). Cf. modern English plural -(e)s, but German plural -e. Possibly, the monophthongization of Germanic *ai to ē/ā (this may represent independent changes in Old Saxon and Anglo-Frisian). The following innovations are common to the Anglo-Frisian subgroup of the Ingvaeonic languages: Raising of nasalized a, ā into o, ō. Anglo-Frisian brightening: Fronting of non-nasal a, ā to æ,ǣ when not followed by n or m. Metathesis of CrV into CVr, where C represents any consonant and V any vowel. Monophthongization of ai into ā. Common linguistic features Phonology The oldest Germanic languages all share a number of features, which are assumed to be inherited from Proto-Germanic. Phonologically, it includes the important sound changes known as Grimm's Law and Verner's Law, which introduced a large number of fricatives; late Proto-Indo-European had only one, /s/. The main vowel developments are the merging (in most circumstances) of long and short /a/ and /o/, producing short /a/ and long /ō/. That likewise affected the diphthongs, with PIE /ai/ and /oi/ merging into /ai/ and PIE /au/ and /ou/ merging into /au/. PIE /ei/ developed into long /ī/. PIE long /ē/ developed into a vowel denoted as /ē1/ (often assumed to be phonetically ), while a new, fairly uncommon long vowel /ē2/ developed in varied and not completely understood circumstances. Proto-Germanic had no front rounded vowels, but all Germanic languages except for Gothic subsequently developed them through the process of i-umlaut. Proto-Germanic developed a strong stress accent on the first syllable of the root, but remnants of the original free PIE accent are visible due to Verner's Law, which was sensitive to this accent. 
That caused a steady erosion of vowels in unstressed syllables. In Proto-Germanic, that had progressed only to the point that absolutely-final short vowels (other than /i/ and /u/) were lost and absolutely-final long vowels were shortened, but all of the early literary languages show a more advanced state of vowel loss. This ultimately resulted in some languages (like Modern English) losing practically all vowels following the main stress and the consequent rise of a very large number of monosyllabic words. Table of outcomes The following table shows the main outcomes of Proto-Germanic vowels and consonants in the various older languages. For vowels, only the outcomes in stressed syllables are shown. Outcomes in unstressed syllables are quite different, vary from language to language and depend on a number of other factors (such as whether the syllable was medial or final, whether the syllable was open or closed and (in some cases) whether the preceding syllable was light or heavy). Notes: C- means before a vowel (word-initially, or sometimes after a consonant). -C- means between vowels. -C means after a vowel (word-finally or before a consonant). Word-final outcomes generally occurred after deletion of final short vowels, which occurred shortly after Proto-Germanic and is reflected in the history of all written languages except for Proto-Norse. The above three are given in the order C-, -C-, -C. If one is omitted, the previous one applies. For example, f, -[v]- means that [v] occurs after a vowel regardless of what follows. Something like a(…u) means "a if /u/ occurs in the next syllable". Something like a(n) means "a if /n/ immediately follows". Something like (n)a means "a if /n/ immediately precedes". 
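Regular sound changes like those catalogued above apply mechanically to word forms, so they can be modeled as simple string rewrites. As a toy illustration only, the Ingvaeonic nasal spirant law mentioned earlier (loss of /n/ before a voiceless fricative, with lengthening of the preceding vowel) might be sketched as follows; the rule table is a deliberate simplification, not a full statement of the law:

```python
# Toy sketch (illustrative only): the Ingvaeonic nasal spirant law modeled
# as a string rewrite. /n/ is lost before a voiceless fricative and the
# preceding vowel is lengthened; the vowel outcomes below are simplified
# (e.g. a -> ō folds the nasalization step into the lengthening).
FRICATIVES = set("þfs")
LENGTHENED = {"a": "ō", "u": "ū", "i": "ī"}

def nasal_spirant_law(word):
    """Apply the (simplified) nasal spirant law to a word form."""
    out, i = [], 0
    while i < len(word):
        ch = word[i]
        if (ch in LENGTHENED
                and i + 2 < len(word)
                and word[i + 1] == "n"
                and word[i + 2] in FRICATIVES):
            out.append(LENGTHENED[ch])  # lengthen the vowel
            i += 2                      # skip the lost nasal /n/
        else:
            out.append(ch)
            i += 1
    return "".join(out)

# Proto-Germanic *munþ, *gans come out as mūþ, gōs ("mouth", "goose"),
# while a form like "mund" (nasal before a stop) is left unchanged.
```

The same rewrite-rule approach is how such correspondences are checked in practice: a proposed law must apply uniformly across the attested vocabulary, not just to a handful of examples.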
Morphology The oldest Germanic languages have the typical complex inflected morphology of old Indo-European languages, with four or five noun cases; verbs marked for person, number, tense and mood; multiple noun and verb classes; few or no articles; and rather free word order. The old Germanic languages are famous for having only two tenses (present and past), with three PIE past-tense aspects (imperfect, aorist, and perfect/stative) merged into one and no new tenses (future, pluperfect, etc.) developing. There were three moods: indicative, subjunctive (developed from the PIE optative mood) and imperative. Gothic verbs had a number of archaic features inherited from PIE that were lost in the other Germanic languages with few traces, including dual endings, an inflected passive voice (derived from the PIE mediopassive voice), and a class of verbs with reduplication in the past tense (derived from the PIE perfect). The complex tense system of modern English (e.g. In three months, the house will still be being built or If you had not acted so stupidly, we would never have been caught) is almost entirely due to subsequent developments (although paralleled in many of the other Germanic languages). Among the primary innovations in Proto-Germanic are the preterite present verbs, a special set of verbs whose present tense looks like the past tense of other verbs and which is the origin of most modal verbs in English; a past-tense ending (in the so-called "weak verbs", marked with -ed in English) that appears variously as /d/ or /t/, often assumed to be derived from the verb "to do"; and two separate sets of adjective endings, originally corresponding to a distinction between indefinite semantics ("a man", with a combination of PIE adjective and pronoun endings) and definite semantics ("the man", with endings derived from PIE n-stem nouns).
Note that most modern Germanic languages have lost most of the inherited inflectional morphology as a result of the steady attrition of unstressed endings triggered by the strong initial stress. (Contrast, for example, the Balto-Slavic languages, which have largely kept the Indo-European pitch accent and consequently preserved much of the inherited morphology.) Icelandic and to a lesser extent modern German best preserve the Proto–Germanic inflectional system, with four noun cases, three genders, and well-marked verbs. English and Afrikaans are at the other extreme, with almost no remaining inflectional morphology. The following shows a typical masculine a-stem noun, Proto-Germanic *fiskaz ("fish"), and its development in the various old literary languages: Strong vs. weak nouns and adjectives Originally, adjectives in Proto-Indo-European followed the same declensional classes as nouns. The most common class (the o/ā class) used a combination of o-stem endings for masculine and neuter genders and ā-stems ending for feminine genders, but other common classes (e.g. the i class and u class) used endings from a single vowel-stem declension for all genders, and various other classes existed that were based on other declensions. A quite different set of "pronominal" endings was used for pronouns, determiners, and words with related semantics (e.g., "all", "only"). An important innovation in Proto-Germanic was the development of two separate sets of adjective endings, originally corresponding to a distinction between indefinite semantics ("a man") and definite semantics ("the man"). The endings of indefinite adjectives were derived from a combination of pronominal endings with one of the common vowel-stem adjective declensions – usually the o/ā class (often termed the a/ō class in the specific context of the Germanic languages) but sometimes the i or u classes. Definite adjectives, however, had endings based on n-stem nouns. 
Originally both types of adjectives could be used by themselves, but already by Proto-Germanic times a pattern evolved whereby definite adjectives had to be accompanied by a determiner with definite semantics (e.g., a definite article, demonstrative pronoun, possessive pronoun, or the like), while indefinite adjectives were used in other circumstances (either accompanied by a word with indefinite semantics such as "a", "one", or "some" or unaccompanied). In the 19th century, the two types of adjectives – indefinite and definite – were respectively termed "strong" and "weak", names which are still commonly used. These names were based on the appearance of the two sets of endings in modern German. In German, the distinctive case endings formerly present on nouns have largely disappeared, with the result that the load of distinguishing one case from another is almost entirely carried by determiners and adjectives. Furthermore, due to regular sound change, the various definite (n-stem) adjective endings coalesced to the point where only two endings (-e and -en) remain in modern German to express the sixteen possible inflectional categories of the language (masculine/feminine/neuter/plural crossed with nominative/accusative/dative/genitive – modern German merges all genders in
outside of the Germanic languages. English doesn't make extensive use of discourse particles; an example would be the word 'just', which the speaker can use to express surprise. Note that some of the above characteristics were not present in Proto-Germanic but developed later as areal features that spread from language to language: Germanic umlaut only affected the North and West Germanic languages (which represent all modern Germanic languages) but not the now-extinct East Germanic languages, such as Gothic, nor Proto-Germanic, the common ancestor of all Germanic languages. The large inventory of vowel qualities is a later development, due to a combination of Germanic umlaut and the tendency in many Germanic languages for pairs of long/short vowels of originally identical quality to develop distinct qualities, with the length distinction sometimes eventually lost. Proto-Germanic had only five distinct vowel qualities, although there were more actual vowel phonemes because length and possibly nasality were phonemic. In modern German, long-short vowel pairs still exist but are also distinct in quality. Proto-Germanic probably had a more general S-O-V-I word order. However, the tendency toward V2 order may have already been present in latent form and may be related to Wackernagel's Law, an Indo-European law dictating that sentence clitics must be placed second. Roughly speaking, Germanic languages differ in how conservative or how progressive each language is with respect to an overall trend toward analyticity. Some, such as Icelandic and, to a lesser extent, German, have preserved much of the complex inflectional morphology inherited from Proto-Germanic (and in turn from Proto-Indo-European). Others, such as English, Swedish, and Afrikaans, have moved toward a largely analytic type. Linguistic developments The subgroupings of the Germanic languages are defined by shared innovations. It is important to distinguish innovations from cases of linguistic conservatism. 
That is, if two languages in a family share a characteristic that is not observed in a third language, that is evidence of common ancestry of the two languages only if the characteristic is an innovation compared to the family's proto-language. The following innovations are common to the Northwest Germanic languages (all but Gothic): The lowering of /u/ to /o/ in initial syllables before /a/ in the following syllable: → bode, Icelandic "messages" ("a-Umlaut", traditionally called Brechung) "Labial umlaut" in unstressed medial syllables (the conversion of /a/ to /u/ and /ō/ to /ū/ before /m/, or /u/ in the following syllable) The conversion of /ē1/ into /ā/ (vs. Gothic /ē/) in stressed syllables. In unstressed syllables, West Germanic also has this change, but North Germanic has shortened the vowel to /e/, then raised it to /i/. This suggests it was an areal change. The raising of final /ō/ to /u/ (Gothic lowers it to /a/). It is kept distinct from the nasal /ǭ/, which is not raised. The monophthongization of /ai/ and /au/ to /ē/ and /ō/ in non-initial syllables (however, evidence for the development of /au/ in medial syllables is lacking). The development of an intensified demonstrative ending in /s/ (reflected in English "this" compared to "the") Introduction of a distinct ablaut grade in Class VII strong verbs, while Gothic uses reduplication (e.g. Gothic haihait; ON, OE hēt, preterite of the Gmc verb *haitan "to be called") as part of a comprehensive reformation of the Gmc Class VII from a reduplicating to a new ablaut pattern, which presumably started in verbs beginning with vowel or /h/ (a development which continues the general trend of de-reduplication in Gmc); there are forms (such as OE dial. heht instead of hēt) which retain traces of reduplication even in West and North Germanic The following innovations are also common to the Northwest Germanic languages but represent areal changes: Proto-Germanic /z/ > /r/ (e.g. 
Gothic dius; ON dȳr, OHG tior, OE dēor, "wild animal"); note that this is not present in Proto-Norse and must be ordered after West Germanic loss of final /z/ Germanic umlaut The following innovations are common to the West Germanic languages: Loss of final /z/. In single-syllable words, Old High German retains it (as /r/), while it disappears in the other West Germanic languages. Change of [ð] (fricative allophone of /d/) to stop [d] in all environments. Change of /lþ/ to stop /ld/ (except word-finally). West Germanic gemination of consonants, except r, before /j/. This only occurred in short-stemmed words due to Sievers' law. Gemination of /p/, /t/, /k/ and /h/ is also observed before liquids. Labiovelar consonants become plain velar when non-initial. A particular type of umlaut /e-u-i/ > /i-u-i/. Changes to the 2nd person singular past-tense: Replacement of the past-singular stem vowel with the past-plural stem vowel, and substitution of the ending -t with -ī. Short forms (*stān, stēn, *gān, gēn) of the verbs for "stand" and "go"; but note that Crimean Gothic also has gēn. The development of a gerund. The following innovations are common to the Ingvaeonic subgroup of the West Germanic languages, which includes English, Frisian, and in a few cases Dutch and Low German, but not High German: The so-called Ingvaeonic nasal spirant law, with loss of /n/ before voiceless fricatives: e.g. *munþ, *gans > Old English mūþ, gōs > "mouth, goose", but German Mund, Gans. The loss of the Germanic reflexive pronoun . Dutch has reclaimed the reflexive pronoun from Middle High German . The reduction of the three Germanic verbal plural forms into one form ending in -þ. The development of Class III weak verbs into a relic class consisting of four verbs (*sagjan "to say", *hugjan "to think", *habjan "to have", *libjan "to live"; cf. the numerous Old High German verbs in -ēn). The split of the Class II weak verb ending *-ō- into *-ō-/-ōja- (cf. 
Old English -ian < -ōjan, but Old High German -ōn). Development of a plural ending *-ōs in a-stem nouns (note, Gothic also has -ōs, but this is an independent development, caused by terminal devoicing of *-ōz; Old Frisian has -ar, which is thought to be a late borrowing from Danish). Cf. modern English plural -(e)s, but German plural -e. Possibly, the monophthongization of Germanic *ai to ē/ā (this may represent independent changes in Old Saxon and Anglo-Frisian). The following innovations are common to the Anglo-Frisian subgroup of the Ingvaeonic languages: Raising of nasalized a, ā into o, ō. Anglo-Frisian brightening: Fronting of non-nasal a, ā to æ,ǣ when not followed by n or m. Metathesis of CrV into CVr, where C represents any consonant and V any vowel. Monophthongization of ai into ā. Common linguistic features Phonology The oldest Germanic languages all share a number of features, which are assumed to be inherited from Proto-Germanic. Phonologically, it includes the important sound changes known as Grimm's Law and Verner's Law, which introduced a large number of fricatives; late Proto-Indo-European had only one, /s/. The main vowel developments are the merging (in most circumstances) of long and short /a/ and /o/, producing short /a/ and long /ō/. That likewise affected the diphthongs, with PIE /ai/ and /oi/ merging into /ai/ and PIE /au/ and /ou/ merging into /au/. PIE /ei/ developed into long /ī/. PIE long /ē/ developed into a vowel denoted as /ē1/ (often assumed to be phonetically ), while a new, fairly uncommon long vowel /ē2/ developed in varied and not completely understood circumstances. Proto-Germanic had no front rounded vowels, but all Germanic languages except for Gothic subsequently developed them through the process of i-umlaut. Proto-Germanic developed a strong stress accent on the first syllable of the root, but remnants of the original free PIE accent are visible due to Verner's Law, which was sensitive to this accent. 
That caused a steady erosion of vowels in unstressed syllables. In Proto-Germanic, that had progressed only to the point that absolutely-final short vowels (other than /i/ and /u/) were lost and absolutely-final long vowels were shortened, but all of the early literary languages show a more advanced state of vowel loss. This ultimately resulted in some languages (like Modern English) losing practically all vowels following the main stress and the consequent rise of a very large number of monosyllabic words. Table of outcomes The following table shows the main outcomes of Proto-Germanic vowels and consonants in the various older languages. For vowels, only the outcomes in stressed syllables are shown. Outcomes in unstressed syllables are quite different, vary from language to language and depend on a number of other factors (such as whether the syllable was medial or final, whether the syllable was open or closed and (in some cases) whether the preceding syllable was light or heavy). Notes: C- means before a vowel (word-initially, or sometimes after a consonant). -C- means between vowels. -C means after a vowel (word-finally or before a consonant). Word-final outcomes generally occurred after deletion of final short vowels, which occurred shortly after Proto-Germanic and is reflected in the history of all written languages except for Proto-Norse. The above three are given in the order C-, -C-, -C. If one is omitted, the previous one applies. For example, f, -[v]- means that [v] occurs after a vowel regardless of what follows. Something like a(…u) means "a if /u/ occurs in the next syllable". Something like a(n) means "a if /n/ immediately follows". Something like (n)a means "a if /n/ immediately precedes". 
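The fill-forward convention stated in these notes (outcomes are listed in the order C-, -C-, -C, and an omitted position inherits the previous entry) can be sketched as a small helper. The function name and output labels are illustrative, not part of the source:

```python
def expand_outcome(spec: str) -> dict:
    """Expand an outcome spec given in the order C-, -C-, -C.

    Entries are comma-separated; per the convention above, if a later
    position is omitted, the previous entry applies to it as well.
    """
    parts = [p.strip() for p in spec.split(",")]
    while len(parts) < 3:           # fill forward: omitted -> previous applies
        parts.append(parts[-1])
    labels = ["C- (before vowel)", "-C- (between vowels)", "-C (after vowel)"]
    return dict(zip(labels, parts))

# "f, -[v]-" means: f initially, and [v] after a vowel regardless of what follows
print(expand_outcome("f, -[v]-"))
```

Applied to the example in the notes, the omitted third slot correctly inherits the medial outcome.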
Morphology The oldest Germanic languages have the typical complex inflected morphology of old Indo-European languages, with four or five noun cases; verbs marked for person, number, tense and mood; multiple noun and verb classes; few or no articles; and rather free word order. The old Germanic languages are famous for having only two tenses (present and past), with three PIE past-tense aspects (imperfect, aorist, and perfect/stative) merged into one and no new tenses (future, pluperfect, etc.) developing. There were three moods: indicative, subjunctive (developed from the PIE optative mood) and imperative. Gothic verbs had a number of archaic features inherited from PIE that were lost in the other Germanic languages with few traces, including dual endings, an inflected passive voice (derived from the PIE mediopassive voice), and a class of verbs with reduplication in the past tense (derived from the PIE perfect). The complex tense system of modern English (e.g. In three months, the house will still be being built or If you had not acted so stupidly, we would never have been caught) is almost entirely due to subsequent developments (although paralleled in many of the other Germanic languages). Among the primary innovations in Proto-Germanic are the preterite present verbs, a special set of verbs whose present tense looks like the past tense of other verbs and which is the origin of most modal verbs in English; a past-tense ending (in the so-called "weak verbs", marked with -ed in English) that appears variously as /d/ or /t/, often assumed to be derived from the verb "to do"; and two separate sets of adjective endings, originally corresponding to a distinction between indefinite semantics ("a man", with a combination of PIE adjective and pronoun endings) and definite semantics ("the man", with endings derived from PIE n-stem nouns).
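The two past-tense strategies just described, inherited ablaut in strong verbs versus the innovated dental suffix in weak verbs, can be illustrated with a toy sketch using modern English reflexes. The verb lists and helper are illustrative only, not a real morphological analyzer:

```python
# Strong verbs mark the past by ablaut (an inherited vowel alternation);
# only a tiny illustrative sample is listed here.
STRONG_PAST = {"sing": "sang", "drive": "drove", "give": "gave"}

def past_tense(verb: str) -> str:
    """Return a past form: ablaut if the verb is (listed as) strong,
    otherwise the weak dental suffix (Proto-Germanic */d/~/t/, English -ed)."""
    if verb in STRONG_PAST:
        return STRONG_PAST[verb]
    return verb + ("d" if verb.endswith("e") else "ed")

print(past_tense("sing"))   # ablaut: sang
print(past_tense("walk"))   # dental suffix: walked
```

The weak pattern is the productive one, which is why newly coined English verbs take -ed rather than vowel alternation.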
Note that most modern Germanic languages have lost most of the inherited inflectional morphology as a result of the steady attrition of unstressed endings triggered by the strong initial stress. (Contrast, for example, the Balto-Slavic languages, which have largely kept the Indo-European pitch accent and consequently preserved much of the inherited morphology.) Icelandic and to a lesser extent modern German best preserve the Proto–Germanic inflectional system, with four noun cases, three genders, and well-marked verbs. English and Afrikaans are at the other extreme, with almost no remaining inflectional morphology. The following shows a typical masculine a-stem noun, Proto-Germanic *fiskaz ("fish"), and its development in the various old literary languages: Strong vs. weak nouns and adjectives Originally, adjectives in Proto-Indo-European followed the same declensional classes as nouns. The most common class (the o/ā class) used a combination of o-stem endings for masculine and neuter genders and ā-stems ending for feminine genders, but other common classes (e.g. the i class and u class) used endings from a single vowel-stem declension for all genders, and various other classes existed that were based on other declensions. A quite different set of "pronominal" endings was used for pronouns, determiners, and words with related semantics (e.g., "all", "only"). An important innovation in Proto-Germanic was the development of two separate sets of adjective endings, originally corresponding to a distinction between indefinite semantics ("a man") and definite semantics ("the man"). The endings of indefinite adjectives were derived from a combination of pronominal endings with one of the common vowel-stem adjective declensions – usually the o/ā class (often termed the a/ō class in the specific context of the Germanic languages) but sometimes the i or u classes. Definite adjectives, however, had endings based on n-stem nouns. 
Originally both types of adjectives could be used by themselves, but already by Proto-Germanic times a pattern evolved whereby definite adjectives had to be accompanied by a determiner with definite semantics (e.g., a definite article, demonstrative pronoun, possessive pronoun, or the like), while indefinite adjectives were used in other circumstances (either accompanied by a word with indefinite semantics such as "a", "one", or "some" or unaccompanied). In the 19th century, the two types of adjectives – indefinite and definite – were respectively termed "strong" and "weak", names which are still commonly used. These names were based on the appearance of the two sets of endings in modern German. In German, the distinctive case endings formerly present on nouns have largely disappeared, with the result that the load of distinguishing one case from another is almost entirely carried by determiners and adjectives. Furthermore, due to regular sound change, the various definite (n-stem) adjective endings coalesced to the point where only two endings (-e and -en) remain in modern German to express the sixteen possible inflectional categories of the language (masculine/feminine/neuter/plural crossed with nominative/accusative/dative/genitive – modern German merges all genders in the plural). The indefinite (a/ō-stem) adjective endings were less affected by sound change, with six endings remaining (-, -e, -es, -er, -em, -en), cleverly distributed in a way that is capable of expressing the various inflectional categories without too much ambiguity. As a result, the definite endings were thought of as too "weak" to carry inflectional meaning and in need of "strengthening" by the presence of an accompanying determiner, while the indefinite endings were viewed as "strong" enough to indicate the inflectional categories even when standing alone. 
(This view is enhanced by the fact that modern German largely uses weak-ending adjectives when accompanying an indefinite article, and hence the indefinite/definite distinction no longer clearly applies.) By analogy, the terms "strong" and "weak" were extended to the corresponding noun classes, with a-stem and ō-stem nouns termed "strong" and n-stem nouns termed "weak". However, in Proto-Germanic – and still in Gothic, the most conservative Germanic language – the terms "strong" and "weak" are not clearly appropriate. For one thing, there were a large number of noun declensions. The a-stem, ō-stem, and n-stem declensions were the most common and represented targets into which the other declensions were eventually absorbed, but this process occurred only gradually. Originally the n-stem declension was not a single declension but a set of separate declensions (e.g., -an, -ōn, -īn) with related endings, and these endings were in no way any "weaker" than the endings of any other declensions. (For example, among the eight possible inflectional categories of a noun — singular/plural crossed with nominative/accusative/dative/genitive — masculine an-stem nouns in Gothic include seven endings, and feminine ōn-stem nouns include six endings, meaning there is very little ambiguity, or "weakness", in these endings and in fact much less than in the German "strong" endings.) Although it is possible to group the various noun declensions into three basic categories — vowel-stem, n-stem, and other-consonant-stem (a.k.a. "minor declensions") — the vowel-stem nouns do not display any sort of unity in their endings that supports grouping them together with each other but separate from the n-stem endings. It is only in later languages that the binary distinction between "strong" and "weak" nouns becomes more relevant.
In Old English, the n-stem nouns form a single, clear class, but the masculine a-stem and feminine ō-stem nouns have little in common with each other, and neither has much similarity to the small class of u-stem nouns. Similarly, in Old Norse, the masculine a-stem and feminine ō-stem nouns have little in common with each other, and the continuations of the masculine an-stem and feminine ōn/īn-stem nouns are also quite distinct. It is only in Middle Dutch and modern German that the various vowel-stem nouns have merged to the point that a binary strong/weak distinction clearly applies. As a result, newer grammatical descriptions of the Germanic languages often avoid the terms "strong" and "weak" except in conjunction with German itself, preferring instead to use the terms "indefinite" and "definite" for adjectives and
2020, approximately 15.4 million people were enrolled in learning German across all levels of education worldwide. This number has decreased from a peak of 20.1 million in 2000. Within the EU, not counting countries where it is an official language, German as a foreign language is most popular in Eastern and Northern Europe, namely the Czech Republic, Croatia, Denmark, the Netherlands, Slovakia, Hungary, Slovenia, Sweden, Poland, and Bosnia and Herzegovina. German was once, and to some extent still is, a lingua franca in those parts of Europe. Standard High German The basis of Standard High German developed with the Luther Bible and the chancery language spoken by the Saxon court. However, there are places where the traditional regional dialects have been replaced by new vernaculars based on Standard High German; that is the case in large stretches of Northern Germany but also in major cities in other parts of the country. It is important to note, however, that the colloquial Standard High German differs from the formal written language, especially in grammar and syntax, in which it has been influenced by dialectal speech. Standard High German differs regionally among German-speaking countries in vocabulary and some instances of pronunciation and even grammar and orthography. This variation must not be confused with the variation of local dialects. Even though the regional varieties of Standard High German are only somewhat influenced by the local dialects, they are very distinct. Standard High German is thus considered a pluricentric language. In most regions, the speakers use a continuum from more dialectal varieties to more standard varieties depending on the circumstances. Varieties In German linguistics, German dialects are distinguished from varieties of Standard High German. The varieties of Standard High German refer to the different local varieties of the pluricentric Standard High German. They differ only slightly in lexicon and phonology. 
In certain regions, they have replaced the traditional German dialects, especially in Northern Germany. German Standard German Austrian Standard German Swiss Standard German In the German-speaking parts of Switzerland, mixtures of dialect and standard are very seldom used, and the use of Standard High German is largely restricted to the written language. About 11% of the Swiss residents speak Standard High German at home, but this is mainly due to German immigrants. This situation has been called a medial diglossia. Swiss Standard German is used in the Swiss education system, while Austrian German is officially used in the Austrian education system. Dialects The German dialects are the traditional local varieties of the language; many of them are not mutually intelligible with standard German, and they have great differences in lexicon, phonology, and syntax. If a narrow definition of language based on mutual intelligibility is used, many German dialects are considered to be separate languages (for instance in the Ethnologue). However, such a point of view is unusual in German linguistics. The German dialect continuum is traditionally divided most broadly into High German and Low German, also called Low Saxon. However, historically, High German dialects and Low Saxon/Low German dialects do not belong to the same language. Nevertheless, in today's Germany, Low Saxon/Low German is often perceived as a dialectal variation of Standard German on a functional level even by many native speakers. The variation among the German dialects is considerable, with often only neighbouring dialects being mutually intelligible. Some dialects are not intelligible to people who know only Standard German. However, all German dialects belong to the dialect continuum of High German and Low Saxon. Low German or Low Saxon Middle Low German was the lingua franca of the Hanseatic League. It was the predominant language in Northern Germany until the 16th century.
In 1534, the Luther Bible was published. It aimed to be understandable to a broad audience and was based mainly on Central and Upper German varieties. The Early New High German language gained more prestige than Low German and became the language of science and literature. Around the same time, the Hanseatic League, a confederation of northern ports, lost its importance as new trade routes to Asia and the Americas were established, and the most powerful German states of that period were located in Middle and Southern Germany. The 18th and 19th centuries were marked by mass education in Standard German in schools. Gradually, Low German came to be politically viewed as a mere dialect spoken by the uneducated. The proportion of the population who can understand and speak it has decreased continuously since World War II. The major cities in the Low German area are Hamburg, Hanover, Bremen and Dortmund. Sometimes, Low Saxon and Low Franconian varieties are grouped together because both are unaffected by the High German consonant shift. Low Franconian In Germany, Low Franconian dialects are spoken in the northwest of North Rhine-Westphalia, along the Lower Rhine. The Low Franconian dialects spoken in Germany are referred to as Low Rhenish. In the north of the German Low Franconian language area, North Low Franconian dialects (also referred to as Cleverlands or as dialects of South Guelderish) are spoken. The South Low Franconian and Bergish dialects, which are spoken in the south of the German Low Franconian language area, are transitional dialects between Low Franconian and Ripuarian dialects. The Low Franconian dialects fall within a linguistic category used to classify a number of historical and contemporary West Germanic varieties most closely related to, and including, the Dutch language. Consequently, the vast majority of the Low Franconian dialects are spoken outside of the German language area, in the Netherlands and Belgium. 
During the Middle Ages and Early Modern Period, the Low Franconian dialects now spoken in Germany used Middle Dutch or Early Modern Dutch as their literary language and Dachsprache. Following a 19th-century change in Prussian language policy, use of Dutch as an official and public language was forbidden, resulting in Standard German taking its place as the region's official language. As a result, these dialects are now considered German dialects from a socio-linguistic point of view. Nevertheless, typologically these dialects are structurally and phonologically far more similar to Dutch than to German, and form both the smallest and most divergent dialect cluster within the contemporary German language area. High German The High German dialects consist of the Central German, High Franconian and Upper German dialects. The High Franconian dialects are transitional dialects between Central and Upper German. The High German varieties spoken by the Ashkenazi Jews have several unique features and are considered as a separate language, Yiddish, written with the Hebrew alphabet. Central German The Central German dialects are spoken in Central Germany, from Aachen in the west to Görlitz in the east. They consist of Franconian dialects in the west (West Central German) and non-Franconian dialects in the east (East Central German). Modern Standard German is mostly based on Central German dialects. The Franconian, West Central German dialects are the Central Franconian dialects (Ripuarian and Moselle Franconian) and the Rhine Franconian dialects (Hessian and Palatine). These dialects are considered as German in Germany and Belgium, as Luxembourgish in Luxembourg, as Lorraine Franconian (spoken in Moselle) and as a Rhine Franconian variant of Alsatian (spoken in Alsace bossue only) in France, and as Limburgish or Kerkrade dialect in the Netherlands. Luxembourgish as well as the Transylvanian Saxon dialect spoken in Transylvania are based on Moselle Franconian dialects.
The major cities in the Franconian Central German area are Cologne and Frankfurt. Further east, the non-Franconian, East Central German dialects are spoken (Thuringian, Upper Saxon and North Upper Saxon-South Markish, and earlier, in the then German-speaking parts of Silesia also Silesian, and in then German southern East Prussia also High Prussian). The major cities in the East Central German area are Berlin and Leipzig. High Franconian The High Franconian dialects are transitional dialects between Central and Upper German. They consist of the East and South Franconian dialects. The East Franconian dialect branch is one of the most spoken dialect branches in Germany. These dialects are spoken in the region of Franconia and in the central parts of Saxon Vogtland. Franconia consists of the Bavarian districts of Upper, Middle, and Lower Franconia, the region of South Thuringia (Thuringia), and the eastern parts of the region of Heilbronn-Franken (Tauber Franconia and Hohenlohe) in Baden-Württemberg. The major cities in the East Franconian area are Nuremberg and Würzburg. South Franconian is mainly spoken in northern Baden-Württemberg in Germany, but also in the northeasternmost part of the region of Alsace in France. In Baden-Württemberg, they are considered as dialects of German. The major cities in the South Franconian area are Karlsruhe and Heilbronn. Upper German The Upper German dialects are the Alemannic and Swabian dialects in the west and the Bavarian dialects in the east. 
Alemannic and Swabian Alemannic dialects are spoken in Switzerland (High Alemannic in the densely populated Swiss Plateau, in the south also Highest Alemannic, and Low Alemannic in Basel), Baden-Württemberg (Swabian and Low Alemannic, in the southwest also High Alemannic), Bavarian Swabia (Swabian, in the southwesternmost part also Low Alemannic), Vorarlberg (Low, High, and Highest Alemannic), Alsace (Low Alemannic, in the southernmost part also High Alemannic), Liechtenstein (High and Highest Alemannic), and in the Tyrolean district of Reutte (Swabian). The Alemannic dialects are considered as Alsatian in Alsace. The major cities in the Alemannic area are Stuttgart, Freiburg, Basel, Zürich, Lucerne and Bern. Bavarian Bavarian dialects are spoken in Austria (Vienna, Lower and Upper Austria, Styria, Carinthia, Salzburg, Burgenland, and in most parts of Tyrol), Bavaria (Upper and Lower Bavaria as well as Upper Palatinate), South Tyrol, southwesternmost Saxony (Southern Vogtländisch), and in the Swiss village of Samnaun. The major cities in the Bavarian area are Vienna, Munich, Salzburg, Regensburg, Graz and Bolzano. Regiolects Berlinian, the High German regiolect or dialect of Berlin with Low German substrate. Missingsch, a Low-German-coloured variety of High German. Ruhrdeutsch (Ruhr German), the High German regiolect of the Ruhr area. Grammar German is a fusional language with a moderate degree of inflection, with three grammatical genders; as such, there can be a large number of words derived from the same root. Noun inflection German nouns inflect by case, gender, and number: four cases: nominative, accusative, genitive, and dative. three genders: masculine, feminine, and neuter. Word endings sometimes reveal grammatical gender: for instance, nouns ending in -ung (-ing), -schaft (-ship), or -heit/-keit (-hood, -ness) are feminine, nouns ending in -chen or -lein (diminutive forms) are neuter and nouns ending in -ismus (-ism) are masculine.
Others are more variable, sometimes depending on the region in which the language is spoken. And some endings are not restricted to one gender, for example: -er, such as Feier (feminine), celebration, party; Arbeiter (masculine), labourer; and Gewitter (neuter), thunderstorm. two numbers: singular and plural. This degree of inflection is considerably less than in Old High German and other old Indo-European languages such as Latin, Ancient Greek, and Sanskrit, and it is also somewhat less than, for instance, Old English, modern Icelandic, or Russian. The three genders have collapsed in the plural. With four cases and three genders plus plural, there are 16 permutations of case and gender/number of the article (not the nouns), but there are only six forms of the definite article, which together cover all 16 permutations. In nouns, inflection for case is required in the singular for strong masculine and neuter nouns only in the genitive and in the dative (only in fixed or archaic expressions), and even this is losing ground to substitutes in informal speech. Weak masculine nouns share a common case ending for genitive, dative, and accusative in the singular. Feminine nouns are not declined in the singular. The plural has an inflection for the dative. In total, seven inflectional endings (not counting plural markers) exist in German: -s, -es, -n, -ns, -en, -ens, -e. Like the other Germanic languages, German forms noun compounds in which the first noun modifies the category given by the second: Hundehütte ("dog hut"; specifically: "dog kennel"). Unlike English, whose newer compounds or combinations of longer nouns are often written "open" with separating spaces, German (like some other Germanic languages) nearly always uses the "closed" form without spaces, for example: Baumhaus ("tree house"). Like English, German allows arbitrarily long compounds in theory (see also English compounds).
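The earlier claim that six definite-article forms cover all 16 case and gender/number permutations can be checked directly. The table below lists the standard German forms; the dictionary layout and key names are of course illustrative:

```python
# Definite article by (case, gender/number) in standard German.
ARTICLES = {
    ("nominative", "masc"): "der", ("nominative", "fem"): "die",
    ("nominative", "neut"): "das", ("nominative", "plural"): "die",
    ("accusative", "masc"): "den", ("accusative", "fem"): "die",
    ("accusative", "neut"): "das", ("accusative", "plural"): "die",
    ("dative", "masc"): "dem",     ("dative", "fem"): "der",
    ("dative", "neut"): "dem",     ("dative", "plural"): "den",
    ("genitive", "masc"): "des",   ("genitive", "fem"): "der",
    ("genitive", "neut"): "des",   ("genitive", "plural"): "der",
}

assert len(ARTICLES) == 16               # 4 cases x (3 genders + plural)
assert len(set(ARTICLES.values())) == 6  # der, die, das, den, dem, des
```

Note how heavily the six forms are reused (e.g. "der" serves as masculine nominative, feminine dative, feminine genitive, and genitive plural), which is why case marking in modern German leans on the article rather than the noun.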
The longest German word verified to be actually in (albeit very limited) use is Rindfleischetikettierungsüberwachungsaufgabenübertragungsgesetz, which, literally translated, is "beef labelling supervision duties assignment law" [from Rind (cattle), Fleisch (meat), Etikettierung (labelling), Überwachung (supervision), Aufgaben (duties), Übertragung (assignment), Gesetz (law)]. However, examples like this are perceived by native speakers as excessively bureaucratic, stylistically awkward, or even satirical. Verb inflection The inflection of standard German verbs includes: two main conjugation classes: weak and strong (as in English). Additionally, there is a third class, known as mixed verbs, whose conjugation combines features of both the strong and weak patterns. three persons: first, second and third. two numbers: singular and plural. three moods: indicative, imperative and subjunctive (in addition to infinitive). two voices: active and passive. The passive voice uses auxiliary verbs and is divisible into static and dynamic. Static forms show a constant state and use the verb "to be" (sein). Dynamic forms show an action and use the verb "to become" (werden). two tenses without auxiliary verbs (present and preterite) and four tenses constructed with auxiliary verbs (perfect, pluperfect, future and future perfect). the distinction between grammatical aspects is rendered by combined use of the subjunctive or preterite marking, so the plain indicative voice uses neither of those two markers; the subjunctive by itself often conveys reported speech; subjunctive plus preterite marks the conditional state; and the preterite alone shows either plain indicative (in the past), or functions as a (literal) alternative for either reported speech or the conditional state of the verb, when necessary for clarity. the distinction between perfect and progressive aspect is, and has at every stage of development been, a productive category of the older language and in nearly all documented dialects, but strangely enough it is now rigorously excluded from written usage in its present normalised form.
disambiguation of completed vs. uncompleted forms is widely observed and regularly generated by common prefixes ( [to look], [to see – unrelated form: ]). Verb prefixes The meaning of basic verbs can be expanded and sometimes radically changed through the use of a number of prefixes. Some prefixes have a specific meaning; the prefix refers to destruction, as in (to tear apart), (to break apart), (to cut apart). Other prefixes have only the vaguest meaning in themselves; is found in a number of verbs with a large variety of meanings, as in (to try) from (to seek), (to interrogate) from (to take), (to distribute) from (to share), (to understand) from (to stand). Other examples include the following: (to stick), (to detain); (to buy), (to sell); (to hear), (to cease); (to drive), (to experience). Many German verbs have a separable prefix, often with an adverbial function. In finite verb forms, it is split off and moved to the end of the clause and is hence considered by some to be a "resultative particle". For example, , meaning "to go along", would be split, giving (Literal: "Go you with?"; Idiomatic: "Are you going along?"). Indeed, several parenthetical clauses may occur between the prefix of a finite verb and its complement (ankommen = to arrive, er kam an = he arrived, er ist angekommen = he has arrived): A selectively literal translation of this example to illustrate the point might look like this: He "came" on Friday evening, after a hard day at work and the usual annoyances that had time and again been troubling him for years now at his workplace, with questionable joy, to a meal which, as he hoped, his wife had already put on the table, finally home "to". Word order German word order is generally subject to the V2 (verb-second) restriction in main clauses and to the SOV restriction in subordinate clauses. For yes-no questions, exclamations, and wishes, the finite verb always has the first position. In subordinate clauses, the verb occurs at the very end.
German requires a verbal element (main verb or auxiliary verb) to appear second in the sentence. The verb is preceded by the topic of the sentence. The element in focus appears at the end of the sentence. For a sentence without an auxiliary, there are several possibilities: (The old man gave me yesterday the book; normal order) (The book gave [to] me yesterday the old man) (The book gave the old man [to] me yesterday) (The book gave [to] me the old man yesterday) (Yesterday gave [to] me the old man the book, normal order) ([To] me gave the old man the book yesterday (entailing: as for someone else, it was another date)) The position of a noun in a German sentence has no bearing on its being a subject, an object or another argument. In a declarative sentence in English, if the subject does not occur before the predicate, the sentence could well be misunderstood. However, German's flexible word order allows one to emphasise specific words: Normal word order: The manager entered yesterday at 10 o'clock with an umbrella in the hand his office. Object in front: His office entered the manager yesterday at 10 o'clock with an umbrella in the hand. The object (his office) is thus highlighted; it could be the topic of the next sentence. Adverb of time in front: Yesterday entered the manager at 10 o'clock with an umbrella in the hand his office. (but today without umbrella) Both time expressions in front: . Yesterday at 10 o'clock entered the manager with an umbrella in the hand his office. The full time specification is highlighted. Another possibility: . Yesterday at 10 o'clock entered the manager his office with an umbrella in the hand. Both the time specification and the fact
dialect known as Barossa German developed, spoken predominantly in the Barossa Valley near Adelaide. Usage of German sharply declined with the advent of World War I, due to the prevailing anti-German sentiment in the population and related government action. It continued to be used as a first language into the 20th century, but its use is now limited to a few older speakers. As of the 2013 census, 36,642 people in New Zealand spoke German, mostly descendants of a small wave of 19th century German immigrants, making it the third most spoken European language after English and French and overall the ninth most spoken language. A German creole named was historically spoken in the former German colony of German New Guinea, modern day Papua New Guinea. It is at a high risk of extinction, with only about 100 speakers remaining, and is a topic of interest among linguists seeking to revive interest in the language. As a foreign language Like English, French, and Spanish, German has become a standard foreign language throughout the world, especially in the Western World. German ranks second on par with French among the best known foreign languages in the European Union (EU) after English, as well as in Russia and Turkey. In terms of student numbers across all levels of education, German ranks third in the EU (after English and French) and in the United States (after Spanish and French). In 2020, approximately 15.4 million people were enrolled in learning German across all levels of education worldwide. This number has decreased from a peak of 20.1 million in 2000. Within the EU, not counting countries where it is an official language, German as a foreign language is most popular in Eastern and Northern Europe, namely the Czech Republic, Croatia, Denmark, the Netherlands, Slovakia, Hungary, Slovenia, Sweden, Poland, and Bosnia and Herzegovina. German was once, and to some extent still is, a lingua franca in those parts of Europe.
Standard High German The basis of Standard High German developed with the Luther Bible and the chancery language spoken by the Saxon court. However, there are places where the traditional regional dialects have been replaced by new vernaculars based on Standard High German; that is the case in large stretches of Northern Germany but also in major cities in other parts of the country. It is important to note, however, that the colloquial Standard High German differs from the formal written language, especially in grammar and syntax, in which it has been influenced by dialectal speech. Standard High German differs regionally among German-speaking countries in vocabulary and some instances of pronunciation and even grammar and orthography. This variation must not be confused with the variation of local dialects. Even though the regional varieties of Standard High German are only somewhat influenced by the local dialects, they are very distinct. Standard High German is thus considered a pluricentric language. In most regions, the speakers use a continuum from more dialectal varieties to more standard varieties depending on the circumstances. Varieties In German linguistics, German dialects are distinguished from varieties of Standard High German. The varieties of Standard High German refer to the different local varieties of the pluricentric Standard High German. They differ only slightly in lexicon and phonology. In certain regions, they have replaced the traditional German dialects, especially in Northern Germany. German Standard German Austrian Standard German Swiss Standard German In the German-speaking parts of Switzerland, mixtures of dialect and standard are very seldom used, and the use of Standard High German is largely restricted to the written language. About 11% of the Swiss residents speak Standard High German at home, but this is mainly due to German immigrants. This situation has been called a medial diglossia. 
Swiss Standard German is used in the Swiss education system, while Austrian German is officially used in the Austrian education system. Dialects The German dialects are the traditional local varieties of the language; many of them are not mutually intelligible with Standard German, and they have great differences in lexicon, phonology, and syntax. If a narrow definition of language based on mutual intelligibility is used, many German dialects are considered to be separate languages (for instance in the Ethnologue). However, such a point of view is unusual in German linguistics. The German dialect continuum is traditionally divided most broadly into High German and Low German, also called Low Saxon. However, historically, High German dialects and Low Saxon/Low German dialects do not belong to the same language. Nevertheless, in today's Germany, Low Saxon/Low German is often perceived as a dialectal variation of Standard German on a functional level even by many native speakers. The variation among the German dialects is considerable, with often only neighbouring dialects being mutually intelligible. Some dialects are not intelligible to people who know only Standard German. However, all German dialects belong to the dialect continuum of High German and Low Saxon. Low German or Low Saxon Middle Low German was the lingua franca of the Hanseatic League. It was the predominant language in Northern Germany until the 16th century. In 1534, the Luther Bible was published. It aimed to be understandable to a broad audience and was based mainly on Central and Upper German varieties. The Early New High German language gained more prestige than Low German and became the language of science and literature. Around the same time, the Hanseatic League, a confederation of northern ports, lost its importance as new trade routes to Asia and the Americas were established, and the most powerful German states of that period were located in Middle and Southern Germany.
The 18th and 19th centuries were marked by mass education in Standard German in schools. Gradually, Low German came to be politically viewed as a mere dialect spoken by the uneducated. The proportion of the population who can understand and speak it has decreased continuously since World War II. The major cities in the Low German area are Hamburg, Hanover, Bremen and Dortmund. Sometimes, Low Saxon and Low Franconian varieties are grouped together because both are unaffected by the High German consonant shift. Low Franconian In Germany, Low Franconian dialects are spoken in the northwest of North Rhine-Westphalia, along the Lower Rhine. The Low Franconian dialects spoken in Germany are referred to as Low Rhenish. In the north of the German Low Franconian language area, North Low Franconian dialects (also referred to as Cleverlands or as dialects of South Guelderish) are spoken. The South Low Franconian and Bergish dialects, which are spoken in the south of the German Low Franconian language area, are transitional dialects between Low Franconian and Ripuarian dialects. The Low Franconian dialects fall within a linguistic category used to classify a number of historical and contemporary West Germanic varieties most closely related to, and including, the Dutch language. Consequently, the vast majority of the Low Franconian dialects are spoken outside of the German language area, in the Netherlands and Belgium. During the Middle Ages and Early Modern Period, the Low Franconian dialects now spoken in Germany used Middle Dutch or Early Modern Dutch as their literary language and Dachsprache. Following a 19th-century change in Prussian language policy, use of Dutch as an official and public language was forbidden, resulting in Standard German taking its place as the region's official language. As a result, these dialects are now considered German dialects from a socio-linguistic point of view.
Nevertheless, typologically these dialects are structurally and phonologically far more similar to Dutch than to German, and form both the smallest and most divergent dialect cluster within the contemporary German language area. High German The High German dialects consist of the Central German, High Franconian and Upper German dialects. The High Franconian dialects are transitional dialects between Central and Upper German. The High German varieties spoken by the Ashkenazi Jews have several unique features and are considered a separate language, Yiddish, written with the Hebrew alphabet. Central German The Central German dialects are spoken in Central Germany, from Aachen in the west to Görlitz in the east. They consist of Franconian dialects in the west (West Central German) and non-Franconian dialects in the east (East Central German). Modern Standard German is mostly based on Central German dialects. The Franconian, West Central German dialects are the Central Franconian dialects (Ripuarian and Moselle Franconian) and the Rhine Franconian dialects (Hessian and Palatine). These dialects are considered as German in Germany and Belgium Luxembourgish in Luxembourg Lorraine Franconian (spoken in Moselle) and as a Rhine Franconian variant of Alsatian (spoken in Alsace bossue only) in France Limburgish or Kerkrade dialect in the Netherlands. Luxembourgish as well as the Transylvanian Saxon dialect spoken in Transylvania are based on Moselle Franconian dialects. The major cities in the Franconian Central German area are Cologne and Frankfurt. Further east, the non-Franconian, East Central German dialects are spoken (Thuringian, Upper Saxon and North Upper Saxon-South Markish, and earlier, in the then German-speaking parts of Silesia also Silesian, and in then German southern East Prussia also High Prussian). The major cities in the East Central German area are Berlin and Leipzig.
High Franconian The High Franconian dialects are transitional dialects between Central and Upper German. They consist of the East and South Franconian dialects. The East Franconian dialect branch is one of the most spoken dialect branches in Germany. These dialects are spoken in the region of Franconia and in the central parts of Saxon Vogtland. Franconia consists of the Bavarian districts of Upper, Middle, and Lower Franconia, the region of South Thuringia (Thuringia), and the eastern parts of the region of Heilbronn-Franken (Tauber Franconia and Hohenlohe) in Baden-Württemberg. The major cities in the East Franconian area are Nuremberg and Würzburg. South Franconian is mainly spoken in northern Baden-Württemberg in Germany, but also in the northeasternmost part of the region of Alsace in France. In Baden-Württemberg, they are considered as dialects of German. The major cities in the South Franconian area are Karlsruhe and Heilbronn. Upper German The Upper German dialects are the Alemannic and Swabian dialects in the west and the Bavarian dialects in the east. Alemannic and Swabian Alemannic dialects are spoken in Switzerland (High Alemannic in the densely populated Swiss Plateau, in the south also Highest Alemannic, and Low Alemannic in Basel), Baden-Württemberg (Swabian and Low Alemannic, in the southwest also High Alemannic), Bavarian Swabia (Swabian, in the southwesternmost part also Low Alemannic), Vorarlberg (Low, High, and Highest Alemannic), Alsace (Low Alemannic, in the southernmost part also High Alemannic), Liechtenstein (High and Highest Alemannic), and in the Tyrolean district of Reutte (Swabian). The Alemannic dialects are considered as Alsatian in Alsace. The major cities in the Alemannic area are Stuttgart, Freiburg, Basel, Zürich, Lucerne and Bern. 
Bavarian Bavarian dialects are spoken in Austria (Vienna, Lower and Upper Austria, Styria, Carinthia, Salzburg, Burgenland, and in most parts of Tyrol), Bavaria (Upper and Lower Bavaria as well as Upper Palatinate), South Tyrol, southwesternmost Saxony (Southern Vogtländisch), and in the Swiss village of Samnaun. The major cities in the Bavarian area are Vienna, Munich, Salzburg, Regensburg, Graz and Bolzano. Regiolects Berlinian, the High German regiolect or dialect of Berlin with Low German substrate Missingsch, a Low-German-coloured variety of High German. Ruhrdeutsch (Ruhr German), the High German regiolect of the Ruhr area. Grammar German is a fusional language with a moderate degree of inflection, with three grammatical genders; as such, there can be a large number of words derived from the same root. Noun inflection German nouns inflect by case, gender, and number: four cases: nominative, accusative, genitive, and dative. three genders: masculine, feminine, and neuter. Word endings sometimes reveal grammatical gender: for instance, nouns ending in (-ing), (-ship), or (-hood, -ness) are feminine, nouns ending in or (diminutive forms) are neuter and nouns ending in (-ism) are masculine.
Nouns and adjectives Pronouns show distinctions in person (1st, 2nd, and 3rd), number (singular, dual, and plural in the ancient language; singular and plural alone in later stages), and gender (masculine, feminine, and neuter), and decline for case (from six cases in the earliest forms attested to four in the modern language). Nouns, articles, and adjectives show all the distinctions except for person. Both attributive and predicative adjectives agree with the noun. Verbs The inflectional categories of the Greek verb have likewise remained largely the same over the course of the language's history but with significant changes in the number of distinctions within each category and their morphological expression. Greek verbs have synthetic inflectional forms for: Syntax Many aspects of the syntax of Greek have remained constant: verbs agree with their subject only, the use of the surviving cases is largely intact (nominative for subjects and predicates, accusative for objects of most verbs and many prepositions, genitive for possessors), articles precede nouns, adpositions are largely prepositional, relative clauses follow the noun they modify and relative pronouns are clause-initial. However, the morphological changes also have their counterparts in the syntax, and there are also significant differences between the syntax of the ancient and that of the modern form of the language. Ancient Greek made great use of participial constructions and of constructions involving the infinitive, and the modern variety lacks the infinitive entirely (employing a raft of new periphrastic constructions instead) and uses participles more restrictively. The loss of the dative led to a rise of prepositional indirect objects (and the use of the genitive to directly mark these as well). Ancient Greek tended to be verb-final, but neutral word order in the modern language is VSO or SVO.
Vocabulary Modern Greek inherits most of its vocabulary from Ancient Greek, which in turn is an Indo-European language, but also includes a number of borrowings from the languages of the populations that inhabited Greece before the arrival of Proto-Greeks, some documented in Mycenaean texts; they include a large number of Greek toponyms. The form and meaning of many words have evolved. Loanwords (words of foreign origin) have entered the language, mainly from Latin, Venetian, and Turkish. During the older periods of Greek, loanwords into Greek acquired Greek inflections, thus leaving only a foreign root word. Modern borrowings (from the 20th century on), especially from French and English, are typically not inflected; other modern borrowings are derived from South Slavic (Macedonian/Bulgarian) and Eastern Romance languages (Aromanian and Megleno-Romanian). Greek loanwords in other languages Greek words have been widely borrowed into other languages, including English. Example words include: mathematics, physics, astronomy, democracy, philosophy, athletics, theatre, rhetoric, baptism, evangelist, etc. Moreover, Greek words and word elements continue to be productive as a basis for coinages: anthropology, photography, telephony, isomer, biomechanics, cinematography, etc. Together with Latin words, they form the foundation of international scientific and technical vocabulary. For example, all words ending in –logy ("discourse"). There are many English words of Greek origin. Classification Greek is an independent branch of the Indo-European language family. The ancient language most closely related to it may be ancient Macedonian, which most scholars suggest may have been a dialect of Greek itself, but it is so poorly attested that it is difficult to draw firm conclusions. Independently of the Macedonian question, some scholars have grouped Greek into Graeco-Phrygian, as Greek and the extinct Phrygian share features that are not found in other Indo-European languages.
Among living languages, some Indo-Europeanists suggest that Greek may be most closely related to Armenian (see Graeco-Armenian) or the Indo-Iranian languages (see Graeco-Aryan), but little definitive evidence has been found for grouping the living branches of the family. In addition, some linguists have considered Albanian to be somewhat related to Greek and Armenian; if such a relationship were proven and recognized, the three languages would form a new Balkan sub-branch, together with certain extinct European languages. Writing system Linear B Linear B, attested as early as the late 15th century BC, was the first script used to write Greek. It is basically a syllabary, which was finally deciphered by Michael Ventris and John Chadwick in the 1950s (its precursor, Linear A, has not been deciphered and most likely encodes a non-Greek language). The language of the Linear B texts, Mycenaean Greek, is the earliest known form of Greek. Cypriot syllabary Another similar system used to write the Greek language was the Cypriot syllabary (also a descendant of Linear A via the intermediate Cypro-Minoan syllabary), which is closely related to Linear B but uses somewhat different syllabic conventions to represent phoneme sequences. The Cypriot syllabary is attested in Cyprus from the 11th century BC until its gradual abandonment in the late Classical period, in favor of the standard Greek alphabet. Greek alphabet Greek has been written in the Greek alphabet since approximately the 9th century BC. It was created by modifying the Phoenician alphabet, with the innovation of adopting certain letters to represent the vowels. The variant of the alphabet in use today is essentially the late Ionic variant, introduced for writing classical Attic in 403 BC. In classical Greek, as in classical Latin, only upper-case letters existed. The lower-case Greek letters were developed much later by medieval scribes to permit a faster, more convenient cursive writing style with the use of ink and quill.
The Greek alphabet consists of 24 letters, each with an uppercase (majuscule) and lowercase (minuscule) form. The letter sigma has an additional lowercase form (ς) used in the final position: Diacritics In addition to the letters, the Greek alphabet features a number of diacritical signs: three different accent marks (acute, grave, and circumflex), originally denoting different shapes of pitch accent on the stressed vowel; the so-called breathing marks (rough and smooth breathing), originally used to signal presence or absence of word-initial /h/; and the diaeresis, used to mark the full syllabic value of a vowel that would otherwise be read as part of a diphthong. These marks were introduced during the course of the Hellenistic period. Actual usage of the grave in handwriting saw a rapid decline in favor of uniform usage of the acute during the late 20th century, and it has only been retained in typography. After the writing reform of 1982, most diacritics are no longer used. Since then, Greek has been written mostly in the simplified monotonic orthography (or monotonic system), which employs only the acute accent and the diaeresis. The traditional system, now called the polytonic orthography (or polytonic system), is still used internationally for the writing of Ancient Greek. Punctuation In Greek, the question mark is written as the English semicolon, while the functions of the colon and semicolon are performed by a raised point (·), known as the ano teleia (). In Greek the comma also functions as a silent letter in a handful of Greek words, principally distinguishing (ó,ti, 'whatever') from (óti, 'that'). Ancient Greek texts often used scriptio continua ('continuous writing'), which means that ancient authors and scribes would write word after word with no spaces or punctuation between words to differentiate or mark boundaries. Boustrophedon, or bi-directional text, was also used in Ancient Greek.
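The two lowercase forms of sigma mentioned above (σ in medial position, ς word-finally) can be illustrated with a toy converter; this is a hypothetical sketch, not a full Unicode casing routine (it treats only the last character of an isolated word as final and ignores trailing punctuation):

```python
def lower_greek(word):
    """Lowercase an uppercase Greek word, choosing the final-sigma form
    (ς) when Σ is the last letter and the medial form (σ) elsewhere."""
    out = []
    for i, ch in enumerate(word):
        if ch == "Σ":
            out.append("ς" if i == len(word) - 1 else "σ")
        else:
            out.append(ch.lower())
    return "".join(out)

# "ΣΟΦΟΣ" lowercases with a medial σ at the start and a final ς at the end.
print(lower_greek("ΣΟΦΟΣ"))
```

Full Unicode implementations encode this same context-sensitive rule as the Final_Sigma condition in their default case-mapping algorithm.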
Latin alphabet Greek has occasionally been written in the Latin script, especially in areas under Venetian rule or by Greek Catholics. The term / applies when the Latin script is used to write Greek in the cultural ambit of Catholicism (because / is an older Greek term for West-European dating to when most of (Roman Catholic Christian) West Europe was under the control of the Frankish Empire). / (meaning 'Catholic Chiot') alludes to the significant presence of Catholic missionaries based on the island of Chios. Additionally, the term Greeklish is often used when the Greek language is written in a Latin script in online communications. The Latin script is nowadays used by the Greek-speaking communities of Southern Italy. Hebrew alphabet The Yevanic dialect was written by Romaniote and Constantinopolitan Karaite Jews using the Hebrew alphabet. Arabic alphabet Some Greek Muslims from Crete wrote their Cretan Greek in the Arabic alphabet. The same happened among Epirote Muslims in Ioannina. This usage is sometimes called aljamiado as when Romance languages are written in the Arabic alphabet. Example text Article 1 of the Universal Declaration of Human Rights in Greek: Όλοι οι άνθρωποι γεννιούνται ελεύθεροι και ίσοι στην αξιοπρέπεια και τα δικαιώματα. Είναι προικισμένοι με λογική και συνείδηση, και οφείλουν να συμπεριφέρονται μεταξύ τους με πνεύμα αδελφοσύνης. Transcription of the example text into Latin alphabet: Óloi oi ánthropoi gennioúntai eléftheroi kai ísoi stin axioprépeia kai ta dikaiómata. Eínai proikisménoi me logikí kai syneídisi, kai ofeíloun na symperiférontai metaxý tous me pnévma adelfosýnis. Article 1 of the Universal Declaration of Human Rights in English: All human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.
See also Modern Greek Varieties of Modern Greek Medieval Greek Ancient Greek Ancient Greek dialects Hellenic languages List of Greek and Latin roots in English List of medical roots, suffixes and prefixes External links General background Greek Language, Columbia Electronic Encyclopedia. The Greek Language and Linguistics Gateway, useful information on the history of the Greek language, application of modern Linguistics to the study of Greek, and tools for learning Greek. Aristotle University of Thessaloniki, The Greek Language Portal, a portal for Greek language and linguistic education. The Perseus Project has many useful pages for the study of classical languages and
Historical unity The historical unity and continuing identity between the various stages of the Greek language are often emphasized. Although Greek has undergone morphological and phonological changes comparable to those seen in other languages, never since classical antiquity has its cultural, literary, and orthographic tradition been interrupted to the extent that one can speak of a new language emerging. Greek speakers today still tend to regard literary works of ancient Greek as part of their own rather than a foreign language. It is also often stated that the historical changes have been relatively slight compared with some other languages. According to one estimation, "Homeric Greek is probably closer to Demotic than 12th-century Middle English is to modern spoken English". Geographic distribution Greek is spoken today by at least 13 million people, principally in Greece and Cyprus along with a sizable Greek-speaking minority in Albania near the Greek-Albanian border. A significant percentage of Albania's population has some basic knowledge of the Greek language due in part to the Albanian wave of immigration to Greece in the 1980s and '90s. Prior to the Greco-Turkish War and the resulting population exchange in 1923, a very large population of Greek-speakers also existed in Turkey, though very few remain today. A small Greek-speaking community is also found in Bulgaria near the Greek-Bulgarian border. Greek is also spoken worldwide by the sizable Greek diaspora which has notable communities in the United States, Australia, Canada, South Africa, Chile, Brazil, Argentina, Russia, Ukraine, the United Kingdom, and throughout the European Union, especially in Germany.
Historically, significant Greek-speaking communities and regions were found throughout the Eastern Mediterranean, in what are today Southern Italy, Turkey, Cyprus, Syria, Lebanon, Palestine, Israel, Egypt, and Libya; in the area of the Black Sea, in what are today Turkey, Bulgaria, Romania, Ukraine, Russia, Georgia, Armenia, and Azerbaijan; and, to a lesser extent, in the Western Mediterranean in and around colonies such as Massalia, Monoikos, and Mainake. It was also used as a liturgical language in the Christian Nubian kingdom of Makuria, in modern-day Sudan. Official status Greek, in its modern form, is the official language of Greece, where it is spoken by almost the entire population. It is also the official language of Cyprus (nominally alongside Turkish). Because of the membership of Greece and Cyprus in the European Union, Greek is one of the organization's 24 official languages. Furthermore, Greek is officially recognized in Dropull and Himara (Albania), and as a minority language throughout Albania. It is also recognized as an official minority language in the regions of Apulia and Calabria in Italy. In the framework of the European Charter for Regional or Minority Languages, Greek is protected and promoted officially as a regional and minority language in Armenia, Hungary, Romania, and Ukraine. Characteristics The phonology, morphology, syntax, and vocabulary of the language show both conservative and innovative tendencies across the entire attestation of the language from the ancient to the modern period. The division into conventional periods is, as with all such periodizations, relatively arbitrary, especially because at all periods, Ancient Greek has enjoyed high prestige, and the literate borrowed heavily from it. Phonology Across its history, the syllabic structure of Greek has varied little: Greek shows a mixed syllable structure, permitting complex syllabic onsets but very restricted codas.
It has only oral vowels and a fairly stable set of consonantal contrasts. The main phonological changes occurred during the Hellenistic and Roman period (see Koine Greek phonology for details):
replacement of the pitch accent with a stress accent.
simplification of the system of vowels and diphthongs: loss of vowel length distinction, monophthongisation of most diphthongs and several steps in a chain shift of vowels towards /i/ (iotacism).
development of the voiceless aspirated plosives /pʰ/ and /tʰ/ to the voiceless fricatives /f/ and /θ/, respectively; the similar development of /kʰ/ to /x/ may have taken place later (the phonological changes are not reflected in the orthography, and both earlier and later phonemes are written with φ, θ, and χ).
development of the voiced plosives /b/, /d/, and /ɡ/ to their voiced fricative counterparts /β/ (later /v/), /ð/, and /ɣ/.
Morphology In all its stages, the morphology of Greek shows an extensive set of productive derivational affixes, a limited but productive system of compounding and a rich inflectional system. Although its morphological categories have been fairly stable over time, morphological changes are present throughout, particularly in the nominal and verbal systems. The major change in the nominal morphology since the classical stage was the disuse of the dative case (its functions being largely taken over by the genitive). The verbal system has lost the infinitive, the synthetically formed future and perfect tenses, and the optative mood. Many have been replaced by periphrastic (analytical) forms. Nouns and adjectives Pronouns show distinctions in person (1st, 2nd, and 3rd), number (singular, dual, and plural in the ancient language; singular and plural alone in later stages), and gender (masculine, feminine, and neuter), and decline for case (from six cases in the earliest forms attested to four in the modern language). Nouns, articles, and adjectives show all the distinctions except for person. Both attributive and predicative adjectives agree with the noun.
Verbs The inflectional categories of the Greek verb have likewise remained largely the same over the course of the language's history, but with significant changes in the number of distinctions within each category and their morphological expression. Greek verbs have synthetic inflectional forms for several grammatical categories. Syntax Many aspects of the syntax of Greek have remained constant: verbs agree with their subject only, the use of the surviving cases is largely intact (nominative for subjects and predicates, accusative for objects of most verbs and many prepositions, genitive for possessors), articles precede nouns, adpositions are
The classic narrative: The Golem of Prague The most famous golem narrative involves Judah Loew ben Bezalel, the late 16th century rabbi of Prague, also known as the Maharal, who reportedly "created a golem out of clay from the banks of the Vltava River and brought it to life through rituals and Hebrew incantations to defend the Prague ghetto from anti-Semitic attacks" and pogroms. Depending on the version of the legend, the Jews in Prague were to be either expelled or killed under the rule of Rudolf II, the Holy Roman Emperor. The Golem was called Josef and was known as Yossele. It was said that he could make himself invisible and summon spirits from the dead. Rabbi Loew deactivated the Golem on Friday evenings by removing the shem before the Sabbath (Saturday) began, so as to let it rest on Sabbath. One Friday evening, Rabbi Loew forgot to remove the shem, and feared that the Golem would desecrate the Sabbath. A different story tells of a golem that fell in love, and when rejected, became the violent monster seen in most accounts. Some versions have the golem eventually going on a murderous rampage. The rabbi then managed to pull the shem from his mouth and immobilize him in front of the synagogue, whereupon the golem fell in pieces. The Golem's body was stored in the attic genizah of the Old New Synagogue, where it would be restored to life again if needed. Rabbi Loew then forbade anyone except his successors from going into the attic. Rabbi Yechezkel Landau, a successor of Rabbi Loew, reportedly wanted to go up the steps to the attic when he was Chief Rabbi of Prague to verify the tradition.
Rabbi Landau fasted and immersed himself in a mikveh, wrapped himself in phylacteries and a prayer-shawl and started ascending the steps. At the top of the steps, he hesitated and then came immediately back down, trembling and frightened. He then re-enacted Rabbi Loew's original warning. According to legend, the body of Rabbi Loew's Golem still lies in the synagogue's attic. When the attic was renovated in 1883, no evidence of the Golem was found. Some versions of the tale state that the Golem was stolen from the genizah and entombed in a graveyard in Prague's Žižkov district, where the Žižkov Television Tower now stands. A recent legend tells of a Nazi agent ascending to the synagogue attic, but he died instead under suspicious circumstances. The attic is not open to the general public. Some Orthodox Jews believe that the Maharal did actually create a golem. The evidence for this belief has been analyzed from an Orthodox Jewish perspective by Shnayer Z. Leiman. Sources of the Prague narrative The general view of historians and critics is that the story of the Golem of Prague was a German literary invention of the early 19th century. According to John Neubauer, the first writers on the Prague Golem were: 1837: Berthold Auerbach, Spinoza 1841: Gustav Philippson, Der Golam, eine Legende 1841: Franz Klutschak, Der Golam des Rabbi Löw 1842: Adam Tendlau Der Golem des Hoch-Rabbi-Löw 1847: Leopold Weisel, Der Golem However, there are in fact a couple of slightly earlier examples, in 1834 and 1836. All of these early accounts of the Golem of Prague are in German by Jewish writers. It has been suggested that they emerged as part of a Jewish folklore movement parallel with the contemporary German folklore movement. The origins of the story have been obscured by attempts to exaggerate its age and to pretend that it dates from the time of the Maharal. 
It has been said that Rabbi Yudel Rosenberg (1859–1935) of Tarłów (before moving to Canada, where he became one of its most prominent rabbis) originated the idea that the narrative dates from the time of the Maharal. Rosenberg published Nifl'os Maharal (Wonders of Maharal) (Piotrków, 1909), which purported to be an eyewitness account by the Maharal's son-in-law, who had helped to create the Golem. Rosenberg claimed that the book was based upon a manuscript that he found in the main library in Metz. Wonders of Maharal "is generally recognized in academic circles to be a literary hoax". Gershom Scholem observed that the manuscript "contains not ancient legends but modern fiction". Rosenberg's claim was further disseminated in Chayim Bloch's (1881–1973) The Golem: Legends of the Ghetto of Prague (English edition 1925). The Jewish Encyclopedia of 1906 cites the historical work Zemach David by David Gans, a disciple of the Maharal, published in 1592. In it, Gans writes of an audience between the Maharal and Rudolph II: "Our lord the emperor ... Rudolph ... sent for and called upon our master Rabbi Low ben Bezalel and received him with a welcome and merry expression, and spoke to him face to face, as one would to a friend. The nature and quality
Sanhedrin 38b), Adam was initially created as a golem when his dust was "kneaded into a shapeless husk". Like Adam, all golems are created from mud by those close to divinity, but no anthropogenic golem is fully human. Early on, the main disability of the golem was its inability to speak. Sanhedrin 65b describes Rava creating a man. He sent the man to Rav Zeira. Rav Zeira spoke to him, but he did not answer. Rav Zeira said, "You were created by the sages; return to your dust". During the Middle Ages, passages from the Sefer Yetzirah (Book of Creation) were studied as a means to create and animate a golem, although there is little in the writings of Jewish mysticism that supports this belief. It was believed that golems could be activated by an ecstatic experience induced by the ritualistic use of various letters of the Hebrew alphabet forming a "shem" (any one of the Names of God), wherein the shem was written on a piece of paper and inserted in the mouth or in the forehead of the golem. A golem is inscribed with Hebrew words in some tales (for example, some versions of Chełm and Prague, as well as in Polish tales and versions of the Brothers Grimm), such as the word emet (אמת, "truth" in Hebrew) written on its forehead. The golem could then be deactivated by removing the aleph (א) in emet, thus changing the inscription from "truth" to "death" (met, מת, meaning "dead"). Samuel of Speyer (12th century) was said to have created a golem. Rabbi Jacob ben Shalom arrived at Barcelona from Germany in 1325 and remarked that the law of destruction is the reversal of the law of creation. One source credits 11th century Solomon ibn Gabirol with creating a golem, possibly female, for household chores. In 1625, Joseph Delmedigo wrote that "many legends of this sort are current, particularly in Germany." The earliest known written account of how to create a golem can be found in Sodei Razayya by Eleazar ben Judah of Worms of the late 12th and early 13th century.
The Golem of Chełm The oldest description of the creation of a golem by a historical figure is included in a tradition connected to Rabbi Eliyahu of Chełm (1550–1583). A Polish Kabbalist, writing in about 1630–1650, reported the creation of a golem by Rabbi Eliyahu thus: "And I have heard, in a certain and explicit way, from several respectable persons that one man [living] close to our time, whose name is R. Eliyahu, the master of the name, who made a creature out of matter [Heb. Golem] and form [Heb. tzurah] and it performed hard work for him, for a long period, and the name of emet was hanging upon his neck until he finally removed it for a certain reason, the name from his neck and it turned to dust." A similar account was reported by a Christian author, Christoph Arnold, in 1674. Rabbi Jacob Emden (d. 1776) elaborated on the story in a book published in 1748: "As an aside, I'll mention here what I heard from my father's holy mouth regarding the Golem created by his ancestor, the Gaon R. Eliyahu Ba'al Shem of blessed memory. When the Gaon saw that the Golem was growing larger and larger, he feared that the Golem would destroy the universe. He then removed the Holy Name that was embedded on his forehead, thus causing him to disintegrate and return to dust. Nonetheless, while he was engaged in extracting the Holy Name from him, the Golem injured him, scarring him on the face." According to the Polish Kabbalist, "the legend was known to several persons, thus allowing us to speculate that the legend had indeed circulated for some time before it was committed to writing and, consequently, we may assume that its origins are to be traced to the generation immediately following the death of R. Eliyahu, if not earlier." 
George Woodcock suggested that the last two sentences of the conclusion to Orwell's 1940 essay on Charles Dickens also describe Orwell himself. Orwell wrote a critique of George Bernard Shaw's play Arms and the Man. He considered this Shaw's best play and the most likely to remain socially relevant, because of its theme that war is not, generally speaking, a glorious romantic adventure. His 1945 essay In Defence of P.G. Wodehouse contains an amusing assessment of Wodehouse's writing and also argues that his broadcasts from Germany (during the war) did not really make him a traitor. He accused the Ministry of Information of exaggerating Wodehouse's actions for propaganda purposes. Food writing In 1946, the British Council commissioned Orwell to write an essay on British food as part of a drive to promote British relations abroad. In the essay, titled British Cookery, Orwell described the British diet as "a simple, rather heavy, perhaps slightly barbarous diet" in which "hot drinks are acceptable at most hours of the day". He discusses the ritual of breakfast in the UK: "this is not a snack but a serious meal. The hour at which people have their breakfast is of course governed by the time at which they go to work." He wrote that high tea in the United Kingdom consisted of a variety of savoury and sweet dishes, but "no tea would be considered a good one if it did not include at least one kind of cake", before adding "as well as cakes, biscuits are much eaten at tea-time". Orwell included a recipe for marmalade, a popular British spread on bread. However, the British Council declined to publish the essay on the grounds that it was too problematic to write about food at the time of strict rationing in the UK. In 2019, the essay was discovered in the British Council's archives along with the rejection letter. The British Council issued an official apology to Orwell over the rejection of the commissioned essay.
Reception and evaluations of Orwell's works Arthur Koestler said that Orwell's "uncompromising intellectual honesty made him appear almost inhuman at times". Ben Wattenberg stated: "Orwell's writing pierced intellectual hypocrisy wherever he found it". According to historian Piers Brendon, "Orwell was the saint of common decency who would in earlier days, said his BBC boss Rushbrook Williams, 'have been either canonised—or burnt at the stake'". Raymond Williams in Politics and Letters: Interviews with New Left Review describes Orwell as a "successful impersonation of a plain man who bumps into experience in an unmediated way and tells the truth about it". Christopher Norris declared that Orwell's "homespun empiricist outlook—his assumption that the truth was just there to be told in a straightforward common-sense way—now seems not merely naïve but culpably self-deluding". The American scholar Scott Lucas has described Orwell as an enemy of the Left. John Newsinger has argued that Lucas could only do this by portraying "all of Orwell's attacks on Stalinism [–] as if they were attacks on socialism, despite Orwell's continued insistence that they were not". Orwell's work has taken a prominent place in the school literature curriculum in England, with Animal Farm a regular examination topic at the end of secondary education (GCSE), and Nineteen Eighty-Four a topic for subsequent examinations below university level (A Levels). A 2016 UK poll saw Animal Farm ranked the nation's favourite book from school. Historian John Rodden stated: "John Podhoretz did claim that if Orwell were alive today, he'd be standing with the neo-conservatives and against the Left. And the question arises, to what extent can you even begin to predict the political positions of somebody who's been dead three decades and more by that time?" In Orwell's Victory, Christopher Hitchens argues: "In answer to the accusation of inconsistency Orwell as a writer was forever taking his own temperature. 
In other words, here was someone who never stopped testing and adjusting his intelligence". John Rodden points out the "undeniable conservative features in the Orwell physiognomy" and remarks on how "to some extent Orwell facilitated the kinds of uses and abuses by the Right that his name has been put to. In other ways there has been the politics of selective quotation." Rodden refers to the essay "Why I Write", in which Orwell refers to the Spanish Civil War as being his "watershed political experience", saying: "The Spanish War and other events in 1936–37, turned the scale. Thereafter I knew where I stood. Every line of serious work that I have written since 1936 has been written directly or indirectly against totalitarianism and for democratic socialism as I understand it." (emphasis in original) Rodden goes on to explain how, during the McCarthy era, the introduction to the Signet edition of Animal Farm, which sold more than 20 million copies, makes use of selective quotation: "[Introduction]: If the book itself, Animal Farm, had left any doubt of the matter, Orwell dispelled it in his essay Why I Write: 'Every line of serious work that I've written since 1936 has been written directly or indirectly against Totalitarianism ....'[Rodden]: dot, dot, dot, dot, the politics of ellipsis. 'For Democratic Socialism' is vaporized, just like Winston Smith did it at the Ministry of Truth, and that's very much what happened at the beginning of the McCarthy era and just continued, Orwell being selectively quoted." Fyvel wrote about Orwell: "His crucial experience [...] was his struggle to turn himself into a writer, one which led through long periods of poverty, failure and humiliation, and about which he has written almost nothing directly. The sweat and agony was less in the slum-life than in the effort to turn the experience into literature." 
In October 2015 Finlay Publisher, for the Orwell Society, published George Orwell: The Complete Poetry, compiled and presented by Dione Venables. Influence on language and writing In his essay "Politics and the English Language" (1946), Orwell wrote about the importance of precise and clear language, arguing that vague writing can be used as a powerful tool of political manipulation because it shapes the way we think. In that essay, Orwell provides six rules for writers:
Never use a metaphor, simile or other figure of speech which you are used to seeing in print.
Never use a long word where a short one will do.
If it is possible to cut a word out, always cut it out.
Never use the passive where you can use the active.
Never use a foreign phrase, a scientific word or a jargon word if you can think of an everyday English equivalent.
Break any of these rules sooner than say anything outright barbarous.
Orwell worked as a journalist at The Observer for seven years, and its editor David Astor gave a copy of this celebrated essay to every new recruit. In 2003, literary editor at the newspaper Robert McCrum wrote, "Even now, it is quoted in our style book". Journalist Jonathan Heawood noted: "Orwell's criticism of slovenly language is still taken very seriously." Andrew N. Rubin argues that "Orwell claimed that we should be attentive to how the use of language has limited our capacity for critical thought just as we should be equally concerned with the ways in which dominant modes of thinking have reshaped the very language that we use." The adjective "Orwellian" connotes an attitude and a policy of control by propaganda, surveillance, misinformation, denial of truth and manipulation of the past. In Nineteen Eighty-Four, Orwell described a totalitarian government that controlled thought by controlling language, making certain ideas literally unthinkable. Several words and phrases from Nineteen Eighty-Four have entered popular language.
"Newspeak" is a simplified and obfuscatory language designed to make independent thought impossible. "Doublethink" means holding two contradictory beliefs simultaneously. The "Thought Police" are those who suppress all dissenting opinion. "Prolefeed" is homogenised, manufactured superficial literature, film and music used to control and indoctrinate the populace through docility. "Big Brother" is a supreme dictator who watches everyone. Orwell may have been the first to use the term "cold war" to refer to the state of tension between powers in the Western Bloc and the Eastern Bloc that followed World War II in his essay, "You and the Atom Bomb", published in Tribune on 19 October 1945. He wrote: "We may be heading not for general breakdown but for an epoch as horribly stable as the slave empires of antiquity. James Burnham's theory has been much discussed, but few people have yet considered its ideological implications—this is, the kind of world-view, the kind of beliefs, and the social structure that would probably prevail in a State which was at once unconquerable and in a permanent state of 'cold war' with its neighbours." Modern culture In 2014, a play written by playwright Joe Sutton titled Orwell in America was first performed by the Northern Stage theatre company in White River Junction, Vermont. It is a fictitious account of Orwell doing a book tour in the United States (something he never did in his lifetime). It moved to off-Broadway in 2016. Orwell's birthplace, a bungalow in Motihari, Bihar, India, was opened as a museum in May 2015. Statue A statue of George Orwell, sculpted by the British sculptor Martin Jennings, was unveiled on 7 November 2017 outside Broadcasting House, the headquarters of the BBC. The wall behind the statue is inscribed with the following phrase: "If liberty means anything at all, it means the right to tell people what they do not want to hear". 
These are words from his proposed preface to Animal Farm and a rallying cry for the idea of free speech in an open society. Personal life Childhood Jacintha Buddicom's account, Eric & Us, provides an insight into Blair's childhood. She quoted his sister Avril that "he was essentially an aloof, undemonstrative person" and said herself of his friendship with the Buddicoms: "I do not think he needed any other friends beyond the schoolfriend he occasionally and appreciatively referred to as 'CC'". She could not recall him having schoolfriends to stay and exchange visits as her brother Prosper often did in holidays. Cyril Connolly provides an account of Blair as a child in Enemies of Promise. Years later, Blair mordantly recalled his prep school in the essay "Such, Such Were the Joys", claiming among other things that he "was made to study like a dog" to earn a scholarship, which he alleged was solely to enhance the school's prestige with parents. Jacintha Buddicom repudiated Orwell's schoolboy misery described in the essay, stating that "he was a specially happy child". She noted that he did not like his name because it reminded him of a book he greatly disliked—Eric, or, Little by Little, a Victorian boys' school story. Connolly remarked of him as a schoolboy, "The remarkable thing about Orwell was that alone among the boys he was an intellectual and not a parrot for he thought for himself". At Eton, John Vaughan Wilkes, son of his former headmaster at St Cyprian's, recalled that "he was extremely argumentative—about anything—and criticising the masters and criticising the other boys [...] We enjoyed arguing with him. He would generally win the arguments—or think he had anyhow." Roger Mynors concurs: "Endless arguments about all sorts of things, in which he was one of the great leaders. He was one of those boys who thought for himself." Blair liked to carry out practical jokes.
Buddicom recalls him swinging from the luggage rack in a railway carriage like an orangutan to frighten a woman passenger out of the compartment. At Eton, he played tricks on John Crace, his housemaster, among which was to enter a spoof advertisement in a college magazine implying pederasty. Gow, his tutor, said he "made himself as big a nuisance as he could" and "was a very unattractive boy". Later Blair was expelled from the crammer at Southwold for sending a dead rat as a birthday present to the town surveyor. In one of his As I Please essays he refers to a protracted joke when he answered an advertisement for a woman who claimed a cure for obesity. Blair had an interest in natural history which stemmed from his childhood. In letters from school he wrote about caterpillars and butterflies, and Buddicom recalls his keen interest in ornithology. He also enjoyed fishing and shooting rabbits, and conducting experiments as in cooking a hedgehog or shooting down a jackdaw from the Eton roof to dissect it. His zeal for scientific experiments extended to explosives—again Buddicom recalls a cook giving notice because of the noise. Later in Southwold, his sister Avril recalled him blowing up the garden. When teaching he enthused his students with his nature-rambles both at Southwold and at Hayes. His adult diaries are permeated with his observations on nature. Relationships and marriage Buddicom and Blair lost touch shortly after he went to Burma and she became unsympathetic towards him. She wrote that it was because of the letters he wrote complaining about his life, but an addendum to Eric & Us by Venables reveals that he may have lost her sympathy through an incident which was, at best, a clumsy attempt at seduction. Mabel Fierz, who later became Blair's confidante, said: "He used to say the one thing he wished in this world was that he'd been attractive to women. He liked women and had many girlfriends I think in Burma. 
He had a girl in Southwold and another girl in London. He was rather a womaniser, yet he was afraid he wasn't attractive." Brenda Salkield (Southwold) preferred friendship to any deeper relationship and maintained a correspondence with Blair for many years, particularly as a sounding board for his ideas. She wrote: "He was a great letter writer. Endless letters, and I mean when he wrote you a letter he wrote pages." His correspondence with Eleanor Jacques (London) was more prosaic, dwelling on a closer relationship and referring to past rendezvous or planning future ones in London and Burnham Beeches. When Orwell was in the sanatorium in Kent, his wife's friend Lydia Jackson visited. He invited her for a walk and out of sight "an awkward situation arose." Jackson was to be the most critical of Orwell's marriage to Eileen O'Shaughnessy, but their later correspondence hints at a complicity. Eileen at the time was more concerned about Orwell's closeness to Brenda Salkield. Orwell had an affair with his secretary at Tribune which caused Eileen much distress, and others have been mooted. In a letter to Ann Popham he wrote: "I was sometimes unfaithful to Eileen, and I also treated her badly, and I think she treated me badly, too, at times, but it was a real marriage, in the sense that we had been through awful struggles together and she understood all about my work, etc." Similarly he suggested to Celia Kirwan that they had both been unfaithful. There are several testaments that it was a well-matched and happy marriage (Patricia Donahue in Stephen Wadhams, Remembering Orwell). In June 1944, Orwell and Eileen adopted a three-week-old boy they named Richard Horatio. According to Richard, Orwell was a wonderful father who gave him devoted, if rather rugged, attention and a great degree of freedom. After Orwell's death Richard went to live with Orwell's sister and her husband.
Blair was very lonely after Eileen's death in 1945, and desperate for a wife, both as companion for himself and as mother for Richard. He proposed marriage to four women, including Celia Kirwan, and eventually Sonia Brownell accepted. Orwell had met her when she was assistant to Cyril Connolly, at Horizon literary magazine. They were married on 13 October 1949, only three months before Orwell's death. Some maintain that Sonia was the model for Julia in Nineteen Eighty-Four. Social interactions Orwell was noted for very close and enduring friendships with a few friends, but these were generally people with a similar background or with a similar level of literary ability. Ungregarious, he was out of place in a crowd and his discomfort was exacerbated when he was outside his own class. Though representing himself as a spokesman for the common man, he often appeared out of place with real working people. His brother-in-law Humphrey Dakin, a "Hail fellow, well met" type, who took him to a local pub in Leeds, said that he was told by the landlord: "Don't bring that bugger in here again." Adrian Fierz commented "He wasn't interested in racing or greyhounds or pub crawling or shove ha'penny. He just did not have much in common with people who did not share his intellectual interests." Awkwardness attended many of his encounters with working-class representatives, as with Pollitt and McNair, but his courtesy and good manners were often commented on. Jack Common observed on meeting him for the first time, "Right away manners, and more than manners—breeding—showed through." In his tramping days, he did domestic work for a time. His extreme politeness was recalled by a member of the family he worked for; she declared that the family referred to him as "Laurel" after the film comedian. With his gangling figure and awkwardness, Orwell's friends often saw him as a figure of fun. Geoffrey Gorer commented "He was awfully likely to knock things off tables, trip over things. 
I mean, he was a gangling, physically badly co-ordinated young man. I think his feeling [was] that even the inanimate world was against him." When he shared a flat with Heppenstall and Sayer, he was treated in a patronising manner by the younger men. At the BBC in the 1940s, "everybody would pull his leg" and Spender described him as having real entertainment value "like, as I say, watching a Charlie Chaplin movie". A friend of Eileen's reminisced about her tolerance and humour, often at Orwell's expense. One biography of Orwell accused him of having had an authoritarian streak. In Burma, he struck out at a Burmese boy who, while "fooling around" with his friends, had "accidentally bumped into him" at a station, resulting in Orwell falling "heavily" down some stairs. One of his former pupils recalled being beaten so hard he could not sit down for a week. When sharing a flat with Orwell, Heppenstall came home late one night in an advanced stage of loud inebriation. The upshot was that Heppenstall ended up with a bloody nose and was locked in a room. When he complained, Orwell hit him across the legs with a shooting stick and Heppenstall then had to defend himself with a chair. Years later, after Orwell's death, Heppenstall wrote a dramatic account of the incident called "The Shooting Stick" and Mabel Fierz confirmed that Heppenstall came to her in a sorry state the following day. Orwell got on well with young people. The pupil he beat considered him the best of teachers and the young recruits in Barcelona tried to drink him under the table without success. His nephew recalled Uncle Eric laughing louder than anyone in the cinema at a Charlie Chaplin film. In the wake of his most famous works, he attracted many uncritical hangers-on, but many others who sought him found him aloof and even dull. With his soft voice, he was sometimes shouted down or excluded from discussions. 
At this time, he was severely ill; it was wartime or the austerity period after it; during the war his wife suffered from depression; and after her death he was lonely and unhappy. In addition to that, he always lived frugally and seemed unable to care for himself properly. As a result of all this, people found his circumstances bleak. Some, like Michael Ayrton, called him "Gloomy George", but others developed the idea that he was an "English secular saint". Although Orwell was frequently heard on the BBC for panel discussions and one-man broadcasts, no recorded copy of his voice is known to exist.

Lifestyle

Orwell was a heavy smoker, who rolled his own cigarettes from strong shag tobacco, despite his bronchial condition. His penchant for the rugged life often took him to cold and damp situations, both in the long term, as in Catalonia and Jura, and short term, for example, motorcycling in the rain and suffering a shipwreck. Described by The Economist as "perhaps the 20th century's best chronicler of English culture", Orwell considered fish and chips, football, the pub, strong tea, cut-price chocolate, the movies, and radio among the chief comforts for the working class. He advocated a patriotic defence of a British way of life that could not be trusted to intellectuals or, by implication, the state:

Orwell enjoyed strong tea—he had Fortnum & Mason's tea brought to him in Catalonia. His 1946 essay, "A Nice Cup of Tea", appeared as a London Evening Standard article on how to make tea, with Orwell writing that "tea is one of the mainstays of civilisation in this country and causes violent disputes over how it should be made". The main dispute was whether to put the tea in the cup first and add the milk afterward, or the other way round; he noted that "in every family in Britain there are probably two schools of thought on the subject".
He appreciated English beer, taken regularly and moderately, despised drinkers of lager and wrote about an imagined, ideal British pub in his 1946 Evening Standard article, "The Moon Under Water". Not as particular about food, he enjoyed the wartime "Victory Pie" and extolled canteen food at the BBC. He preferred traditional English dishes, such as roast beef and kippers. His 1945 essay, "In Defence of English Cooking", included Yorkshire pudding, crumpets, muffins, innumerable biscuits, Christmas pudding, shortbread, various British cheeses and Oxford marmalade. Reports of his Islington days refer to the cosy afternoon tea table.

His dress sense was unpredictable and usually casual. In Southwold, he had the best cloth from the local tailor but was equally happy in his tramping outfit. His attire in the Spanish Civil War, along with his size-12 boots, was a source of amusement (Jennie Lee, in Peter Davison, Complete Works XI, 5). David Astor described him as looking like a prep school master, while according to the Special Branch dossier, Orwell's tendency to dress "in Bohemian fashion" revealed that the author was "a Communist". Orwell's confusing approach to matters of social decorum—on the one hand expecting a working-class guest to dress for dinner, and on the other, slurping tea out of a saucer at the BBC canteen—helped stoke his reputation as an English eccentric.

Views

Religion

Orwell was an atheist who identified himself with the humanist outlook on life. Despite this, and despite his criticisms of both religious doctrine and religious organisations, he nevertheless regularly participated in the social and civic life of the church, including by attending Church of England Holy Communion. Acknowledging this contradiction, he once said: "It seems rather mean to go to HC [Holy Communion] when one doesn't believe, but I have passed myself off for pious & there is nothing for it but to keep up with the deception."
He had two Anglican marriages and left instructions for an Anglican funeral. Orwell was also extremely well-read in Biblical literature and could quote lengthy passages from the Book of Common Prayer from memory. His extensive knowledge of the Bible came coupled with unsparing criticism of its philosophy, and as an adult he could not bring himself to believe in its tenets. He said in part V of his essay, "Such, Such Were the Joys", that "Till about the age of fourteen I believed in God, and believed that the accounts given of him were true. But I was well aware that I did not love him." Orwell directly contrasted Christianity with secular humanism in his essay "Lear, Tolstoy and the Fool", finding the latter philosophy more palatable and less "self-interested". Literary critic James Wood wrote that in the struggle, as he saw it, between Christianity and humanism, "Orwell was on the humanist side, of course—basically an unmetaphysical, English version of Camus's philosophy of perpetual godless struggle." Orwell's writing was often explicitly critical of religion, and Christianity in particular. He found the church to be a "selfish [...] church of the landed gentry" with its establishment "out of touch" with the majority of its communicants and altogether a pernicious influence on public life. In their 1972 study, The Unknown Orwell, the writers Peter Stansky and William Abrahams noted that at Eton Blair displayed a "sceptical attitude" to Christian belief. Crick observed that Orwell displayed "a pronounced anti-Catholicism". Evelyn Waugh, writing in 1946, acknowledged Orwell's high moral sense and respect for justice but believed "he seems never to have been touched at any point by a conception of religious thought and life." 
His contradictory and sometimes ambiguous views about the social benefits of religious affiliation mirrored the dichotomies between his public and private lives: Stephen Ingle wrote that it was as if the writer George Orwell "vaunted" his unbelief while Eric Blair the individual retained "a deeply ingrained religiosity".

Politics

Orwell liked to provoke arguments by challenging the status quo, but he was also a traditionalist with a love of old English values. He criticised and satirised, from the inside, the various social milieux in which he found himself—provincial town life in A Clergyman's Daughter; middle-class pretension in Keep the Aspidistra Flying; preparatory schools in "Such, Such Were the Joys"; and some socialist groups in The Road to Wigan Pier. In his Adelphi days, he described himself as a "Tory-anarchist". Of colonialism in Burmese Days, he portrays the English colonists as a "dull, decent people, cherishing and fortifying their dullness behind a quarter of a million bayonets."

In 1928, Orwell began his career as a professional writer in Paris at a journal owned by the French Communist Henri Barbusse. His first article, "La Censure en Angleterre" ("Censorship in England"), was an attempt to account for the "extraordinary and illogical" moral censorship of plays and novels then practised in Britain. His own explanation was that the rise of the "puritan middle class", who had stricter morals than the aristocracy, tightened the rules of censorship in the 19th century. Orwell's first published article in his home country, "A Farthing Newspaper", was a critique of the new French daily the Ami du Peuple. This paper was sold much more cheaply than most others, and was intended for ordinary people to read. Orwell pointed out that its proprietor François Coty also owned the right-wing dailies Le Figaro and Le Gaulois, which the Ami du Peuple was supposedly competing against.
Orwell suggested that cheap newspapers were no more than a vehicle for advertising and anti-leftist propaganda, and predicted the world might soon see free newspapers which would drive legitimate dailies out of business. Writing for Le Progrès Civique, Orwell described the British colonial government in Burma and India:

Spanish Civil War and socialism

The Spanish Civil War played the most important part in defining Orwell's socialism. He wrote to Cyril Connolly from Barcelona on 8 June 1937: "I have seen wonderful things and at last really believe in Socialism, which I never did before." Having witnessed the success of the anarcho-syndicalist communities, for example in Anarchist Catalonia, and the subsequent brutal suppression of the anarcho-syndicalists, anti-Stalin communist parties and revolutionaries by the Soviet Union-backed Communists, Orwell returned from Catalonia a staunch anti-Stalinist and joined the British Independent Labour Party, his card being issued on 13 June 1938. Although he was never a Trotskyist, he was strongly influenced by the Trotskyist and anarchist critiques of the Soviet regime, and by the anarchists' emphasis on individual freedom. In Part 2 of The Road to Wigan Pier, published by the Left Book Club, Orwell stated that "a real Socialist is one who wishes—not merely conceives it as desirable, but actively wishes—to see tyranny overthrown". Orwell stated in "Why I Write" (1946): "Every line of serious work that I have written since 1936 has been written, directly or indirectly, against totalitarianism and for democratic socialism, as I understand it." Orwell's conception of socialism was of a planned economy alongside democracy, which was the common notion of socialism in the early and middle 20th century. Orwell's emphasis on "democracy" primarily referred to a strong emphasis on civil liberties within a socialist economy as opposed to majoritarian rule, though he was not necessarily opposed to majority rule.
Orwell was a proponent of a federal socialist Europe, a position outlined in his 1947 essay "Toward European Unity", which first appeared in Partisan Review. According to biographer John Newsinger:

In his 1938 essay "Why I joined the Independent Labour Party", published in the ILP-affiliated New Leader, Orwell wrote:

Towards the end of the essay, he wrote: "I do not mean I have lost all faith in the Labour Party. My most earnest hope is that the Labour Party will win a clear majority in the next General Election."

The Second World War

Orwell was opposed to rearmament against Nazi Germany and at the time of the Munich Agreement he signed a manifesto entitled "If War Comes We Shall Resist"—but he changed his view after the Molotov–Ribbentrop Pact and the outbreak of the war. He left the ILP because of its opposition to the war and adopted a political position of "revolutionary patriotism". On 21 March 1940 he wrote a review of Adolf Hitler's Mein Kampf for The New English Weekly, in which he analysed the dictator's psychology. According to Orwell "a thing that strikes one is the rigidity of his mind, the way in which his world-view doesn't develop. It is the fixed vision of a monomaniac and not likely to be much affected by the temporary manoeuvres of power politics". Asking "how was it that he was able to put [his] monstrous vision across?", Orwell tried to understand why Hitler was worshipped by the German people: "The situation in Germany, with its seven million unemployed, was obviously favourable for demagogues. But Hitler could not have succeeded against his many rivals if it had not been for the attraction of his own personality, which one can feel even in the clumsy writing of Mein Kampf, and which is no doubt overwhelming when one hears his speeches…The fact is that there is something deeply appealing about him. The initial, personal cause of his grievance against the universe can only be guessed at; but at any rate the grievance is here.
He is the martyr, the victim, Prometheus chained to the rock, the self-sacrificing hero who fights single-handed against impossible odds. If he were killing a mouse he would know how to make it seem like a dragon." In December 1940 he wrote in Tribune (the Labour left's weekly): "We are in a strange period of history in which a revolutionary has to be a patriot and a patriot has to be a revolutionary." During the war, Orwell was highly critical of the popular idea that an Anglo-Soviet alliance would be the basis of a post-war world of peace and prosperity. In 1942, commenting on London Times editor E. H. Carr's pro-Soviet views, Orwell stated that "all the appeasers, e.g. Professor E.H. Carr, have switched their allegiance from Hitler to Stalin". In his reply (dated 15 November 1943) to an invitation from the Duchess of Atholl to speak for the British League for European Freedom, he stated that he did not agree with their objectives. He admitted that what they said was "more truthful than the lying propaganda found in most of the press", but added that he could not "associate himself with an essentially Conservative body" that claimed to "defend democracy in Europe" but had "nothing to say about British imperialism". His closing paragraph stated: "I belong to the Left and must work inside it, much as I hate Russian totalitarianism and its poisonous influence in this country."

Tribune and post-war Britain

Orwell joined the staff of Tribune magazine as literary editor, and from then until his death, was a left-wing (though hardly orthodox) Labour-supporting democratic socialist. On 1 September 1944, writing about the Warsaw uprising, Orwell expressed in Tribune his hostility towards the influence of the alliance with the USSR over the Allies: "Do remember that dishonesty and cowardice always have to be paid for.
Do not imagine that for years on end you can make yourself the boot-licking propagandist of the sovietic regime, or any other regime, and then suddenly return to honesty and reason. Once a whore, always a whore." According to Newsinger, although Orwell "was always critical of the 1945–51 Labour government's moderation, his support for it began to pull him to the right politically. This did not lead him to embrace conservatism, imperialism or reaction, but to defend, albeit critically, Labour reformism." Between 1945 and 1947, with A. J. Ayer and Bertrand Russell, he contributed a series of articles and essays to Polemic, a short-lived British "Magazine of Philosophy, Psychology, and Aesthetics" edited by the ex-Communist Humphrey Slater (Stefan Collini, Absent Minds: Intellectuals in Britain, Oxford University Press, 2006).

Writing in early 1945 a long essay titled "Antisemitism in Britain", for the Contemporary Jewish Record, Orwell stated that antisemitism was on the increase in Britain and that it was "irrational and will not yield to arguments". He argued that it would be useful to discover why anti-Semites could "swallow such absurdities on one particular subject while remaining sane on others". He wrote: "For quite six years the English admirers of Hitler contrived not to learn of the existence of Dachau and Buchenwald. ... Many English people have heard almost nothing about the extermination of German and Polish Jews during the present war. Their own anti-Semitism has caused this vast crime to bounce off their consciousness." In Nineteen Eighty-Four, written shortly after the war, Orwell portrayed the Party as enlisting anti-Semitic passions against their enemy, Goldstein. Orwell publicly defended P. G. Wodehouse against charges of being a Nazi sympathiser—occasioned by his agreement to do some broadcasts over the German radio in 1941—a defence based on Wodehouse's lack of interest in and ignorance of politics.
Special Branch, the intelligence division of the Metropolitan Police, maintained a file on Orwell for more than 20 years of his life. The dossier, published by The National Archives, states that, according to one investigator, Orwell had "advanced Communist views and several of his Indian friends say that they have often seen him at Communist meetings". MI5, the intelligence department of the Home Office, noted: "It is evident from his recent writings—'The Lion and the Unicorn'—and his contribution to Gollancz's symposium The Betrayal of the Left that he does not hold with the Communist Party nor they with him."

Sexuality

Sexual politics plays an important role in Nineteen Eighty-Four. In the novel, people's intimate relationships are strictly governed by the Party's Junior Anti-Sex League, which opposes sexual relations and instead encourages artificial insemination. Personally, Orwell disliked what he saw as misguided middle-class revolutionary emancipatory views, expressing disdain for "every fruit-juice drinker, nudist, sandal-wearer, sex-maniac". Orwell was also openly against homosexuality, at a time when such prejudice was common. Speaking at the 2003 George Orwell Centenary Conference, Daphne Patai said: "Of course he was homophobic. That has nothing to do with his relations with his homosexual friends. Certainly, he had a negative attitude and a certain kind of anxiety, a denigrating attitude towards homosexuality. That is definitely the case. I think his writing reflects that quite fully."

Orwell used the homophobic epithets "nancy" and "pansy", as in his expressions of contempt for what he called the "pansy Left", and "nancy poets", i.e. left-wing homosexual or bisexual writers and intellectuals such as Stephen Spender and W. H. Auden.
The protagonist of Keep the Aspidistra Flying, Gordon Comstock, conducts an internal critique of his customers when working in a bookshop, and there is an extended passage of several pages in which he concentrates on a homosexual male customer, and sneers at him for his "nancy" characteristics, including a lisp, which he identifies in detail, with some disgust. Stephen Spender "thought Orwell's occasional homophobic outbursts were part of his rebellion against the public school".

Biographies of Orwell

Orwell's will requested that no biography of him be written, and his widow, Sonia Brownell, repelled every attempt by those who tried to persuade her to let them write about him. Various recollections and interpretations were published in the 1950s and 1960s, but Sonia saw the 1968 Collected Works as the record of his life. She did appoint Malcolm Muggeridge as official biographer, but later biographers have seen this as deliberate spoiling as Muggeridge eventually gave up the work. In 1972, two American authors, Peter Stansky and
English, and History. He passed the entrance exam, coming seventh out of the 26 candidates who exceeded the pass mark.

Policing in Burma

Blair's maternal grandmother lived at Moulmein, so he chose a posting in Burma, then still a province of British India. In October 1922 he sailed on board SS Herefordshire via the Suez Canal and Ceylon to join the Indian Imperial Police in Burma. A month later, he arrived at Rangoon and travelled to the police training school in Mandalay. He was appointed an Assistant District Superintendent (on probation) on 29 November 1922, with effect from 27 November and at the pay of Rs. 525 per month. After a short posting at Maymyo, Burma's principal hill station, he was posted to the frontier outpost of Myaungmya in the Irrawaddy Delta at the beginning of 1924. Working as an imperial police officer gave him considerable responsibility while most of his contemporaries were still at university in England. When he was posted farther east in the Delta to Twante as a sub-divisional officer, he was responsible for the security of some 200,000 people. At the end of 1924, he was posted to Syriam, closer to Rangoon. Syriam had the refinery of the Burmah Oil Company, "the surrounding land a barren waste, all vegetation killed off by the fumes of sulphur dioxide pouring out day and night from the stacks of the refinery." But the town was near Rangoon, a cosmopolitan seaport, and Blair went into the city as often as he could, "to browse in a bookshop; to eat well-cooked food; to get away from the boring routine of police life". In September 1925 he went to Insein, the home of Insein Prison, the second largest prison in Burma. In Insein, he had "long talks on every conceivable subject" with Elisa Maria Langford-Rae (who later married Kazi Lhendup Dorjee). She noted his "sense of utter fairness in minutest details". By this time, Blair had completed his training and was receiving a monthly salary of Rs. 740, including allowances.
In Burma, Blair acquired a reputation as an outsider. He spent much of his time alone, reading or pursuing non-pukka activities, such as attending the churches of the Karen ethnic group. A colleague, Roger Beadon, recalled (in a 1969 recording for the BBC) that Blair was fast to learn the language and that before he left Burma, "was able to speak fluently with Burmese priests in 'very high-flown Burmese'." Blair made changes to his appearance in Burma that remained for the rest of his life, including adopting a pencil moustache. Emma Larkin writes in the introduction to Burmese Days, "While in Burma, he acquired a moustache similar to those worn by officers of the British regiments stationed there. [He] also acquired some tattoos; on each knuckle he had a small untidy blue circle. Many Burmese living in rural areas still sport tattoos like this—they are believed to protect against bullets and snake bites." In April 1926 he moved to Moulmein, where his maternal grandmother lived. At the end of that year, he was assigned to Katha in Upper Burma, where he contracted dengue fever in 1927. Entitled to a leave in England that year, he was allowed to return in July due to his illness. While on leave in England and on holiday with his family in Cornwall in September 1927, he reappraised his life. Deciding against returning to Burma, he resigned from the Indian Imperial Police to become a writer, with effect from 12 March 1928 after five-and-a-half years of service. He drew on his experiences in the Burma police for the novel Burmese Days (1934) and the essays "A Hanging" (1931) and "Shooting an Elephant" (1936).

London and Paris

In England, he settled back in the family home at Southwold, renewing acquaintance with local friends and attending an Old Etonian dinner. He visited his old tutor Gow at Cambridge for advice on becoming a writer. In 1927 he moved to London.
Ruth Pitter, a family acquaintance, helped him find lodgings, and by the end of 1927 he had moved into rooms in Portobello Road; a blue plaque commemorates his residence there. Pitter's involvement in the move "would have lent it a reassuring respectability in Mrs. Blair's eyes." Pitter had a sympathetic interest in Blair's writing, pointed out weaknesses in his poetry, and advised him to write about what he knew. In fact he decided to write of "certain aspects of the present that he set out to know" and ventured into the East End of London—the first of the occasional sorties he would make to discover for himself the world of poverty and the down-and-outers who inhabit it. He had found a subject. These sorties, explorations, expeditions, tours or immersions were made intermittently over a period of five years. In imitation of Jack London, whose writing he admired (particularly The People of the Abyss), Blair started to explore the poorer parts of London. On his first outing he set out to Limehouse Causeway, spending his first night in a common lodging house, possibly George Levy's "kip". For a while he "went native" in his own country, dressing like a tramp, adopting the name P.S. Burton and making no concessions to middle-class mores and expectations; he recorded his experiences of the low life for use in "The Spike", his first published essay in English, and in the second half of his first book, Down and Out in Paris and London (1933). In early 1928 he moved to Paris. He lived in the rue du Pot de Fer, a working class district in the 5th Arrondissement. His aunt Nellie Limouzin also lived in Paris and gave him social and, when necessary, financial support. He began to write novels, including an early version of Burmese Days, but nothing else survives from that period. 
He was more successful as a journalist and published articles in Monde, a political/literary journal edited by Henri Barbusse (his first article as a professional writer, "La Censure en Angleterre", appeared in that journal on 6 October 1928); G. K.'s Weekly, where his first article to appear in England, "A Farthing Newspaper", was printed on 29 December 1928; and Le Progrès Civique (founded by the left-wing coalition Le Cartel des Gauches). Three pieces appeared in successive weeks in Le Progrès Civique: discussing unemployment, a day in the life of a tramp, and the beggars of London, respectively. "In one or another of its destructive forms, poverty was to become his obsessive subject—at the heart of almost everything he wrote until Homage to Catalonia." He fell seriously ill in February 1929 and was taken to the Hôpital Cochin in the 14th arrondissement, a free hospital where medical students were trained. His experiences there were the basis of his essay "How the Poor Die", published in 1946. He chose not to identify the hospital, and indeed was deliberately misleading about its location. Shortly afterwards, he had all his money stolen from his lodging house. Whether through necessity or to collect material, he undertook menial jobs such as dishwashing in a fashionable hotel on the rue de Rivoli, which he later described in Down and Out in Paris and London. In August 1929, he sent a copy of "The Spike" to John Middleton Murry's New Adelphi magazine in London. The magazine was edited by Max Plowman and Sir Richard Rees, and Plowman accepted the work for publication.

Southwold

In December 1929, after nearly two years in Paris, Blair returned to England and went directly to his parents' house in Southwold, a coastal town in Suffolk, which remained his base for the next five years. The family was well established in the town, and his sister Avril was running a tea-house there.
He became acquainted with many local people, including Brenda Salkeld, the clergyman's daughter who worked as a gym-teacher at St Felix Girls' School in the town. Although Salkeld rejected his offer of marriage, she remained a friend and regular correspondent for many years. He also renewed friendships with older friends, such as Dennis Collings, whose girlfriend Eleanor Jacques was also to play a part in his life. In early 1930 he stayed briefly in Bramley, Leeds, with his sister Marjorie and her husband Humphrey Dakin, who was as unappreciative of Blair as when they knew each other as children. Blair was writing reviews for Adelphi and acting as a private tutor to a disabled child at Southwold. He then became tutor to three young brothers, one of whom, Richard Peters, later became a distinguished academic. "His history in these years is marked by dualities and contrasts. There is Blair leading a respectable, outwardly eventless life at his parents' house in Southwold, writing; then in contrast, there is Blair as Burton (the name he used in his down-and-out episodes) in search of experience in the kips and spikes, in the East End, on the road, and in the hop fields of Kent." He went painting and bathing on the beach, and there he met Mabel and Francis Fierz, who later influenced his career. Over the next year he visited them in London, often meeting their friend Max Plowman. He also often stayed at the homes of Ruth Pitter and Richard Rees, where he could "change" for his sporadic tramping expeditions. One of his jobs was domestic work at a lodgings for half a crown (two shillings and sixpence, or one-eighth of a pound) a day. Blair now contributed regularly to Adelphi, with "A Hanging" appearing in August 1931. From August to September 1931 his explorations of poverty continued, and, like the protagonist of A Clergyman's Daughter, he followed the East End tradition of working in the Kent hop fields. He kept a diary about his experiences there. 
Afterwards, he lodged in the Tooley Street kip, but could not stand it for long, and with financial help from his parents moved to Windsor Street, where he stayed until Christmas. "Hop Picking", by Eric Blair, appeared in the October 1931 issue of New Statesman, whose editorial staff included his old friend Cyril Connolly. Mabel Fierz put him in contact with Leonard Moore, who became his literary agent in April 1932. At this time Jonathan Cape rejected A Scullion's Diary, the first version of Down and Out. On the advice of Richard Rees, he offered it to Faber and Faber, but their editorial director, T. S. Eliot, also rejected it. Blair ended the year by deliberately getting himself arrested, so that he could experience Christmas in prison, but after he was picked up and taken to Bethnal Green police station in the East End of London the authorities did not regard his "drunk and disorderly" behaviour as imprisonable, and after two days in a cell he returned home to Southwold.

Teaching career

In April 1932 Blair became a teacher at The Hawthorns High School, a school for boys, in Hayes, West London. This was a small school offering private schooling for children of local tradesmen and shopkeepers, and had only 14 or 16 boys aged between ten and sixteen, and one other master. While at the school he became friendly with the curate of the local parish church and became involved with activities there. Mabel Fierz had pursued matters with Moore, and at the end of June 1932, Moore told Blair that Victor Gollancz was prepared to publish A Scullion's Diary for a £40 advance, through his recently founded publishing house, Victor Gollancz Ltd, which was an outlet for radical and socialist works. At the end of the summer term in 1932, Blair returned to Southwold, where his parents had used a legacy to buy their own home. Blair and his sister Avril spent the holidays making the house habitable while he also worked on Burmese Days.
He was also spending time with Eleanor Jacques, but her attachment to Dennis Collings remained an obstacle to his hopes of a more serious relationship. "Clink", an essay describing his failed attempt to get sent to prison, appeared in the August 1932 number of Adelphi. He returned to teaching at Hayes and prepared for the publication of his book, now known as Down and Out in Paris and London. He wished to publish under a different name to avoid any embarrassment to his family over his time as a "tramp". In a letter to Moore (dated 15 November 1932), he left the choice of pseudonym to Moore and to Gollancz. Four days later, he wrote to Moore, suggesting the pseudonyms P. S. Burton (a name he used when tramping), Kenneth Miles, George Orwell, and H. Lewis Allways. He finally adopted the nom de plume George Orwell because "It is a good round English name." The name George was inspired by the patron saint of England, and Orwell after the River Orwell in Suffolk which was one of Orwell's favourite locations. Down and Out in Paris and London was published by Victor Gollancz in London on 9 January 1933 and received favourable reviews, with Cecil Day-Lewis complimenting Orwell's "clarity and good sense", and The Times Literary Supplement comparing Orwell's eccentric characters to the characters of Dickens. Down and Out was modestly successful and was next published by Harper & Brothers in New York. In mid-1933 Blair left Hawthorns to become a teacher at Frays College, in Uxbridge, west London. This was a much larger establishment with 200 pupils and a full complement of staff. He acquired a motorcycle and took trips through the surrounding countryside. On one of these expeditions he became soaked and caught a chill that developed into pneumonia. He was taken to a cottage hospital in Uxbridge, where for a time his life was believed to be in danger. 
When he was discharged in January 1934, he returned to Southwold to convalesce and, supported by his parents, never returned to teaching. He was disappointed when Gollancz turned down Burmese Days, mainly on the grounds of potential suits for libel, but Harper were prepared to publish it in the United States. Meanwhile, Blair started work on the novel A Clergyman's Daughter, drawing upon his life as a teacher and on life in Southwold. Eleanor Jacques was now married and had gone to Singapore and Brenda Salkeld had left for Ireland, so Blair was relatively isolated in Southwold—working on the allotments, walking alone and spending time with his father. Eventually in October, after sending A Clergyman's Daughter to Moore, he left for London to take a job that had been found for him by his aunt Nellie Limouzin.

Hampstead

This job was as a part-time assistant in Booklovers' Corner, a second-hand bookshop in Hampstead run by Francis and Myfanwy Westrope, who were friends of Nellie Limouzin in the Esperanto movement. The Westropes were friendly and provided him with comfortable accommodation at Warwick Mansions, Pond Street. He was sharing the job with Jon Kimche, who also lived with the Westropes. Blair worked at the shop in the afternoons and had his mornings free to write and his evenings free to socialise. These experiences provided background for the novel Keep the Aspidistra Flying (1936). As well as the various guests of the Westropes, he was able to enjoy the company of Richard Rees and the Adelphi writers and Mabel Fierz. The Westropes and Kimche were members of the Independent Labour Party, although at this time Blair was not seriously politically active. He was writing for the Adelphi and preparing A Clergyman's Daughter and Burmese Days for publication. At the beginning of 1935 he had to move out of Warwick Mansions, and Mabel Fierz found him a flat in Parliament Hill. A Clergyman's Daughter was published on 11 March 1935.
In early 1935 Blair met his future wife Eileen O'Shaughnessy, when his landlady, Rosalind Obermeyer, who was studying for a master's degree in psychology at University College London, invited some of her fellow students to a party. One of these students, Elizaveta Fen, a biographer and future translator of Chekhov, recalled Blair and his friend Richard Rees "draped" at the fireplace, looking, she thought, "moth-eaten and prematurely aged." Around this time, Blair had started to write reviews for The New English Weekly. In June, Burmese Days was published and Cyril Connolly's positive review in the New Statesman prompted Blair to re-establish contact with his old friend. In August, he moved into a flat, at 50 Lawford Road, Kentish Town, which he shared with Michael Sayers and Rayner Heppenstall. The relationship was sometimes awkward and Blair and Heppenstall even came to blows, though they remained friends and later worked together on BBC broadcasts. Blair was now working on Keep the Aspidistra Flying, and also tried unsuccessfully to write a serial for the News Chronicle. By October 1935 his flatmates had moved out and he was struggling to pay the rent on his own. He remained until the end of January 1936, when he stopped working at Booklovers' Corner. In 1980, English Heritage honoured Orwell with a blue plaque at his Kentish Town residence.

The Road to Wigan Pier

At this time, Victor Gollancz suggested Orwell spend a short time investigating social conditions in economically depressed Northern England. Two years earlier, J. B. Priestley had written about England north of the Trent, sparking an interest in reportage. The depression had also introduced a number of working-class writers from the North of England to the reading public. It was one of these working-class authors, Jack Hilton, whom Orwell sought for advice. Orwell had written to Hilton seeking lodging and asking for recommendations on his route.
Hilton was unable to provide him lodging, but suggested that he travel to Wigan rather than Rochdale, "for there are the colliers and they're good stuff." On 31 January 1936, Orwell set out by public transport and on foot, reaching Manchester via Coventry, Stafford, the Potteries and Macclesfield. Arriving in Manchester after the banks had closed, he had to stay in a common lodging-house. The next day he picked up a list of contacts sent by Richard Rees. One of these, the trade union official Frank Meade, suggested Wigan, where Orwell spent February staying in dirty lodgings over a tripe shop. At Wigan, he visited many homes to see how people lived, took detailed notes of housing conditions and wages earned, went down Bryn Hall coal mine, and used the local public library to consult public health records and reports on working conditions in mines. During this time, he was distracted by concerns about style and possible libel in Keep the Aspidistra Flying. He made a quick visit to Liverpool and during March, stayed in south Yorkshire, spending time in Sheffield and Barnsley. As well as visiting mines, including Grimethorpe, and observing social conditions, he attended meetings of the Communist Party and of Oswald Mosley ("his speech the usual claptrap—The blame for everything was put upon mysterious international gangs of Jews") where he saw the tactics of the Blackshirts ("...one is liable to get both a hammering and a fine for asking a question which Mosley finds it difficult to answer."). He also made visits to his sister at Headingley, during which he visited the Brontë Parsonage at Haworth, where he was "chiefly impressed by a pair of Charlotte Brontë's cloth-topped boots, very small, with square toes and lacing up at the sides." Orwell needed somewhere he could concentrate on writing his book, and once again help was provided by Aunt Nellie, who was living at Wallington, Hertfordshire in a very small 16th-century cottage called the "Stores". 
Wallington was a tiny village 35 miles north of London, and the cottage had almost no modern facilities. Orwell took over the tenancy and moved in on 2 April 1936. He started work on The Road to Wigan Pier by the end of April, but also spent hours working on the garden and testing the possibility of reopening the Stores as a village shop. Keep the Aspidistra Flying was published by Gollancz on 20 April 1936. On 4 August, Orwell gave a talk at the Adelphi Summer School held at Langham, entitled An Outsider Sees the Distressed Areas; others who spoke at the school included John Strachey, Max Plowman, Karl Polanyi and Reinhold Niebuhr. The result of his journeys through the north was The Road to Wigan Pier, published by Gollancz for the Left Book Club in 1937. The first half of the book documents his social investigations of Lancashire and Yorkshire, including an evocative description of working life in the coal mines. The second half is a long essay on his upbringing and the development of his political conscience, which includes an argument for socialism (although he goes to lengths to balance the concerns and goals of socialism with the barriers it faced from the movement's own advocates at the time, such as "priggish" and "dull" socialist intellectuals and "proletarian" socialists with little grasp of the actual ideology). Gollancz feared the second half would offend readers and added a disculpatory preface to the book while Orwell was in Spain. Orwell's research for The Road to Wigan Pier led to him being placed under surveillance by the Special Branch from 1936, for 12 years, until one year before the publication of Nineteen Eighty-Four. Orwell married Eileen O'Shaughnessy on 9 June 1936. Shortly afterwards, the political crisis began in Spain and Orwell followed developments there closely. 
At the end of the year, concerned by Francisco Franco's military uprising (supported by Nazi Germany, Fascist Italy and local groups such as Falange), Orwell decided to go to Spain to take part in the Spanish Civil War on the Republican side. Under the erroneous impression that he needed papers from some left-wing organisation to cross the frontier, on John Strachey's recommendation he applied unsuccessfully to Harry Pollitt, leader of the British Communist Party. Pollitt was suspicious of Orwell's political reliability; he asked him whether he would undertake to join the International Brigade and advised him to get a safe-conduct from the Spanish Embassy in Paris. Not wishing to commit himself until he had seen the situation in situ, Orwell instead used his Independent Labour Party contacts to get a letter of introduction to John McNair in Barcelona.

Spanish Civil War

Orwell set out for Spain on about 23 December 1936, dining with Henry Miller in Paris on the way. Miller told Orwell that going to fight in the Civil War out of some sense of obligation or guilt was "sheer stupidity" and that the Englishman's ideas "about combating Fascism, defending democracy, etc., etc., were all baloney". A few days later in Barcelona, Orwell met John McNair of the Independent Labour Party (ILP) Office, who later quoted him as saying: "I've come to fight against Fascism"; but if someone had asked him what he was fighting for, "I should have answered: 'Common decency'". Orwell stepped into a complex political situation in Catalonia. The Republican government was supported by a number of factions with conflicting aims, including the Workers' Party of Marxist Unification (POUM – Partido Obrero de Unificación Marxista), the anarcho-syndicalist Confederación Nacional del Trabajo (CNT) and the Unified Socialist Party of Catalonia (a wing of the Spanish Communist Party, which was backed by Soviet arms and aid).
Orwell was at first exasperated by this "kaleidoscope" of political parties and trade unions, "with their tiresome names". The ILP was linked to the POUM so Orwell joined the POUM. After a time at the Lenin Barracks in Barcelona he was sent to the relatively quiet Aragon Front under Georges Kopp. By January 1937 he was at Alcubierre, well above sea level, in the depth of winter. There was very little military action and Orwell was shocked by the lack of munitions, food and firewood as well as other extreme deprivations. With his Cadet Corps and police training, Orwell was quickly made a corporal. On the arrival of a British ILP Contingent about three weeks later, Orwell and the other English militiaman, Williams, were sent with them to Monte Oscuro. The newly arrived ILP contingent included Bob Smillie, Bob Edwards, Stafford Cottman and Jack Branthwaite. The unit was then sent on to Huesca. Meanwhile, back in England, Eileen had been handling the issues relating to the publication of The Road to Wigan Pier before setting out for Spain herself, leaving Nellie Limouzin to look after The Stores. Eileen volunteered for a post in John McNair's office and with the help of Georges Kopp paid visits to her husband, bringing him English tea, chocolate and cigars. Orwell had to spend some days in hospital with a poisoned hand and had most of his possessions stolen by the staff. He returned to the front and saw some action in a night attack on the Nationalist trenches where he chased an enemy soldier with a bayonet and bombed an enemy rifle position. In April, Orwell returned to Barcelona. Wanting to be sent to the Madrid front, which meant he "must join the International Column", he approached a Communist friend attached to the Spanish Medical Aid and explained his case. "Although he did not think much of the Communists, Orwell was still ready to treat them as friends and allies. That would soon change."
This was the time of the Barcelona May Days and Orwell was caught up in the factional fighting. He spent much of the time on a roof, with a stack of novels, but encountered Jon Kimche from his Hampstead days during the stay. The subsequent campaign of lies and distortion carried out by the Communist press, in which the POUM was accused of collaborating with the fascists, had a dramatic effect on Orwell. Instead of joining the International Brigades as he had intended, he decided to return to the Aragon Front. Once the May fighting was over, he was approached by a Communist friend who asked if he still intended transferring to the International Brigades. Orwell expressed surprise that they should still want him, because according to the Communist press he was a fascist. "No one who was in Barcelona then, or for months later, will forget the horrible atmosphere produced by fear, suspicion, hatred, censored newspapers, crammed jails, enormous food queues and prowling gangs of armed men." After his return to the front, he was wounded in the throat by a sniper's bullet. At 6 ft 2 in (1.88 m), Orwell was considerably taller than the Spanish fighters and had been warned against standing against the trench parapet. Unable to speak, and with blood pouring from his mouth, Orwell was carried on a stretcher to Siétamo, loaded on an ambulance and after a bumpy journey via Barbastro arrived at the hospital in Lleida. He recovered sufficiently to get up and on 27 May 1937 was sent on to Tarragona and two days later to a POUM sanatorium in the suburbs of Barcelona. The bullet had missed his main artery by the barest margin and his voice was barely audible. It had been such a clean shot that the wound immediately went through the process of cauterisation. He received electrotherapy treatment and was declared medically unfit for service. 
By the middle of June the political situation in Barcelona had deteriorated and the POUM—painted by the pro-Soviet Communists as a Trotskyist organisation—was outlawed and under attack. The Communist line was that the POUM were "objectively" Fascist, hindering the Republican cause. "A particularly nasty poster appeared, showing a head with a POUM mask being ripped off to reveal a Swastika-covered face beneath." Members, including Kopp, were arrested and others were in hiding. Orwell and his wife were under threat and had to lie low, although they broke cover to try to help Kopp. Finally with their passports in order, they escaped from Spain by train, diverting to Banyuls-sur-Mer for a short stay before returning to England. In the first week of July 1937 Orwell arrived back at Wallington; on 13 July 1937 a deposition was presented to the Tribunal for Espionage & High Treason in Valencia, charging the Orwells with "rabid Trotskyism", and being agents of the POUM. The trial of the leaders of the POUM and of Orwell (in his absence) took place in Barcelona in October and November 1938. Observing events from French Morocco, Orwell wrote that they were "only a by-product of the Russian Trotskyist trials and from the start every kind of lie, including flagrant absurdities, has been circulated in the Communist press." Orwell's experiences in the Spanish Civil War gave rise to Homage to Catalonia (1938). In his book, The International Brigades: Fascism, Freedom and the Spanish Civil War, Giles Tremlett writes that according to Soviet files, Orwell and his wife Eileen were spied on in Barcelona in May 1937. "The papers are documentary evidence that not only Orwell, but also his wife Eileen, were being watched closely".

Rest and recuperation

Orwell returned to England in June 1937, and stayed at the O'Shaughnessy home at Greenwich. He found his views on the Spanish Civil War out of favour. Kingsley Martin rejected two of his works and Gollancz was equally cautious.
At the same time, the communist Daily Worker was running an attack on The Road to Wigan Pier, taking out of context Orwell writing that "the working classes smell"; a letter to Gollancz from Orwell threatening libel action brought a stop to this. Orwell was also able to find a more sympathetic publisher for his views in Fredric Warburg of Secker & Warburg. Orwell returned to Wallington, which he found in disarray after his absence. He acquired goats, a cockerel (rooster) he called Henry Ford and a poodle puppy he called Marx; and settled down to animal husbandry and writing Homage to Catalonia. There were thoughts of going to India to work on The Pioneer, a newspaper in Lucknow, but by March 1938 Orwell's health had deteriorated. He was admitted to Preston Hall Sanatorium at Aylesford, Kent, a British Legion hospital for ex-servicemen to which his brother-in-law Laurence O'Shaughnessy was attached. He was thought initially to be suffering from tuberculosis and stayed in the sanatorium until September. A stream of visitors came to see him, including Common, Heppenstall, Plowman and Cyril Connolly. Connolly brought with him Stephen Spender, a cause of some embarrassment as Orwell had referred to Spender as a "pansy friend" some time earlier. Homage to Catalonia was published by Secker & Warburg and was a commercial flop. In the latter part of his stay at the clinic, Orwell was able to go for walks in the countryside and study nature. The novelist L. H. Myers secretly funded a trip to French Morocco for half a year for Orwell to avoid the English winter and recover his health. The Orwells set out in September 1938 via Gibraltar and Tangier to avoid Spanish Morocco and arrived at Marrakech. They rented a villa on the road to Casablanca and during that time Orwell wrote Coming Up for Air. They arrived back in England on 30 March 1939 and Coming Up for Air was published in June. 
Orwell spent time in Wallington and Southwold working on a Dickens essay and it was in June 1939 that Orwell's father, Richard Blair, died.

Second World War and Animal Farm

At the outbreak of the Second World War, Orwell's wife Eileen started working in the Censorship Department of the Ministry of Information in central London, staying during the week with her family in Greenwich. Orwell also submitted his name to the Central Register for war work, but nothing transpired. "They won't have me in the army, at any rate at present, because of my lungs", Orwell told Geoffrey Gorer. He returned to Wallington, and in late 1939 he wrote material for his first collection of essays, Inside the Whale. For the next year he was occupied writing reviews for plays, films and books for The Listener, Time and Tide and New Adelphi. On 29 March 1940 his long association with Tribune began with a review of a sergeant's account of Napoleon's retreat from Moscow. At the beginning of 1940, the first edition of Connolly's Horizon appeared, and this provided a new outlet for Orwell's work as well as new literary contacts. In May the Orwells took lease of a flat in London at Dorset Chambers, Chagford Street, Marylebone. It was the time of the Dunkirk evacuation and the death in France of Eileen's brother Lawrence caused her considerable grief and long-term depression. Throughout this period Orwell kept a wartime diary. Orwell was declared "unfit for any kind of military service" by the Medical Board in June, but soon afterwards found an opportunity to become involved in war activities by joining the Home Guard. He shared Tom Wintringham's socialist vision for the Home Guard as a revolutionary People's Militia. His lecture notes for instructing platoon members include advice on street fighting, field fortifications, and the use of mortars of various kinds. Sergeant Orwell managed to recruit Fredric Warburg to his unit.
During the Battle of Britain he used to spend weekends with Warburg and his new Zionist friend, Tosco Fyvel, at Warburg's house at Twyford, Berkshire. At Wallington he worked on "England Your England" and in London wrote reviews for various periodicals. Visiting Eileen's family in Greenwich brought him face-to-face with the effects of the Blitz on East London. In mid-1940, Warburg, Fyvel and Orwell planned Searchlight Books. Eleven volumes eventually appeared, of which Orwell's The Lion and the Unicorn: Socialism and the English Genius, published on 19 February 1941, was the first. Early in 1941 he began to write for the American Partisan Review which linked Orwell with The New York Intellectuals who were also anti-Stalinist, and contributed to the Gollancz anthology The Betrayal of the Left, written in the light of the Molotov–Ribbentrop Pact (although Orwell referred to it as the Russo-German Pact and the Hitler-Stalin Pact). He also applied unsuccessfully for a job at the Air Ministry. Meanwhile, he was still writing reviews of books and plays and at this time met the novelist Anthony Powell. He also took part in a few radio broadcasts for the Eastern Service of the BBC. In March the Orwells moved to a seventh-floor flat at Langford Court, St John's Wood, while at Wallington Orwell was "digging for victory" by planting potatoes. In August 1941, Orwell finally obtained "war work" when he was taken on full-time by the BBC's Eastern Service. When interviewed for the job he indicated that he "accept[ed] absolutely the need for propaganda to be directed by the government" and stressed his view that, in wartime, discipline in the execution of government policy was essential. He supervised cultural broadcasts to India to counter propaganda from Nazi Germany designed to undermine imperial links. This was Orwell's first experience of the rigid conformity of life in an office, and it gave him an opportunity to create cultural programmes with contributions from T. S. 
Eliot, Dylan Thomas, E. M. Forster, Ahmed Ali, Mulk Raj Anand, and William Empson among others. At the end of August he had a dinner with H. G. Wells which degenerated into a row because Wells had taken offence at observations Orwell made about him in a Horizon article. In October Orwell had a bout of bronchitis and the illness recurred frequently. David Astor was looking for a provocative contributor for The Observer and invited Orwell to write for him—the first article appearing in March 1942. In early 1942 Eileen changed jobs to work at the Ministry of Food and in mid-1942 the Orwells moved to a larger flat, a ground floor and basement, 10a Mortimer Crescent in Maida Vale/Kilburn—"the kind of lower-middle-class ambience that Orwell thought was London at its best." Around the same time Orwell's mother and sister Avril, who had found work in a sheet-metal factory behind King's Cross Station, moved into a flat close to George and Eileen. At the BBC, Orwell introduced Voice, a literary programme for his Indian broadcasts, and by now was leading an active social life with literary friends, particularly on the political left. Late in 1942, he started writing regularly for the left-wing weekly Tribune directed by Labour MPs Aneurin Bevan and George Strauss. In March 1943, Orwell's mother died, and around the same time he told Moore he was starting work on a new book, which turned out to be Animal Farm. In September 1943, Orwell resigned from the BBC post that he had occupied for two years. His resignation followed a report confirming his fears that few Indians listened to the broadcasts, but he was also keen to concentrate on writing Animal Farm. Just six days before his last day of service, on 24 November 1943, his adaptation of the fairy tale, Hans Christian Andersen's The Emperor's New Clothes was broadcast. It was a genre in which he was greatly interested and which appeared on Animal Farm's title-page.
At this time he also resigned from the Home Guard on medical grounds. In November 1943, Orwell was appointed literary editor at Tribune, where his assistant was his old friend Jon Kimche. Orwell was on staff until early 1945, writing over 80 book reviews and on 3 December 1943 started his regular personal column, "As I Please", usually addressing three or four subjects in each. He was still writing reviews for other magazines, including Partisan Review, Horizon, and the New York Nation and becoming a respected pundit among left-wing circles but also a close friend of people on the right such as Powell, Astor and Malcolm Muggeridge. By April 1944 Animal Farm was ready for publication. Gollancz refused to publish it, considering it an attack on the Soviet regime which was a crucial ally in the war. A similar fate was met from other publishers (including T. S. Eliot at Faber and Faber) until Jonathan Cape agreed to take it. In May the Orwells had the opportunity to adopt a child, thanks to the contacts of Eileen's sister Gwen O'Shaughnessy, then a doctor in Newcastle upon Tyne. In June a V-1 flying bomb struck Mortimer Crescent and the Orwells had to find somewhere else to live. Orwell had to scrabble around in the rubble for his collection of books, which he had finally managed to transfer from Wallington, carting them away in a wheelbarrow. Another blow was Cape's reversal of his plan to publish Animal Farm. The decision followed his personal visit to Peter Smollett, an official at the Ministry of Information. Smollett was later identified as a Soviet agent. The Orwells spent some time in the North East, near Carlton, County Durham, dealing with matters in the adoption of a boy whom they named Richard Horatio Blair. By September 1944 they had set up home in Islington, at 27b Canonbury Square. Baby Richard joined them there, and Eileen gave up her work at the Ministry of Food to look after her family. 
Secker & Warburg had agreed to publish Animal Farm, planned for the following March, although it did not appear in print until August 1945. By February 1945 David Astor had invited Orwell to become a war correspondent for The Observer. Orwell had been looking for the opportunity throughout the war, but his failed medical reports prevented him from being allowed anywhere near action. He went first to liberated Paris and then to Germany and Austria, to such cities as Cologne and Stuttgart. He was never in the front line and was never under fire, but he followed the troops closely, "sometimes entering a captured town within a day of its fall while dead bodies lay in the streets." Some of his reports were published in the Manchester Evening News. It was while he was there that Eileen went into hospital for a hysterectomy and died under anaesthetic on 29 March 1945. She had not given Orwell much notice about this operation because of worries about the cost and because she expected to make a speedy recovery. Orwell returned home for a while and then went back to Europe. He returned finally to London to cover the 1945 general election at the beginning of July. Animal Farm: A Fairy Story was published in Britain on 17 August 1945, and a year later in the US, on 26 August 1946.

Jura and Nineteen Eighty-Four

Animal Farm had particular resonance in the post-war climate and its worldwide success made Orwell a sought-after figure. For the next four years, Orwell mixed journalistic work—mainly for Tribune, The Observer and the Manchester Evening News, though he also contributed to many small-circulation political and literary magazines—with writing his best-known work, Nineteen Eighty-Four, which was published in 1949. He was a leading figure in the so-called Shanghai Club (named after a restaurant in Soho) of left-leaning and émigré journalists, among them E. H. Carr, Sebastian Haffner, Isaac Deutscher, Barbara Ward and Jon Kimche.
In the year following Eileen's death he published around 130 articles and a selection of his Critical Essays, while remaining active in various political lobbying campaigns. He employed a housekeeper, Susan Watson, to look after his adopted son at the Islington flat, which visitors now described as "bleak". In September he spent a fortnight on the island of Jura in the Inner Hebrides and saw it as a place to escape from the hassle of London literary life. David Astor was instrumental in arranging a place for Orwell on Jura. Astor's family owned Scottish estates in the area and a fellow Old Etonian, Robin Fletcher, had a property on the island. In late 1945 and early 1946 Orwell made several hopeless and unwelcome marriage proposals to younger women, including Celia Kirwan (who later became Arthur Koestler's sister-in-law); Ann Popham who happened to live in the same block of flats; and Sonia Brownell, one of Connolly's coterie at the Horizon office. Orwell suffered a tubercular haemorrhage in February 1946 but disguised his illness. In 1945
this, they are rare, with groups living in separate patches of suitable habitat, separated by miles of unsuitable flora. In the wet season, their diet includes fruit, insects, spiders, lizards, frogs, and snakes. In the dry season, they feed on fungi, the only tropical primates known to depend on this source of food. They live in small social groups (approximately six individuals) that stay within a few feet of one another most of the time, staying in contact via high-pitched calls. They are also known to form polyspecific groups with tamarins such as the white-lipped tamarin and brown-mantled tamarin. This is perhaps because Goeldi's marmosets are not known to have the X-linked polymorphism which enables some individuals of other New World monkey species to see in full tri-chromatic vision. The species takes its name from its discoverer, the Swiss naturalist Emil August Goeldi.
sometimes has red, white, or silvery brown highlights. Their bodies are about long, and their tails are about long. Goeldi's marmoset was first described in 1904, making Callimico one of the more recent monkey genera to be described. In older classification schemes it was sometimes placed in its own family Callimiconidae and sometimes, along with the marmosets and tamarins, in the subfamily Callitrichinae in the family Cebidae. More recently, Callitrichinae has been (re-)elevated to family status as Callitrichidae. Females reach sexual maturity at 8.5 months, males at 16.5 months. The gestation period lasts from 140 to 180 days. Unlike other New World monkeys, they have the capacity to give birth twice a year. The mother carries a single baby monkey per pregnancy, whereas most other species in the family Callitrichidae usually give birth to twins. For the first 2–3 weeks the mother acts as the primary caregiver until the father takes over most of the responsibilities except for nursing. The infant is weaned after about 65 days. Females outnumber males by 2 to 1. The life expectancy in captivity is about 10 years. Goeldi's marmosets prefer to forage in dense scrubby undergrowth; perhaps
of value ("the stakes") on an event with an uncertain outcome with the intent of winning something else of value. Gambling thus requires three elements to be present: consideration (an amount wagered), risk (chance), and a prize. The outcome of the wager is often immediate, such as a single roll of dice, a spin of a roulette wheel, or a horse crossing the finish line, but longer time frames are also common, allowing wagers on the outcome of a future sports contest or even an entire sports season. The term "gaming" in this context typically refers to instances in which the activity has been specifically permitted by law. The two words are not mutually exclusive; i.e., a "gaming" company offers (legal) "gambling" activities to the public and may be regulated by one of many gaming control boards, for example, the Nevada Gaming Control Board. However, this distinction is not universally observed in the English-speaking world. For instance, in the United Kingdom, the regulator of gambling activities is called the Gambling Commission (not the Gaming Commission). The word gaming is used more frequently since the rise of computer and video games to describe activities that do not necessarily involve wagering, especially online gaming, with the new usage still not having displaced the old usage as the primary definition in common dictionaries. "Gaming" has also been used to circumvent laws against "gambling". The media and others have used one term or the other to frame conversations around the subjects, resulting in a shift of perceptions among their audiences. Gambling is also a major international commercial activity, with the legal gambling market totaling an estimated $335 billion in 2009. In other forms, gambling can be conducted with materials that have a value, but are not real money. 
For example, players of marbles games might wager marbles, and likewise games of Pogs or Magic: The Gathering can be played with the collectible game pieces (respectively, small discs and trading cards) as stakes, resulting in a meta-game regarding the value of a player's collection of pieces.

History

Gambling dates back to the Paleolithic period, before written history. In Mesopotamia the earliest six-sided dice date to about 3000 BC. However, they were based on astragali dating back thousands of years earlier. In China, gambling houses were widespread in the first millennium BC, and betting on fighting animals was common. Lotto games and dominoes (precursors of Pai Gow) appeared in China as early as the 10th century. Playing cards appeared in the 9th century AD in China. Records trace gambling in Japan back at least as far as the 14th century. Poker, the most popular U.S. card game associated with gambling, derives from the Persian game As-Nas, dating back to the 17th century. The first known casino, the Ridotto, started operating in 1638 in Venice, Italy.

Great Britain

Gambling has been a main recreational activity in Great Britain for centuries. Horseracing has been a favorite theme for over three centuries. It has been heavily regulated. Historically much of the opposition comes from evangelical Protestants, and from social reformers.

United States

Gambling has been a popular activity in the United States for centuries. It has also been suppressed by law in many areas for almost as long. By the early 20th century, gambling was almost uniformly outlawed throughout the U.S. and thus became a largely illegal activity, helping to spur the growth of the mafia and other criminal organizations. The late 20th century saw a softening in attitudes towards gambling and a relaxation of laws against it.

Regulation

Many jurisdictions, local as well as national, either ban gambling or heavily control it by licensing the vendors.
Such regulation generally leads to gambling tourism and illegal gambling in the areas where it is not allowed. The involvement of governments, through regulation and taxation, has led to a close connection between many governments and gaming organizations, where legal gambling provides significant government revenue, such as in Monaco and Macau, China. There is generally legislation requiring that gaming devices be statistically random, to prevent manufacturers from making some high-payoff results impossible. Since these high payoffs have very low probability, a house bias can quite easily be missed unless the devices are checked carefully. Most jurisdictions that allow gambling require participants to be above a certain age. In some jurisdictions, the gambling age differs depending on the type of gambling. For example, in many American states one must be over 21 to enter a casino, but may buy a lottery ticket after turning 18. Insurance Because contracts of insurance have many features in common with wagers, insurance contracts are often distinguished in law as agreements in which either party has an interest in the "bet-upon" outcome beyond the specific financial terms. e.g.: a "bet" with an insurer on whether one's house will burn down is not gambling, but rather insurance – as the homeowner has an obvious interest in the continued existence of his/her home independent of the purely financial aspects of the "bet" (i.e. the insurance policy). Nonetheless, both insurance and gambling contracts are typically considered aleatory contracts under most legal systems, though they are subject to different types of regulation. Asset recovery Under common law, particularly English Law (English unjust enrichment), a gambling contract may not give a casino bona fide purchaser status, permitting the recovery of stolen funds in some situations. 
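On the statistical-randomness requirement mentioned above: a house bias hidden in a rare high-payoff result is genuinely hard to detect, as a quick simulation suggests (a hypothetical sketch; the probabilities and trial counts are invented for illustration):

```python
import random

def observed_jackpot_rate(p_actual, trials, seed=0):
    """Simulate `trials` plays of a device whose true jackpot probability
    is p_actual and return the observed hit frequency."""
    rng = random.Random(seed)
    hits = sum(rng.random() < p_actual for _ in range(trials))
    return hits / trials

# A jackpot advertised at 1 in 10,000 but secretly paying 1 in 20,000:
# over 50,000 plays the expected hit counts are only 5 vs. 2.5, so
# ordinary sampling noise can easily mask the halved probability, and a
# reliable test needs far more plays than a casual inspection would use.
rate = observed_jackpot_rate(1 / 20_000, trials=50_000)
```

With so few expected hits, even a large relative discrepancy in the observed rate falls within normal variation, which is why regulators test devices over very large samples.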
In Lipkin Gorman v Karpnale Ltd, where a solicitor used stolen funds to gamble at a casino, the House of Lords overruled the High Court's previous verdict, adjudicating that the casino return the stolen funds less those subject to any change of position defence. U.S. Law precedents are somewhat similar. For case law on recovery of gambling losses where the loser had stolen the funds see "Rights of owner of stolen money as against one who won it in gambling transaction from thief". An interesting question is what happens when the person trying to make recovery is the gambler's spouse, and the money or property lost was either the spouse's, or was community property. This was a minor plot point in a Perry Mason novel, The Case of the Singing Skirt, and it cites an actual case Novo v. Hotel Del Rio. Religious views Hinduism Ancient Hindu poems like the Gambler's Lament and the Mahabharata testify to the popularity of gambling among ancient Indians. However, the text Arthashastra (c. 4th century BC) recommends taxation and control of gambling. Judaism Ancient Jewish authorities frowned on gambling, even disqualifying professional gamblers from testifying in court. Christianity Catholicism The Catholic Church holds the position that there is no moral impediment to gambling, so long as it is fair, all bettors have a reasonable chance of winning, there is no fraud involved, and the parties involved do not have actual knowledge of the outcome of the bet (unless they have disclosed this knowledge), and as long as the following conditions are met: the gambler can afford to lose the bet, and stops when the limit is reached, and the motivation is entertainment and not personal gain leading to the "love of money" or making a living. 
In general, Catholic bishops have opposed casino gambling on the grounds that it too often tempts people into problem gambling or addiction, and has particularly negative effects on poor people; they sometimes also cite secondary effects such as increases in loan sharking, prostitution, corruption, and general public immorality. Some parish pastors have also opposed casinos for the additional reason that they would take customers away from church bingo and annual festivals where games such as blackjack, roulette, craps, and poker are used for fundraising. St. Thomas Aquinas wrote that gambling should be especially forbidden where the losing bettor is underage or otherwise not able to consent to the transaction. Gambling has often been seen as having social consequences, as satirized by Balzac. For these social and religious reasons, most legal jurisdictions limit gambling, as advocated by Pascal. Protestantism Gambling views among Protestants vary, with some either discouraging or forbidding their members from participation in gambling. Methodists, in accordance with the doctrine of outward holiness, oppose gambling which they believe is a sin that feeds on greed; examples are the United Methodist Church, the Free Methodist Church, the Evangelical Wesleyan Church, the Salvation Army, and the Church of the Nazarene. Other Protestants that oppose gambling include many Mennonites, Quakers, the Christian Reformed Church in North America, the Church of the Lutheran Confession, the Southern Baptist Convention, the Assemblies of God, and the Seventh-day Adventist Church.
Other Christian denominations Other churches that oppose gambling include the Jehovah's Witnesses, The Church of Jesus Christ of Latter-day Saints, the Iglesia ni Cristo, and the Members Church of God International. Islam Although different interpretations of Shari‘ah (Islamic Law) exist in the Muslim world, there is a consensus among the ‘Ulema’ (scholars of Islam) that gambling is haraam (sinful or forbidden). In assertions made during its prohibition, Muslim jurists describe gambling as being both un-Qur’anic, and as being generally harmful to the Muslim Ummah (community). The Arabic terminology for gambling is Maisir.
In parts of the world that implement full Shari‘ah, such as Aceh, punishments for Muslim gamblers can range up to 12 lashes or a one-year prison term and a fine for those who provide a venue for such practises. Some Islamic nations prohibit gambling; most other countries regulate it. Bahá'í Faith According to the Most Holy Book, paragraph 155, gambling is forbidden. Types Casino games While almost any game can be played for money, and any game typically played for money can also be played just for fun, some games are generally offered in a casino setting. Table games Electronic gaming Online roulette Pachinko Sic Bo Slot machine Video poker Video bingo Other gambling Bingo Keno Non-casino games Gambling games that take place outside of casinos include bingo (as played in the US and UK), dead pool, lotteries, pull-tab games and scratchcards, and Mahjong. Other non-casino gambling games include: Non-casino card games, including historical games like Basset, Ecarté, Lansquenet and Put. Technically, a gambling card game is one in which the cards are not actually played but simply bet on. Carnival Games such as The Razzle or Hanky Pank Coin-tossing games such as Head and Tail, Two-up* Confidence tricks such as Three-card Monte or the Shell game Dice-based games, such as Backgammon, Liar's dice, Passe-dix, Hazard, Threes, Pig, or Mexico (or Perudo); *Although coin tossing is not usually played in a casino, it has been known to be an official gambling game in some Australian casinos Fixed-odds betting Fixed-odds betting and Parimutuel betting frequently occur at many types of sporting events, and political elections. In addition many bookmakers offer fixed odds on a number of non-sports related outcomes, for example the direction and extent of movement of various financial indices, the winner of television competitions such as Big Brother, and election results.
Interactive prediction markets also offer trading on these outcomes, with "shares" of results trading on an open market. Parimutuel betting One of the most widespread forms of gambling involves betting on horse or greyhound racing. Wagering may take place through parimutuel pools, or bookmakers may take bets personally. Parimutuel wagers pay off at prices determined by support in the wagering pools, while bookmakers pay off either at the odds offered at the time of accepting the bet; or at the median odds offered by track bookmakers at the time the race started. Sports betting Betting on team sports has become an important service industry in many countries. For example, millions of people play the football pools every week in the United Kingdom. In addition to organized sports betting, both legal and illegal, there are many side-betting games played by casual groups of spectators, such as NCAA Basketball Tournament Bracket Pools, Super Bowl Squares, Fantasy Sports Leagues with monetary entry fees and winnings, and in-person spectator games like Moundball. Virtual sports Modeled on sports betting, virtual sports are fantasy sports events generated by software that can be bet on at any time, without depending on external factors such as weather conditions. Arbitrage betting Arbitrage betting is a theoretically risk-free betting system in which every outcome of an event is bet upon so that a known profit will be made by the bettor upon completion of the event regardless of the outcome. Arbitrage betting is a combination of the ancient art of arbitrage trading and gambling, which has been made possible by the large numbers of bookmakers in the marketplace, creating occasional opportunities for arbitrage. Other types of betting One can also bet with another person that a statement is true or false, or that a specified event will happen (a "back bet") or will not happen (a "lay bet") within a specified time.
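The arbitrage betting described above can be made concrete: with decimal odds, a risk-free book exists when the inverses of the best available odds on all outcomes sum to less than 1, and staking in proportion to those inverses equalizes the payout across outcomes (a hedged illustration; the odds below are invented):

```python
def arbitrage_stakes(odds, bankroll):
    """Given decimal odds covering every mutually exclusive outcome of an
    event (typically the best price from different bookmakers), return
    per-outcome stakes yielding the same total payout whichever outcome
    occurs, or None if no arbitrage opportunity exists."""
    margin = sum(1 / o for o in odds)
    if margin >= 1:
        return None  # the combined book still favours the bookmakers
    return [bankroll * (1 / o) / margin for o in odds]

# Two bookmakers each quote 2.10 on opposite sides of a two-outcome event:
stakes = arbitrage_stakes([2.10, 2.10], bankroll=100)
# Both stakes come to 50, and either outcome pays 50 * 2.10 = 105: a
# guaranteed 5% return on the 100 staked, regardless of the result.
```

In practice such windows are small and short-lived, which is why the text describes the opportunities as occasional.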
This occurs in particular when two people have opposing but strongly held views on truth or events. Not only do the parties hope to gain from the bet, they place the bet also to demonstrate their certainty about the issue. Some means of determining the issue at stake must exist. Sometimes the amount bet remains nominal, demonstrating the outcome as one of principle rather than of financial importance. Betting exchanges allow consumers to both back and lay at odds of their choice. Similar in some ways to a stock exchange, a bettor may want to back a horse (hoping it will win) or lay a horse (hoping it will lose, effectively acting as bookmaker). Spread betting allows gamblers to wager on the outcome of an event where the pay-off is based on the accuracy of the wager, rather than a simple "win or lose" outcome. For example, a wager can be based on when a point is scored in the game, in minutes, and each minute away from the prediction increases or reduces the payout. Staking systems Many betting systems have been created in an attempt to "beat the house" but no system can make a mathematically unprofitable bet in terms of expected value profitable over time. Widely used systems include: Card counting – Many systems exist for blackjack to keep track of the ratio of ten values to all others; when this ratio is high the player has an advantage and should increase the amount of their bets. Keeping track of cards dealt confers an advantage in other games as well. Due-column betting – A variation on fixed profits betting in which the bettor sets a target profit and then calculates a bet size that will make this profit, adding any losses to the target.
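The claim that no staking system can turn a negative-expectation bet profitable can be checked exactly for a small case. The sketch below enumerates every win/loss path of a martingale (double the stake after each loss, reset after a win) on an even-money bet (an illustrative calculation, not from the source):

```python
from fractions import Fraction
from itertools import product

def martingale_ev(p_win, rounds, base=1):
    """Exact expected profit of a martingale over a fixed number of
    even-money bets, computed by enumerating every win/loss path."""
    p = Fraction(p_win)
    ev = Fraction(0)
    for path in product([True, False], repeat=rounds):
        prob, profit, stake = Fraction(1), Fraction(0), Fraction(base)
        for win in path:
            if win:
                prob *= p
                profit += stake
                stake = Fraction(base)   # reset the stake after a win
            else:
                prob *= 1 - p
                profit -= stake
                stake *= 2               # double the stake after a loss
        ev += prob * profit
    return ev

# A fair coin has zero expectation, and no staking pattern changes that:
assert martingale_ev(Fraction(1, 2), rounds=5) == 0
# An even-money roulette bet (18/37 win chance) stays negative:
assert martingale_ev(Fraction(18, 37), rounds=5) < 0
```

The expected profit works out to the per-bet edge times the expected total amount staked, so a negative edge stays negative no matter how the stakes are scheduled.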
it is presumed that each player acts simultaneously or, at least, without knowing the actions of the other. If players have some information about the choices of other players, the game is usually presented in extensive form. Every extensive-form game has an equivalent normal-form game; however, the transformation to normal form may result in an exponential blowup in the size of the representation, making it computationally impractical. Characteristic function form In games that possess transferable utility, separate rewards are not given; rather, the characteristic function decides the payoff of each coalition, with the empty coalition receiving nothing. The origin of this form is to be found in John von Neumann and Oskar Morgenstern's book; when looking at these instances, they assumed that when a coalition forms, it plays against the complementary coalition as if a two-player game were being played. Although there are differing methods for deriving coalitional values from normal-form games, not every characteristic-function game can be derived from a normal-form game in this way. Formally, a characteristic function game is written as (N, v), where N represents the set of players and v is a function assigning a value to each coalition. Such characteristic functions have been expanded to describe games where there is no transferable utility. Alternative game representations Alternative game representation forms are used for some subclasses of games or adjusted to the needs of interdisciplinary research. In addition to classical game representations, some of the alternative representations also encode time related aspects. General and applied uses As a method of applied mathematics, game theory has been used to study a wide variety of human and animal behaviors. It was initially developed in economics to understand a large collection of economic behaviors, including behaviors of firms, markets, and consumers.
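The characteristic-function form (N, v) described above can be represented directly as a map from coalitions to values. The sketch below (illustrative numbers, not from the source) checks the standard superadditivity property v(S ∪ T) ≥ v(S) + v(T) for disjoint coalitions:

```python
from itertools import chain, combinations

def coalitions(players):
    """All subsets of the player set, as frozensets."""
    return [frozenset(c) for r in range(len(players) + 1)
            for c in combinations(players, r)]

def is_superadditive(players, v):
    """Check v(S | T) >= v(S) + v(T) for all disjoint coalitions S, T."""
    for S in coalitions(players):
        for T in coalitions(players):
            if S & T:
                continue  # only disjoint coalitions are compared
            if v[S | T] < v[S] + v[T]:
                return False
    return True

# A three-player game (N, v): players earn nothing alone, any pair
# earns 1, and the grand coalition earns 3.
players = ("a", "b", "c")
v = {S: (0 if len(S) <= 1 else 1 if len(S) == 2 else 3)
     for S in coalitions(players)}
assert is_superadditive(players, v)
```

Superadditivity formalizes the intuition that merging coalitions can never hurt, which is why forming the grand coalition is usually taken as the baseline in such games.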
The first use of game-theoretic analysis was by Antoine Augustin Cournot in 1838 with his solution of the Cournot duopoly. The use of game theory in the social sciences has expanded, and game theory has been applied to political, sociological, and psychological behaviors as well. Although pre-twentieth-century naturalists such as Charles Darwin made game-theoretic kinds of statements, the use of game-theoretic analysis in biology began with Ronald Fisher's studies of animal behavior during the 1930s. This work predates the name "game theory", but it shares many important features with this field. The developments in economics were later applied to biology largely by John Maynard Smith in his 1982 book Evolution and the Theory of Games. In addition to being used to describe, predict, and explain behavior, game theory has also been used to develop theories of ethical or normative behavior and to prescribe such behavior. In economics and philosophy, scholars have applied game theory to help in the understanding of good or proper behavior. Game-theoretic arguments of this type can be found as far back as Plato. An alternative version of game theory, called chemical game theory, represents the player's choices as metaphorical chemical reactant molecules called "knowlecules". Chemical game theory then calculates the outcomes as equilibrium solutions to a system of chemical reactions. Description and modeling The primary use of game theory is to describe and model how human populations behave. Some scholars believe that by finding the equilibria of games they can predict how actual human populations will behave when confronted with situations analogous to the game being studied. This particular view of game theory has been criticized. It is argued that the assumptions made by game theorists are often violated when applied to real-world situations. Game theorists usually assume players act rationally, but in practice, human behavior often deviates from this model. 
Game theorists respond by comparing their assumptions to those used in physics. Thus while their assumptions do not always hold, they can treat game theory as a reasonable scientific ideal akin to the models used by physicists. However, empirical work has shown that in some classic games, such as the centipede game, guess 2/3 of the average game, and the dictator game, people regularly do not play Nash equilibria. There is an ongoing debate regarding the importance of these experiments and whether the analysis of the experiments fully captures all aspects of the relevant situation. Some game theorists, following the work of John Maynard Smith and George R. Price, have turned to evolutionary game theory in order to resolve these issues. These models presume either no rationality or bounded rationality on the part of players. Despite the name, evolutionary game theory does not necessarily presume natural selection in the biological sense. Evolutionary game theory includes both biological as well as cultural evolution and also models of individual learning (for example, fictitious play dynamics). Prescriptive or normative analysis Some scholars see game theory not as a predictive tool for the behavior of human beings, but as a suggestion for how people ought to behave. Since a strategy corresponding to a Nash equilibrium of a game constitutes one's best response to the actions of the other players – provided they are in (the same) Nash equilibrium – playing a strategy that is part of a Nash equilibrium seems appropriate. This normative use of game theory has also come under criticism. Economics and business Game theory is a major method used in mathematical economics and business for modeling competing behaviors of interacting agents.
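The notion of a Nash equilibrium as mutual best responses, discussed above, can be checked mechanically for small games. A minimal sketch using the conventional prisoner's dilemma payoffs (the numbers are the textbook convention, not from the source):

```python
def nash_equilibria(payoffs_row, payoffs_col):
    """Enumerate pure-strategy Nash equilibria of a bimatrix game.
    payoffs_row[i][j] / payoffs_col[i][j] are the payoffs when the row
    player plays i and the column player plays j."""
    n, m = len(payoffs_row), len(payoffs_row[0])
    equilibria = []
    for i in range(n):
        for j in range(m):
            # i is a best response to j, and j is a best response to i?
            row_best = all(payoffs_row[i][j] >= payoffs_row[k][j] for k in range(n))
            col_best = all(payoffs_col[i][j] >= payoffs_col[i][l] for l in range(m))
            if row_best and col_best:
                equilibria.append((i, j))
    return equilibria

# Prisoner's dilemma: strategy 0 = cooperate, 1 = defect.
row = [[3, 0], [5, 1]]
col = [[3, 5], [0, 1]]
print(nash_equilibria(row, col))  # [(1, 1)]: mutual defection
```

Note that some games (e.g. matching pennies) have no pure-strategy equilibrium at all; finding mixed equilibria requires more machinery than this enumeration.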
Applications include a wide array of economic phenomena and approaches, such as auctions, bargaining, mergers and acquisitions pricing, fair division, duopolies, oligopolies, social network formation, agent-based computational economics, general equilibrium, mechanism design, and voting systems; and across such broad areas as experimental economics, behavioral economics, information economics, industrial organization, and political economy. This research usually focuses on particular sets of strategies known as "solution concepts" or "equilibria". A common assumption is that players act rationally. In non-cooperative games, the most famous of these is the Nash equilibrium. A set of strategies is a Nash equilibrium if each represents a best response to the other strategies. If all the players are playing the strategies in a Nash equilibrium, they have no unilateral incentive to deviate, since their strategy is the best they can do given what others are doing. The payoffs of the game are generally taken to represent the utility of individual players. A prototypical paper on game theory in economics begins by presenting a game that is an abstraction of a particular economic situation. One or more solution concepts are chosen, and the author demonstrates which strategy sets in the presented game are equilibria of the appropriate type. Economists and business professors suggest two primary uses (noted above): descriptive and prescriptive. The Chartered Institute of Procurement & Supply (CIPS) promotes knowledge and use of game theory within the context of business procurement. CIPS and TWS Partners have conducted a series of surveys designed to explore the understanding, awareness and application of game theory among procurement professionals. 
Some of the main findings in their third annual survey (2019) include: application of game theory to procurement activity has increased – at the time it was at 19% across all survey respondents; 65% of participants predict that use of game theory applications will grow; 70% of respondents say that they have "only a basic or a below basic understanding" of game theory; 20% of participants had undertaken on-the-job training in game theory; 50% of respondents said that new or improved software solutions were desirable; and 90% of respondents said that they do not have the software they need for their work. Project management Sensible decision-making is critical for the success of projects. In project management, game theory is used to model the decision-making process of players, such as investors, project managers, contractors, sub-contractors, governments and customers. Quite often, these players have competing interests, and sometimes their interests are directly detrimental to other players, making project management scenarios well-suited to be modeled by game theory. Piraveenan (2019) in his review provides several examples where game theory is used to model project management scenarios. For instance, an investor typically has several investment options, and each option will likely result in a different project, and thus one of the investment options has to be chosen before the project charter can be produced. Similarly, any large project involving subcontractors, for instance, a construction project, has a complex interplay between the main contractor (the project manager) and subcontractors, or among the subcontractors themselves, which typically has several decision points. For example, if there is an ambiguity in the contract between the contractor and subcontractor, each must decide how hard to push their case without jeopardizing the whole project, and thus their own stake in it.
Similarly, when projects from competing organizations are launched, the marketing personnel have to decide what is the best timing and strategy to market the project, or its resultant product or service, so that it can gain maximum traction in the face of competition. In each of these scenarios, the required decisions depend on the decisions of other players who, in some way, have competing interests to the interests of the decision-maker, and thus can ideally be modeled using game theory. Piraveenan summarises that two-player games are predominantly used to model project management scenarios, and based on the identity of these players, five distinct types of games are used in project management: government-sector–private-sector games (games that model public–private partnerships); contractor–contractor games; contractor–subcontractor games; subcontractor–subcontractor games; and games involving other players. In terms of types of games, both cooperative as well as non-cooperative, normal-form as well as extensive-form, and zero-sum as well as non-zero-sum are used to model various project management scenarios. Political science The application of game theory to political science is focused in the overlapping areas of fair division, political economy, public choice, war bargaining, positive political theory, and social choice theory. In each of these areas, researchers have developed game-theoretic models in which the players are often voters, states, special interest groups, and politicians. Early examples of game theory applied to political science are provided by Anthony Downs. In his 1957 book An Economic Theory of Democracy, he applies the Hotelling firm location model to the political process. In the Downsian model, political candidates commit to ideologies on a one-dimensional policy space.
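In the Downsian setup, positioning at the median voter's ideal point is optimal: if both candidates stand at the median, any unilateral move weakly loses votes. A small computation illustrates this (a hedged sketch with invented voter positions):

```python
import statistics

def vote_share(x, y, voters):
    """Share of voters closer to the candidate at x than to the candidate
    at y on a one-dimensional policy space (ties split evenly)."""
    share = 0.0
    for v in voters:
        dx, dy = abs(v - x), abs(v - y)
        share += 1.0 if dx < dy else 0.5 if dx == dy else 0.0
    return share / len(voters)

voters = [0.1, 0.2, 0.35, 0.5, 0.6, 0.8, 0.9]
m = statistics.median(voters)   # 0.5

# With the opponent at the median, no other position gains a majority:
assert all(vote_share(x, m, voters) <= vote_share(m, m, voters)
           for x in [0.0, 0.25, 0.4, 0.6, 1.0])
```

Every deviation from the median cedes the voters on the far side of the midpoint, which is the mechanism behind the convergence result.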
Downs first shows how the political candidates will converge to the ideology preferred by the median voter if voters are fully informed, but then argues that voters choose to remain rationally ignorant which allows for candidate divergence. Game theory was applied in 1962 to the Cuban Missile Crisis during the presidency of John F. Kennedy. It has also been proposed that game theory explains the stability of any form of political government. Taking the simplest case of a monarchy, for example, the king, being only one person, does not and cannot maintain his authority by personally exercising physical control over all or even any significant number of his subjects. Sovereign control is instead explained by the recognition by each citizen that all other citizens expect each other to view the king (or other established government) as the person whose orders will be followed. Coordinating communication among citizens to replace the sovereign is effectively barred, since conspiracy to replace the sovereign is generally punishable as a crime. Thus, in a process that can be modeled by variants of the prisoner's dilemma, during periods of stability no citizen will find it rational to move to replace the sovereign, even if all the citizens know they would be better off if they were all to act collectively. A game-theoretic explanation for democratic peace is that public and open debate in democracies sends clear and reliable information regarding their intentions to other states. In contrast, it is difficult to know the intentions of nondemocratic leaders, what effect concessions will have, and if promises will be kept. Thus there will be mistrust and unwillingness to make concessions if at least one of the parties in a dispute is a non-democracy. However, game theory predicts that two countries may still go to war even if their leaders are cognizant of the costs of fighting. 
War may result from asymmetric information; two countries may have incentives to misrepresent the amount of military resources they have on hand, rendering them unable to settle disputes agreeably without resorting to fighting. Moreover, war may arise because of commitment problems: if two countries wish to settle a dispute via peaceful means, but each wishes to go back on the terms of that settlement, they may have no choice but to resort to warfare. Finally, war may result from issue indivisibilities. Game theory could also help predict a nation's responses when there is a new rule or law to be applied to that nation. One example is Peter John Wood's (2013) research looking into what nations could do to help reduce climate change. Wood thought this could be accomplished by making treaties with other nations to reduce greenhouse gas emissions. However, he concluded that this idea could not work because it would create a prisoner's dilemma for the nations. Biology Unlike those in economics, the payoffs for games in biology are often interpreted as corresponding to fitness. In addition, the focus has been less on equilibria that correspond to a notion of rationality and more on ones that would be maintained by evolutionary forces. The best-known equilibrium in biology is known as the evolutionarily stable strategy (ESS), first introduced by John Maynard Smith and George R. Price in 1973. Although its initial motivation did not involve any of the mental requirements of the Nash equilibrium, every ESS is a Nash equilibrium. In biology, game theory has been used as a model to understand many different phenomena. It was first used to explain the evolution (and stability) of the approximate 1:1 sex ratios. Ronald Fisher suggested that the 1:1 sex ratios are a result of evolutionary forces acting on individuals who could be seen as trying to maximize their number of grandchildren. Additionally, biologists have used evolutionary game theory and the ESS to explain the emergence of animal communication.
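The ESS concept introduced above has a standard worked example in the Hawk–Dove game: when the contested resource V is worth less than the cost of escalated fighting C, the ESS is a mixed strategy playing Hawk with probability V/C, at which both pure strategies earn the same payoff against the population (a textbook sketch, not taken from the source):

```python
def hawk_dove_ess(V, C):
    """Mixed ESS hawk frequency for the Hawk-Dove game with V < C, using
    the standard payoffs: Hawk vs Hawk = (V - C)/2, Hawk vs Dove = V,
    Dove vs Hawk = 0, Dove vs Dove = V/2.  At the ESS both pure
    strategies earn the same expected payoff against the population."""
    assert 0 < V < C
    p = V / C                                  # hawk frequency at the ESS
    payoff_hawk = p * (V - C) / 2 + (1 - p) * V
    payoff_dove = p * 0 + (1 - p) * V / 2
    return p, payoff_hawk, payoff_dove

p, ph, pd = hawk_dove_ess(V=2, C=4)
# p = 0.5 and the two payoffs coincide, so neither pure strategy can
# invade a population playing the mix -- the defining ESS property.
```

The equal-payoff condition is exactly what makes the mix uninvadable: a rare mutant playing either pure strategy does no better than the resident population.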
The analysis of signaling games and other communication games has provided insight into the evolution of communication among animals. For example, the mobbing behavior of many species, in which a large number of prey animals attack a larger predator, seems to be an example of spontaneous emergent organization. Ants have also been shown to exhibit feed-forward behavior akin to fashion (see Paul Ormerod's Butterfly Economics). Biologists have used the game of chicken to analyze fighting behavior and territoriality. According to Maynard Smith, in the preface to Evolution and the Theory of Games, "paradoxically, it has turned out that game theory is more readily applied to biology than to the field of economic behaviour for which it was originally designed". Evolutionary game theory has been used to explain many seemingly incongruous phenomena in nature. One such phenomenon is known as biological altruism. This is a situation in which an organism appears to act in a way that benefits other organisms and is detrimental to itself. This is distinct from traditional notions of altruism because such actions are not conscious, but appear to be evolutionary adaptations to increase overall fitness. Examples can be found in species ranging from vampire bats that regurgitate blood they have obtained from a night's hunting and give it to group members who have failed to feed, to worker bees that care for the queen bee for their entire lives and never mate, to vervet monkeys that warn group members of a predator's approach, even when it endangers that individual's chance of survival. All of these actions increase the overall fitness of a group, but occur at a cost to the individual. Evolutionary game theory explains this altruism with the idea of kin selection. Altruists discriminate between the individuals they help and favor relatives. 
Hamilton's rule explains the evolutionary rationale behind this selection with the equation c < rb, where the cost c to the altruist must be less than the benefit b to the recipient multiplied by the coefficient of relatedness r. The more closely related two organisms are, the higher the incidence of altruism, because they share many of the same alleles. This means that the altruistic individual, by ensuring that the alleles of its close relative are passed on through survival of its offspring, can forgo the option of having offspring itself because the same number of alleles are passed on. For example, helping a sibling (in diploid animals) has a coefficient of 1/2, because (on average) an individual shares half of the alleles in its sibling's offspring. Ensuring that enough of a sibling's offspring survive to adulthood precludes the necessity of the altruistic individual producing offspring. The coefficient values depend heavily on the scope of the playing field; for example if the choice of whom to favor includes all genetic living things, not just all relatives, we assume the discrepancy between all humans only accounts for approximately 1% of the diversity in the playing field, a coefficient that was 1/2 in the smaller field becomes 0.995. Similarly if it is considered that information other than that of a genetic nature (e.g. epigenetics, religion, science, etc.) persisted through time the playing field becomes larger still, and the discrepancies smaller. Computer science and logic Game theory has come to play an increasingly important role in logic and in computer science. Several logical theories have a basis in game semantics. In addition, computer scientists have used games to model interactive computations. Also, game theory provides a theoretical basis to the field of multi-agent systems.
Separately, game theory has played a role in online algorithms; in particular, the k-server problem, which has in the past been referred to as games with moving costs and request-answer games. Yao's principle is a game-theoretic technique for proving lower bounds on the computational complexity of randomized algorithms, especially online algorithms. The emergence of the Internet has motivated the development of algorithms for finding equilibria in games, markets, computational auctions, peer-to-peer systems, and security and information markets. Algorithmic game theory and within it algorithmic mechanism design combine computational algorithm design and analysis of complex systems with economic theory. Philosophy Game theory has been put to several uses in philosophy. Responding to two papers by W.V.O. Quine, David Lewis (1969) used game theory to develop a philosophical account of convention. In so doing, he provided the first analysis of common knowledge and employed it in analyzing play in coordination games. In addition, he first suggested that one can understand meaning in terms of signaling games. This later suggestion has been pursued by several philosophers since Lewis. Following Lewis's game-theoretic account of conventions, Edna Ullmann-Margalit (1977) and Bicchieri (2006) have developed theories of social norms that define them as Nash equilibria that result from transforming a mixed-motive game into a coordination game. Game theory has also challenged philosophers to think in terms of interactive epistemology: what it means for a collective to have common beliefs or knowledge, and what are the consequences of this knowledge for the social outcomes resulting from the interactions of agents. Philosophers who have worked in this area include Bicchieri (1989, 1993), Skyrms (1990), and Stalnaker (1999). In ethics, some authors (most notably David Gauthier, Gregory Kavka, and Jean Hampton) have attempted to pursue Thomas Hobbes' project of deriving morality from self-interest.
Since games like the prisoner's dilemma present an apparent conflict between morality and self-interest, explaining why cooperation is required by self-interest is an important component of this project. This general strategy is a component of the general social contract view in political philosophy (for examples, see and ). Other authors have attempted to use evolutionary game theory in order to explain the emergence of human attitudes about morality and corresponding animal behaviors. These authors look at several games including the prisoner's dilemma, stag hunt, and the Nash bargaining game as providing an explanation for the emergence of attitudes about morality (see, e.g., and ). Retail and consumer product pricing Game theory applications are often used in the pricing strategies of retail and consumer markets, particularly for the sale of inelastic goods. With retailers constantly competing against one another for consumer market share, it has become a fairly common practice for retailers to discount certain goods, intermittently, in the hopes of increasing foot-traffic in brick and mortar locations (website visits for e-commerce retailers) or increasing sales of ancillary or complementary products. Black Friday, a popular shopping holiday in the US, is when many retailers focus on optimal pricing strategies to capture the holiday shopping market. In the Black Friday scenario, retailers using game theory applications typically ask "what is the dominant competitor's reaction to me?" In such a scenario, the game has two players: the retailer, and the consumer. The retailer is focused on an optimal pricing strategy, while the consumer is focused on the best deal. In this closed system, there often is no dominant strategy as both players have alternative options. That is, retailers can find a different customer, and consumers can shop at a different retailer.
Given the market competition that day, however, the dominant strategy for retailers lies in outperforming competitors. The open system assumes multiple retailers selling similar goods, and a finite number of consumers demanding the goods at an optimal price. A blog by a Cornell University professor provided an example of such a strategy, when Amazon priced a Samsung TV $100 below retail value, effectively undercutting competitors. Amazon made up part of the difference by increasing the price of HDMI cables, as it has been found that consumers are less price-sensitive when it comes to the sale of secondary items. Retail markets continue to evolve strategies and applications of game theory when it comes to pricing consumer goods. The key insights found between simulations in a controlled environment and real-world retail experiences show that the applications of such strategies are more complex, as each retailer has to find an optimal balance between pricing, supplier relations, brand image, and the potential to cannibalize the sale of more profitable items. Epidemiology Since the decision to take a vaccine for a particular disease is often made by individuals, who may consider a range of factors and parameters in making this decision (such as the incidence and prevalence of the disease, perceived and real risks associated with contracting the disease, mortality rate, perceived and real risks associated with vaccination, and financial cost of vaccination), game theory has been used to model and predict vaccination uptake in a society. In popular culture Based on the 1998 book by Sylvia Nasar, the life story of game theorist and mathematician John Nash was turned into the 2001 biopic A Beautiful Mind, starring Russell Crowe as Nash. The 1959 military science fiction novel Starship Troopers by Robert A. Heinlein mentioned "games theory" and "theory of games".
In the 1997 film of the same name, the character Carl Jenkins referred to his military intelligence assignment as being assigned to "games and theory". The 1964 film Dr. Strangelove satirizes game theoretic ideas about deterrence theory. For example, nuclear deterrence depends on the threat to retaliate catastrophically if a nuclear attack is detected. A game theorist might argue that such threats can fail to be credible, in the sense that they can lead to subgame imperfect equilibria. The movie takes this idea one step further, with the Soviet Union irrevocably committing to a catastrophic nuclear response without making the threat public. The 1980s power pop band Game Theory was founded by singer/songwriter Scott Miller, who described the band's name as alluding to "the study of calculating the most appropriate action given an adversary... to give yourself the minimum amount of failure". Liar Game, a 2005 Japanese manga and 2007 television series, presents the main characters in each episode with a game or problem that is typically drawn from game theory, as demonstrated by the strategies applied by the characters. The 1974 novel Spy Story by Len Deighton explores elements of
of their opponents. Negotiators may be unaware of their opponent's valuation of the object of negotiation, companies may be unaware of their opponent's cost functions, combatants may be unaware of their opponent's strengths, and jurors may be unaware of their colleague's interpretation of the evidence at trial. In some cases, participants may know the character of their opponent well, but may not know how well their opponent knows his or her own character. A Bayesian game is a strategic game with incomplete information. In such a game, the decision makers are players, and every player has a set of actions. A core part of the incomplete information specification is the set of states. Every state completely describes a collection of characteristics relevant to the player such as their preferences and details about them. There must be a state for every set of features that some player believes may exist. For example, consider a game in which Player 1 is unsure whether Player 2 would rather date her or get away from her, while Player 2 understands Player 1's preferences as before. To be specific, suppose that Player 1 believes that Player 2 wants to date her with probability 1/2 and to get away from her with probability 1/2 (this assessment probably comes from Player 1's experience: she faces players who want to date her half of the time and players who want to avoid her half of the time). Because of the probability involved, analyzing this situation requires understanding the players' preferences over such lotteries, even if one is interested only in pure-strategy equilibria. Games in which the difficulty of finding an optimal strategy stems from the multiplicity of possible moves are called combinatorial games. Examples include chess and Go. Games that involve imperfect information may also have a strong combinatorial character, for instance backgammon. There is no unified theory addressing combinatorial elements in games.
There are, however, mathematical tools that can solve particular problems and answer general questions. Games of perfect information have been studied in combinatorial game theory, which has developed novel representations, e.g. surreal numbers, as well as combinatorial and algebraic (and sometimes non-constructive) proof methods to solve games of certain types, including "loopy" games that may result in infinitely long sequences of moves. These methods address games with higher combinatorial complexity than those usually considered in traditional (or "economic") game theory. A typical game that has been solved this way is Hex. A related field of study, drawing from computational complexity theory, is game complexity, which is concerned with estimating the computational difficulty of finding optimal strategies. Research in artificial intelligence has addressed both perfect and imperfect information games that have very complex combinatorial structures (like chess, go, or backgammon) for which no provable optimal strategies have been found. The practical solutions involve computational heuristics, like alpha–beta pruning or use of artificial neural networks trained by reinforcement learning, which make games more tractable in computing practice. Infinitely long games Games, as studied by economists and real-world game players, are generally finished in finitely many moves. Pure mathematicians are not so constrained, and set theorists in particular study games that last for infinitely many moves, with the winner (or other payoff) not known until after all those moves are completed. The focus of attention is usually not so much on the best way to play such a game, but whether one player has a winning strategy. (It can be proven, using the axiom of choice, that there are games, even with perfect information and where the only outcomes are "win" or "lose", for which neither player has a winning strategy.)
The existence of such strategies, for cleverly designed games, has important consequences in descriptive set theory. Discrete and continuous games Much of game theory is concerned with finite, discrete games that have a finite number of players, moves, events, outcomes, etc. Many concepts can be extended, however. Continuous games allow players to choose a strategy from a continuous strategy set. For instance, Cournot competition is typically modeled with players' strategies being any non-negative quantities, including fractional quantities. Differential games Differential games such as the continuous pursuit and evasion game are continuous games where the evolution of the players' state variables is governed by differential equations. The problem of finding an optimal strategy in a differential game is closely related to the optimal control theory. In particular, there are two types of strategies: the open-loop strategies are found using the Pontryagin maximum principle while the closed-loop strategies are found using Bellman's Dynamic Programming method. A particular case of differential games are the games with a random time horizon. In such games, the terminal time is a random variable with a given probability distribution function. Therefore, the players maximize the mathematical expectation of the cost function. It was shown that the modified optimization problem can be reformulated as a discounted differential game over an infinite time interval. Evolutionary game theory Evolutionary game theory studies players who adjust their strategies over time according to rules that are not necessarily rational or farsighted. In general, the evolution of strategies over time according to such rules is modeled as a Markov chain with a state variable such as the current strategy profile or how the game has been played in the recent past. Such rules may feature imitation, optimization, or survival of the fittest. 
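Rules of this kind can be sketched concretely. The following toy replicator-style dynamic for a Hawk–Dove game (the payoff parameters V and C, the update step size, and the initial share are all illustrative assumptions, not taken from the text) tracks the population share of "Hawk" players as the state variable:

```python
# A toy replicator-style dynamic for the Hawk-Dove game. The state
# variable is the share p of "Hawk" players; each period the share
# grows or shrinks according to how Hawk's average payoff compares
# with the population-average payoff. All numbers are illustrative.
V, C = 4.0, 6.0  # value of the contested resource, cost of fighting

def payoff_hawk(p):
    # expected payoff of a Hawk against a population with Hawk share p
    return p * (V - C) / 2 + (1 - p) * V

def payoff_dove(p):
    # Doves split the resource with other Doves and cede it to Hawks
    return (1 - p) * V / 2

p = 0.1  # initial share of Hawks
for _ in range(10000):
    avg = p * payoff_hawk(p) + (1 - p) * payoff_dove(p)
    p += 0.01 * p * (payoff_hawk(p) - avg)  # discrete replicator update

print(round(p, 3))  # approaches V / C = 2/3
```

The share converges to V/C, the mixed evolutionarily stable strategy of the Hawk–Dove game, without any player being rational or farsighted.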
In biology, such models can represent evolution, in which offspring adopt their parents' strategies and parents who play more successful strategies (i.e. corresponding to higher payoffs) have a greater number of offspring. In the social sciences, such models typically represent strategic adjustment by players who play a game many times within their lifetime and, consciously or unconsciously, occasionally adjust their strategies. Stochastic outcomes (and relation to other fields) Individual decision problems with stochastic outcomes are sometimes considered "one-player games". These situations are not considered game theoretical by some authors. They may be modeled using similar tools within the related disciplines of decision theory, operations research, and areas of artificial intelligence, particularly AI planning (with uncertainty) and multi-agent systems. Although these fields may have different motivations, the mathematics involved are substantially the same, e.g. using Markov decision processes (MDP). Stochastic outcomes can also be modeled in terms of game theory by adding a randomly acting player who makes "chance moves" ("moves by nature"). This player is not typically considered a third player in what is otherwise a two-player game, but merely serves to provide a roll of the dice where required by the game. For some problems, different approaches to modeling stochastic outcomes may lead to different solutions. For example, the difference in approach between MDPs and the minimax solution is that the latter considers the worst case over a set of adversarial moves, rather than reasoning in expectation about these moves given a fixed probability distribution. The minimax approach may be advantageous where stochastic models of uncertainty are not available, but may also overestimate extremely unlikely (but costly) events, dramatically swaying the strategy in such scenarios if it is assumed that an adversary can force such an event to happen.
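The contrast can be made concrete with a minimal sketch (the scenario, payoffs, and probabilities below are illustrative assumptions):

```python
# Two ways to value the same action when its outcome depends on a
# "move by nature": reasoning in expectation over a fixed probability
# distribution (MDP-style) versus assuming the worst case (minimax).
outcomes = {"storm": -100.0, "calm": 10.0}  # payoffs for sailing today
probs    = {"storm": 0.01,   "calm": 0.99}  # a fixed stochastic model

expected = sum(probs[o] * outcomes[o] for o in outcomes)
worst    = min(outcomes.values())

print(round(expected, 2))  # 8.9    -> attractive on average
print(worst)               # -100.0 -> minimax treats the storm as adversarial
```

Under the fixed stochastic model the action looks attractive in expectation, while the minimax valuation is dominated by the rare catastrophic outcome; which valuation is appropriate depends on whether nature can plausibly be treated as an adversary.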
(See Black swan theory for more discussion on this kind of modeling issue, particularly as it relates to predicting and limiting losses in investment banking.) General models that include all elements of stochastic outcomes, adversaries, and partial or noisy observability (of moves by other players) have also been studied. The "gold standard" is considered to be partially observable stochastic game (POSG), but few realistic problems are computationally feasible in POSG representation. Metagames These are games the play of which is the development of the rules for another game, the target or subject game. Metagames seek to maximize the utility value of the rule set developed. The theory of metagames is related to mechanism design theory. The term metagame analysis is also used to refer to a practical approach developed by Nigel Howard, whereby a situation is framed as a strategic game in which stakeholders try to realize their objectives by means of the options available to them. Subsequent developments have led to the formulation of confrontation analysis. Pooling games These are games prevailing over all forms of society. Pooling games are repeated plays with a payoff table that changes, in general, over an experienced path, and their equilibrium strategies usually take the form of an evolutionary social convention and economic convention. Pooling game theory emerges to formally recognize the interaction between optimal choice in one play and the emergence of a forthcoming payoff table update path, to identify the invariance existence and robustness, and to predict variance over time. The theory is based upon topological transformation classification of payoff table updates over time to predict variance and invariance, and is also within the jurisdiction of the computational law of reachable optimality for ordered systems. Mean field game theory Mean field game theory is the study of strategic decision making in very large populations of small interacting agents.
This class of problems was considered in the economics literature by Boyan Jovanovic and Robert W. Rosenthal, in the engineering literature by Peter E. Caines, and by mathematician Pierre-Louis Lions and Jean-Michel Lasry. Representation of games The games studied in game theory are well-defined mathematical objects. To be fully defined, a game must specify the following elements: the players of the game, the information and actions available to each player at each decision point, and the payoffs for each outcome. (Eric Rasmusen refers to these four "essential elements" by the acronym "PAPI".) A game theorist typically uses these elements, along with a solution concept of their choosing, to deduce a set of equilibrium strategies for each player such that, when these strategies are employed, no player can profit by unilaterally deviating from their strategy. These equilibrium strategies determine an equilibrium to the game—a stable state in which either one outcome occurs or a set of outcomes occur with known probability. Most cooperative games are presented in the characteristic function form, while the extensive and the normal forms are used to define noncooperative games. Extensive form The extensive form can be used to formalize games with a time sequencing of moves. Games here are played on trees (as pictured here). Here each vertex (or node) represents a point of choice for a player. The player is specified by a number listed by the vertex. The lines out of the vertex represent a possible action for that player. The payoffs are specified at the bottom of the tree. The extensive form can be viewed as a multi-player generalization of a decision tree. To solve any extensive form game, backward induction must be used. 
It involves working backward up the game tree to determine what a rational player would do at the last vertex of the tree, what the player with the previous move would do given that the player with the last move is rational, and so on until the first vertex of the tree is reached. The game pictured consists of two players. The way this particular game is structured (i.e., with sequential decision making and perfect information), Player 1 "moves" first by choosing either "fair" or "unfair". Next in the sequence, Player 2, who has now seen Player 1's move, chooses one of her two available actions. Once Player 2 has made their choice, the game is considered finished and each player gets their respective payoff. Suppose, for instance, that play reaches the leaf at which Player 1 gets a payoff of "eight" (which in real-world terms can be interpreted in many ways, the simplest of which is in terms of money but could mean things such as eight days of vacation or eight countries conquered or even eight more opportunities to play the same game against other players) and Player 2 gets a payoff of "two". The extensive form can also capture simultaneous-move games and games with imperfect information. To represent it, either a dotted line connects different vertices to represent them as being part of the same information set (i.e. the players do not know at which point they are), or a closed line is drawn around them. (See example in the imperfect information section.) Normal form The normal (or strategic form) game is usually represented by a matrix which shows the players, strategies, and payoffs (see the example to the right). More generally it can be represented by any function that associates a payoff for each player with every possible combination of actions. In the accompanying example there are two players; one chooses the row and the other chooses the column. Each player has two strategies, which are specified by the number of rows and the number of columns.
The payoffs are provided in the interior. The first number is the payoff received by the row player (Player 1 in our example); the second is the payoff for the column player (Player 2 in our example). Suppose that Player 1 plays Up and that Player 2 plays Left. Then Player 1 gets a payoff of 4, and Player 2 gets 3. When a game is presented in normal form, it is presumed that each player acts simultaneously or, at least, without knowing the actions of the other. If players have some information about the choices of other players, the game is usually presented in extensive form. Every extensive-form game has an equivalent normal-form game; however, the transformation to normal form may result in an exponential blowup in the size of the representation, making it computationally impractical. Characteristic function form In games that possess transferable utility, separate rewards are not given; rather, the characteristic function decides the payoff of each coalition. The idea is that a coalition that is 'empty', so to speak, does not receive a reward at all. The origin of this form is to be found in John von Neumann and Oskar Morgenstern's book; when looking at these instances, they guessed that when a coalition forms, it plays against the remaining players as if two individuals were playing a normal game. The balanced payoff of C is a basic function. Although there are differing examples that help determine coalitional amounts from normal games, not all games in characteristic function form can be derived from such. Formally, a characteristic function is seen as: (N,v), where N represents the group of players and v is a function assigning a utility to each coalition. Such characteristic functions have expanded to describe games where there is no transferable utility. Alternative game representations Alternative game representation forms are used for some subclasses of games or adjusted to the needs of interdisciplinary research.
In addition to classical game representations, some of the alternative representations also encode time related aspects. General and applied uses As a method of applied mathematics, game theory has been used to study a wide variety of human and animal behaviors. It was initially developed in economics to understand a large collection of economic behaviors, including behaviors of firms, markets, and consumers. The first use of game-theoretic analysis was by Antoine Augustin Cournot in 1838 with his solution of the Cournot duopoly. The use of game theory in the social sciences has expanded, and game theory has been applied to political, sociological, and psychological behaviors as well. Although pre-twentieth-century naturalists such as Charles Darwin made game-theoretic kinds of statements, the use of game-theoretic analysis in biology began with Ronald Fisher's studies of animal behavior during the 1930s. This work predates the name "game theory", but it shares many important features with this field. The developments in economics were later applied to biology largely by John Maynard Smith in his 1982 book Evolution and the Theory of Games. In addition to being used to describe, predict, and explain behavior, game theory has also been used to develop theories of ethical or normative behavior and to prescribe such behavior. In economics and philosophy, scholars have applied game theory to help in the understanding of good or proper behavior. Game-theoretic arguments of this type can be found as far back as Plato. An alternative version of game theory, called chemical game theory, represents the player's choices as metaphorical chemical reactant molecules called "knowlecules". Chemical game theory then calculates the outcomes as equilibrium solutions to a system of chemical reactions. Description and modeling The primary use of game theory is to describe and model how human populations behave. 
Some scholars believe that by finding the equilibria of games they can predict how actual human populations will behave when confronted with situations analogous to the game being studied. This particular view of game theory has been criticized. It is argued that the assumptions made by game theorists are often violated when applied to real-world situations. Game theorists usually assume players act rationally, but in practice, human behavior often deviates from this model. Game theorists respond by comparing their assumptions to those used in physics. Thus while their assumptions do not always hold, they can treat game theory as a reasonable scientific ideal akin to the models used by physicists. However, empirical work has shown that in some classic games, such as the centipede game, guess 2/3 of the average game, and the dictator game, people regularly do not play Nash equilibria. There is an ongoing debate regarding the importance of these experiments and whether the analysis of the experiments fully captures all aspects of the relevant situation. Some game theorists, following the work of John Maynard Smith and George R. Price, have turned to evolutionary game theory in order to resolve these issues. These models presume either no rationality or bounded rationality on the part of players. Despite the name, evolutionary game theory does not necessarily presume natural selection in the biological sense. Evolutionary game theory includes both biological as well as cultural evolution and also models of individual learning (for example, fictitious play dynamics). Prescriptive or normative analysis Some scholars see game theory not as a predictive tool for the behavior of human beings, but as a suggestion for how people ought to behave. 
Since a strategy corresponding to a Nash equilibrium of a game constitutes one's best response to the actions of the other players – provided they are in (the same) Nash equilibrium – playing a strategy that is part of a Nash equilibrium seems appropriate. This normative use of game theory has also come under criticism. Economics and business Game theory is a major method used in mathematical economics and business for modeling competing behaviors of interacting agents. Applications include a wide array of economic phenomena and approaches, such as auctions, bargaining, mergers and acquisitions pricing, fair division, duopolies, oligopolies, social network formation, agent-based computational economics, general equilibrium, mechanism design, and voting systems; and across such broad areas as experimental economics, behavioral economics, information economics, industrial organization, and political economy. This research usually focuses on particular sets of strategies known as "solution concepts" or "equilibria". A common assumption is that players act rationally. In non-cooperative games, the most famous of these is the Nash equilibrium. A set of strategies is a Nash equilibrium if each represents a best response to the other strategies. If all the players are playing the strategies in a Nash equilibrium, they have no unilateral incentive to deviate, since their strategy is the best they can do given what others are doing. The payoffs of the game are generally taken to represent the utility of individual players. A prototypical paper on game theory in economics begins by presenting a game that is an abstraction of a particular economic situation. One or more solution concepts are chosen, and the author demonstrates which strategy sets in the presented game are equilibria of the appropriate type. Economists and business professors suggest two primary uses (noted above): descriptive and prescriptive.
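The best-response definition of a Nash equilibrium can be checked mechanically. A minimal sketch for a 2x2 normal-form game, here a standard prisoner's dilemma with illustrative payoffs:

```python
# A mechanical check of the Nash-equilibrium definition for a 2x2
# normal-form game, here the prisoner's dilemma (payoffs illustrative).
# A profile is a pure Nash equilibrium if each strategy is a best
# response to the other: no player gains by deviating unilaterally.
payoffs = {
    ("Cooperate", "Cooperate"): (3, 3), ("Cooperate", "Defect"): (0, 5),
    ("Defect", "Cooperate"): (5, 0),    ("Defect", "Defect"): (1, 1),
}
strategies = ["Cooperate", "Defect"]

def pure_nash(payoffs):
    equilibria = []
    for r in strategies:
        for c in strategies:
            u1, u2 = payoffs[(r, c)]
            best_row = all(u1 >= payoffs[(r2, c)][0] for r2 in strategies)
            best_col = all(u2 >= payoffs[(r, c2)][1] for c2 in strategies)
            if best_row and best_col:
                equilibria.append((r, c))
    return equilibria

print(pure_nash(payoffs))  # [('Defect', 'Defect')]
```

Mutual defection is the unique pure-strategy Nash equilibrium: neither player can gain by deviating unilaterally, even though both would prefer mutual cooperation.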
The Chartered Institute of Procurement & Supply (CIPS) promotes knowledge and use of game theory within the context of business procurement. CIPS and TWS Partners have conducted a series of surveys designed to explore the understanding, awareness and application of game theory among procurement professionals. Some of the main findings in their third annual survey (2019) include:
application of game theory to procurement activity has increased – at the time it was at 19% across all survey respondents
65% of participants predict that use of game theory applications will grow
70% of respondents say that they have "only a basic or a below basic understanding" of game theory
20% of participants had undertaken on-the-job training in game theory
50% of respondents said that new or improved software solutions were desirable
90% of respondents said that they do not have the software they need for their work.
Project management Sensible decision-making is critical for the success of projects. In project management, game theory is used to model the decision-making process of players, such as investors, project managers, contractors, sub-contractors, governments and customers. Quite often, these players have competing interests, and sometimes their interests are directly detrimental to other players, making project management scenarios well-suited to be modeled by game theory. Piraveenan (2019) in his review provides several examples where game theory is used to model project management scenarios. For instance, an investor typically has several investment options, and each option will likely result in a different project, and thus one of the investment options has to be chosen before the project charter can be produced.
Similarly, any large project involving subcontractors, for instance, a construction project, has a complex interplay between the main contractor (the project manager) and subcontractors, or among the subcontractors themselves, which typically has several decision points. For example, if there is an ambiguity in the contract between the contractor and subcontractor, each must decide how hard to push their case without jeopardizing the whole project, and thus their own stake in it. Similarly, when projects from competing organizations are launched, the marketing personnel have to decide what is the best timing and strategy to market the project, or its resultant product or service, so that it can gain maximum traction in the face of competition. In each of these scenarios, the required decisions depend on the decisions of other players who, in some way, have competing interests to the interests of the decision-maker, and thus can ideally be modeled using game theory. Piraveenan summarises that two-player games are predominantly used to model project management scenarios, and based on the identity of these players, five distinct types of games are used in project management:
Government-sector–private-sector games (games that model public–private partnerships)
Contractor–contractor games
Contractor–subcontractor games
Subcontractor–subcontractor games
Games involving other players
In terms of types of games, both cooperative and non-cooperative, normal-form and extensive-form, and zero-sum and non-zero-sum games are used to model various project management scenarios. Political science The application of game theory to political science is focused in the overlapping areas of fair division, political economy, public choice, war bargaining, positive political theory, and social choice theory. In each of these areas, researchers have developed game-theoretic models in which the players are often voters, states, special interest groups, and politicians.
Early examples of game theory applied to political science are provided by Anthony Downs. In his 1957 book An Economic Theory of Democracy, he applies the Hotelling firm location model to the political process. In the Downsian model, political candidates commit to ideologies on a one-dimensional policy space. Downs first shows how the political candidates will converge to the ideology preferred by the median voter if voters are fully informed, but then argues that voters choose to remain rationally ignorant which allows for candidate divergence. Game theory was applied in 1962 to the Cuban Missile Crisis during the presidency of John F. Kennedy. It has also been proposed that game theory explains the stability of any form of political government. Taking the simplest case of a monarchy, for example, the king, being only one person, does not and cannot maintain his authority by personally exercising physical control over all or even any significant number of his subjects. Sovereign control is instead explained by the recognition by each citizen that all other citizens expect each other to view the king (or other established government) as the person whose orders will be followed. Coordinating communication among citizens to replace the sovereign is effectively barred, since conspiracy to replace the sovereign is generally punishable as a crime. Thus, in a process that can be modeled by variants of the prisoner's dilemma, during periods of stability no citizen will find it rational to move to replace the sovereign, even if all the citizens know they would be better off if they were all to act collectively. A game-theoretic explanation for democratic peace is that public and open debate in democracies sends clear and reliable information regarding their intentions to other states. In contrast, it is difficult to know the intentions of nondemocratic leaders, what effect concessions will have, and if promises will be kept. 
Thus there will be mistrust and unwillingness to make concessions if at least one of the parties in a dispute is a non-democracy. However, game theory predicts that two countries may still go to war even if their leaders are cognizant of the costs of fighting. War may result from asymmetric information; two countries may have incentives to misrepresent the amount of military resources they have on hand, rendering them unable to settle disputes agreeably without resorting to fighting. Moreover, war may arise because of commitment problems: if two countries wish to settle a dispute via peaceful means, but each may later wish to renege on the terms of that settlement, they may have no choice but to resort to warfare. Finally, war may result from issue indivisibilities. Game theory can also help predict a nation's responses when a new rule or law is to be applied to that nation. One example is Peter John Wood's (2013) research into what nations could do to help reduce climate change. Wood thought this could be accomplished by making treaties with other nations to reduce greenhouse gas emissions. However, he concluded that this idea could not work because it would create a prisoner's dilemma for the nations.

Biology

Unlike those in economics, the payoffs for games in biology are often interpreted as corresponding to fitness. In addition, the focus has been less on equilibria that correspond to a notion of rationality and more on ones that would be maintained by evolutionary forces. The best-known equilibrium in biology is known as the evolutionarily stable strategy (ESS), first introduced by John Maynard Smith and George Price in 1973. Although its initial motivation did not involve any of the mental requirements of the Nash equilibrium, every ESS is a Nash equilibrium. In biology, game theory has
During the 10-year period between 1987 and 2001, a total of 1,981,732 ethnic Germans from the former Soviet Union (FSU) immigrated to Germany, along with more than a million of their non-German relatives. After 1997, however, ethnic Slavs and people of mixed Slavic-German descent outnumbered those of purely German descent among the immigrants. The total number of people currently living in Germany with an FSU connection is around 4 to 4.5 million (including Germans, Slavs, Jews and those of mixed origin), of whom more than 50% are of German descent. Germany now has Europe's third-largest Jewish population. In 2004, twice as many Jews from former Soviet republics settled in Germany as in Israel, bringing the total inflow to more than 100,000 since 1991. Jews have a voice in German public life through the Central Council of Jews in Germany (Zentralrat der Juden in Deutschland). Some Jews from the former Soviet Union are of mixed heritage. In 2019 there was also a growing number of at least 529,000 black Afro-Germans, defined as people with an African migrant background. More than 400,000 of them hold citizenship of a sub-Saharan African country, while the others are German citizens. Most of them live in Berlin and Hamburg. Numerous people from the North African countries of Tunisia and Morocco also live in Germany. While they are considered members of a minority group, for the most part they do not consider themselves "Afro-Germans", nor are most of them perceived as such by the German people. However, Germany does not keep any statistics regarding ethnicity or race. Hence, the exact number of Germans of African descent is unknown.
Germany's biggest East Asian minorities are the Chinese people in Germany, numbering 189,000, and the Vietnamese people in Germany, numbering 188,000, many of whom live in Berlin and eastern Germany. There are also about 35,000 Japanese citizens residing in Germany. There are also groups of South Asian and Southeast Asian immigrants: around 163,000 Indians and 124,000 Pakistanis live in Germany, and some 30,000 Filipino citizens and more than 20,000 Indonesian citizens reside in the country. Numerous descendants of the so-called Gastarbeiter live in Germany. The Gastarbeiter mostly came from Turkey, Italy, Greece, Spain, Morocco, Portugal, the former Yugoslavia, Tunisia and Chile. Until reunification in 1990, the former East Germany also recruited guest workers from Vietnam, Mongolia, North Korea, Angola, Mozambique and Cuba. The (socialist) German Democratic Republic, however, housed its guest workers in single-sex dormitories, and female guest workers had to sign contracts saying that they were not allowed to fall pregnant during their stay. If they fell pregnant nevertheless, they faced forced abortion or deportation. This is one of the reasons why the vast majority of ethnic minorities today live in western Germany, and also one of the reasons why minorities such as the Vietnamese have an unusual population pyramid, with nearly all second-generation Vietnamese Germans born after 1989. There is strong discrimination against Asian Germans in Germany. In a survey conducted by the Free University of Berlin between October and November 2020, 49% of Asian Germans said they had been discriminated against: 62% had been subjected to verbal insults, 11% to physical attacks such as being pushed, spat on, or sprayed with disinfectant, and 27% had been turned away from medical clinics.
A major problem identified by the survey is that many Germans are insensitive to such discrimination and are not aware that discrimination against Asians takes place in Germany.

Foreign nationals in Germany

The most common groups of resident foreign nationals in Germany were as follows. This list does not include non-ethnic Germans with German nationality or foreign nationals without resident status.

Genetics of the German native people

The most common Y chromosome haplogroups among German males are Haplogroup R1b, followed by Haplogroup I1 and Haplogroup R1a.

Geography

With an estimated 83.2 million inhabitants in December 2020, Germany is the second-most populous country in Europe after Russia, and ranks as the 19th most populous country in the world. Its population density stands at 233 inhabitants per square kilometer.

States

Germany comprises sixteen states that are collectively referred to as Länder. Due to differences in size and population, the subdivision of these states varies, especially between city-states (Stadtstaaten) and states with larger territories (Flächenländer). For regional administrative purposes four states, namely Baden-Württemberg, Bavaria, Hesse and North Rhine-Westphalia, consist of a total of 19 Government Districts (Regierungsbezirke). As of 2019, Germany is divided into 400 districts (Kreise) at the municipal level; these consist of 294 rural districts and 106 urban districts.

Cities

Metropolitan regions

Germany officially has eleven metropolitan regions. In 2005, Germany had 82 cities with more than 100,000 inhabitants.

Immigration

The United Nations Population Fund lists Germany as host to the third-highest number of international migrants worldwide, behind the United States and Saudi Arabia. The largest ethnic group of non-German origin is the Turkish community.
Since the 1960s, West Germany and later reunified Germany have attracted immigrants primarily from Southern and Eastern Europe as well as Turkey, many of whom (or whose children) have acquired German citizenship over time. While most of these immigrants initially arrived as guest workers, changes to guest worker legislation allowed many to stay and build lives in Germany. Germany signed special visa agreements with several countries in times of severe labour shortages or when particular skills were scarce within the country. During the 1960s and 1970s, agreements were signed with the governments of Turkey, Yugoslavia, Italy and Spain to help Germany overcome its severe labour shortage. As of 2012, after freedom of movement was fully extended to citizens of the eastern EU member states, the largest sources of net immigration to Germany were other European countries, most importantly Poland, Romania, Bulgaria, Hungary, Italy, Spain, and Greece. Notably, in the case of Turkey, German Turks moving to Turkey slightly outnumbered new immigrants in 2012; in recent years, however, Turkish immigrants to Germany have again outnumbered emigrants, including irregular Turkish migrants. In 2015, there was a large increase in asylum applications, mainly due to the violent conflicts in Syria, Iraq and Afghanistan: 476,649 asylum applications were counted that year. The number rose to 745,545 in 2016 and has declined since.

Education

Responsibility for educational oversight in Germany lies primarily with the individual federated
has been attributed to a "demographic shock": people not only had fewer children, they were also less likely to marry or divorce after the end of the GDR; the biographic options of the citizens of the former GDR had increased. Young motherhood seemed less attractive and the age at first birth rose sharply. In the following years, the TFR in the East started to rise again, surpassing 1.0 in 1997 and 1.3 in 2004, and reaching the West's TFR (1.37) in 2007. In 2010, the East's fertility rate (1.459) clearly exceeded that of the West (1.385), while Germany's overall TFR had risen to 1.393, the highest value since 1990, though still far below the replacement rate of 2.1 and the birth rates seen under communism. In 2016, the TFR was 1.64 in the East and 1.60 in the West. Between 1989 and 2009, about 2,000 schools closed because there were fewer children. In some regions the number of women between the ages of 20 and 30 has dropped by more than 30%. In 2004, in the age group 18–29 (statistically important for starting families) there were only 90 women for every 100 men in the new federal states (the East, including Berlin). Until 2007, family policy in the Federal Republic was compensatory, meaning that poor families received more family benefits (such as the Erziehungsgeld) than rich ones. In 2007 the so-called Elterngeld was introduced. According to Christoph Butterwegge, the Elterngeld was meant to "motivate highly educated women to have more children"; the poor, on the other hand, were disadvantaged by the Elterngeld and now received lower child benefits than the middle classes. The very well-off (who earn more than €250,000 per annum) and those on welfare receive no Elterngeld payments. In 2013 the following developments were noted: The income of families with young children has risen. Persons holding a college degree, persons older than 30 years and parents with only one child benefited the most.
Single parents and young parents did not benefit. Fathers are becoming more involved in parenting, and 28% of them now take some time off work (3.3 months on average) when their children are born. Mothers are more likely to work and as a result less likely to be economically deprived than they used to be. The birth rate of college-educated women has risen. In the new federal states the fertility rate of college-educated women is now higher than that of those without college degrees. Differences in value priorities and the better availability of childcare in the eastern states are discussed as possible reasons. In 2019, the non-profit Austrian Institute of Economic Research and the Bertelsmann Stiftung published a study about the economic impact of demographics. The researchers project a reduction in per capita income of €3,700 by 2040.

Demographic statistics

Demographic statistics according to the World Population Review:
One birth every 43 seconds
One death every 34 seconds
Net gain of one person every 4 minutes
One net migrant every 2 minutes

Demographic statistics according to the CIA World Factbook, unless otherwise indicated.

Population
80,457,737 (July 2018 est.)
80,594,017 (July 2017 est.)
82,175,700 (2015 estimate)

Age structure
0–14 years: 12.83% (male 5,299,798/female 5,024,184)
15–24 years: 9.98% (male 4,092,901/female 3,933,997)
25–54 years: 39.87% (male 16,181,931/female 15,896,528)
55–64 years: 14.96% (male 5,989,111/female 6,047,449)
65 years and over: 22.36% (male 7,930,590/female 10,061,248) (2018 est.)

0–14 years: 12.8% (male 5,304,341/female 5,028,776)
15–24 years: 10.1% (male 4,145,486/female 3,986,302)
25–54 years: 40.5% (male 16,467,975/female 16,133,964)
55–64 years: 14.6% (male 5,834,179/female 5,913,322)
65 years and over: 22.06% (male 7,822,221/female 9,957,451) (2017 est.)
0–14 years: 13.9% (male 5,894,724/female 5,590,373)
15–64 years: 66.1% (male 27,811,357/female 26,790,222)
65 years and over: 19.6% (male 6,771,972/female 9,542,348) (2015 est.)

0–14 years: 13.7% (male 5,768,366/female 5,470,516)
15–64 years: 66.1% (male 27,707,761/female 26,676,759)
65 years and over: 20.3% (male 7,004,805/female 9,701,551) (2010 est.)

Median age
total: 47.4 years. Country comparison to the world: 3rd
male: 46.2 years
female: 48.5 years (2018 est.)

Birth rate
8.6 births/1,000 population (2018 est.) Country comparison to the world: 213th

Death rate
11.8 deaths/1,000 population (2018 est.) Country comparison to the world: 19th
11.7 deaths/1,000 population (2017 est.)

Total fertility rate
1.46 children born/woman (2018 est.) Country comparison to the world: 204th
1.43 children born/woman (2014)
1.42 children born/woman (2013)
1.38 children born/woman (2008)

Net migration rate
1.5 migrant(s)/1,000 population (2018 est.) Country comparison to the world: 56th
1.5 migrant(s)/1,000 population (2017 est.)

Population growth rate
-0.17% (2018 est.) Country comparison to the world: 208th
-0.16% (2017 est.)

Mother's mean age at first birth
29.4 years (2015 est.)

Life expectancy at birth
total population: 80.8 years. Country comparison to the world: 34th
male: 78.5 years
female: 83.3 years (2017 est.)
total population: 81 years (2015)
80 years (2013)

Urbanization
urban population: 77.3% of total population (2018)
rate of urbanization: 0.27% annual rate of change (2015–20 est.)

Infant mortality rate
total: 3.4 deaths/1,000 live births. Country comparison to the world: 205th
male: 3.7 deaths/1,000 live births
female: 3.1 deaths/1,000 live births (2017 est.)
4.09 deaths/1,000 live births (2007)
total: 3.99 deaths/1,000 live births (2010)

Sex ratio
at birth: 1.06 male(s)/female
under 15 years: 1.05 male(s)/female
15–64 years: 1.04 male(s)/female
65 years and over: 0.72 male(s)/female
total population: 0.97 male(s)/female (2010 est.)
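The dependency ratios given in the statistics that follow are defined relative to the working-age (15–64) population. As a minimal sketch, they can be recomputed from the 2018 age-structure shares listed above; the official 2015 figures differ slightly because of the different reference year and rounding.

```python
# Dependency ratios derived from the 2018 age-structure shares given above.
# The working-age share is the sum of the 15-24, 25-54 and 55-64 groups.
youth = 12.83                         # share of population under 15 (%)
working_age = 9.98 + 39.87 + 14.96    # share aged 15-64 (%) = 64.81
elderly = 22.36                       # share aged 65 and over (%)

# Dependents per 100 working-age persons:
youth_dependency = 100 * youth / working_age      # ~19.8
elderly_dependency = 100 * elderly / working_age  # ~34.5
total_dependency = youth_dependency + elderly_dependency  # ~54.3

# Working-age persons per elderly person:
potential_support = 100 / elderly_dependency      # ~2.9

print(round(youth_dependency, 1), round(elderly_dependency, 1),
      round(total_dependency, 1), round(potential_support, 1))
```

The computed 2018-based values (about 19.8 youth, 34.5 elderly, 54.3 total, support ratio 2.9) are close to, but not identical with, the 2015 estimates listed in the next section.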
Dependency ratios
total dependency ratio: 52.1
youth dependency ratio: 19.9
elderly dependency ratio: 32.1
potential support ratio: 3.1 (2015 est.)

School life expectancy (primary to tertiary education)
total: 17 years
male: 17 years
female: 17 years (2015)

Unemployment, youth ages 15–24
total: 7.2% Country comparison to the world: 139th
male: 7.9%
female: 6.5% (2015 est.)

Most childbirths in Germany happen within marriage. Out of 778,080 births in 2019, 258,835 were to unmarried parents, which means that around 33%, or one third, of children were born out of wedlock, while two thirds were born within marriage. This percentage of births outside marriage had long been growing and reached 33% in 2010, more than twice what it was in 1990; in recent years, however, it has started to stagnate or even decrease. The Mikrozensus done in 2008 revealed that the number of children a German woman aged 40 to 75 had was closely linked to her educational achievement. In Western Germany the most educated women were the most likely to be childless: 26% of that group stated they were childless, while 16% of those with an intermediate education and 11% of those with compulsory education stated the same. In Eastern Germany, however, 9% of the most educated women of that age group and 7% of those with an intermediate education were childless, while 12% of those with only compulsory education were childless. The reason for this east–west difference is that the GDR had an "educated mother scheme" and actively tried to encourage first births among the more educated. It did so by propagandizing the opinion that every educated woman should "present at least one child to socialism" and also by financially rewarding its more educated citizens for becoming parents. The government especially tried to persuade students to become parents while still in college, and it was quite successful in doing so.
In 1986, 38% of women about to graduate from college were mothers of at least one child and an additional 14% were pregnant, while 43% of men about to graduate from college were fathers of at least one child. There was a sharp decline in the birth rate, and especially in the birth rate of the educated, after the fall of the Berlin Wall. Nowadays, 5% of those about to graduate from college are parents. The more educated a Western German mother aged 40 to 75 was in 2008, the less likely she was to have a big family; the same was true for a mother living in Eastern Germany in 2008. By 2011, this trend had reversed in Eastern Germany, where more highly educated women now had a somewhat higher fertility rate than the rest of the population. Persons who said they had no religion tend to have fewer children than those who identify as Christians, and studies also found that conservative-leaning Christians had more children than liberal-leaning Christians. A study done in 2005 in the western German state of Nordrhein-Westfalen by the HDZ revealed that childlessness was especially widespread among scientists: 78% of female scientists and 71% of male scientists working in that state were childless.

Ethnic minorities and migrant background (Migrationshintergrund)

The Federal Statistical Office defines persons with a migrant background as all persons who migrated to the present area of the Federal Republic of Germany after 1949, plus all foreign nationals born in Germany and all persons born in Germany as German nationals with at least one parent who migrated to Germany or was born in Germany as a foreign national. The figures presented here are based on this definition only. In 2010, 2.3 million families with children under 18 years were living in Germany in which at least one parent had foreign roots. They represented 29% of the total of 8.1 million families with minor children.
Compared with 2005 – the year when the microcensus started to collect detailed information on the population with a migrant background – the proportion of migrant families has risen by 2 percentage points. In 2019, 40% of children under 5 years old had a migrant background. Most families with a migrant background live in the western part of Germany. In 2010, the proportion of migrant families among all families was 32% in the former territory of the Federal Republic, more than double the figure in the new Länder (including Berlin), where it stood at 15%. Eastern Germany has a much lower proportion of immigrants than the West, as the GDR admitted far fewer guest workers and Eastern Germany's economy has lagged behind Western Germany's, with a higher share of jobless persons until recently. In recent years, however, the number of people with an immigrant background in East Germany has been growing, as refugees (as well as German repatriates) are distributed according to the Königsteiner Schlüssel, under which every German state takes in a share proportional to its population and economy. In 2019, 19.036 million people with an immigrant background (89.6% of the total) lived in Western Germany excluding Berlin, making up 28.7% of its population; 1.016 million (4.8%) lived in the eastern states, making up 8.2% of the population there; and 1.194 million (5.6%) lived in Berlin, making up 33.1% of its population. In 2019, 26% of Germans of any age group (up from 18.4% in 2008) and 39% of German children (up from 30% in 2008) had at least one parent born abroad. The average age of Germans with at least one parent born abroad was 35.6 years (up from 33.8 years in 2008), while that of Germans with two parents born in Germany was 47.3 years (up from 44.6 in 2008). The largest groups of people with an immigrant background in Germany are people from Turkey, Poland and Russia.
The population by background was as follows. Four other sizable groups of people are referred to as "national minorities" (nationale Minderheiten) because they have lived in their respective regions for centuries: Danes, Frisians, Roma and Sinti, and Sorbs. There is a Danish minority (about 50,000, according to government sources) in the northernmost state of Schleswig-Holstein. Eastern and Northern Frisians live on Schleswig-Holstein's western coast and in the north-western part of Lower Saxony. They are part of a wider community (Frisia) stretching from Germany to the northern Netherlands. The Sorbs, a Slavic people with about 60,000 members (according to government sources), are in the Lusatia region of Saxony and Brandenburg. They are the last remnants of the Slavs who have lived in central and eastern Germany since the 7th century to have kept their traditions and not been completely integrated into the wider German nation. Until World War II the Poles were recognized as one of the national minorities. In 1924 the Union of Poles in Germany initiated cooperation between all national minorities in Germany under the umbrella organization Association of National Minorities in Germany. Some of the union members had wanted the Polish communities in easternmost Germany (now Poland) to join the newly established Polish nation after World War I. Even before the German invasion of Poland, leading anti-Nazi members of the Polish minority were deported to concentration camps; some were executed at the Piaśnica murder site. Minority rights for Poles in Germany were revoked by Hermann Göring's World War II decree of 27 February 1940, and their property was confiscated. After the war ended, the German government did not re-implement national minority rights for ethnic Poles.
The reason for this is that the areas of Germany which formerly had a native Polish minority were annexed to Poland and the Soviet Union, while almost all of the native German populations (formerly the ethnic majority) in these areas subsequently fled or were expelled by force. With the mixed German-Polish territories now lost, the German government subsequently regarded ethnic Poles residing in what remained of Germany as immigrants, just like any other ethnic population with a recent history of arrival. In contrast, Germans living in Poland are recognized as a national minority and are granted seats in the Polish Parliament. However, an overwhelming number of Germans in Poland have centuries-old historical ties to the lands they now inhabit, whether from living in territory that once belonged to the German state or from centuries-old communities. In contrast, most Poles in present-day Germany are recent immigrants, though there are some communities which have been present since the 19th and perhaps even the 18th centuries. Despite protests by some in the older Polish-German communities, and despite Germany now being a signatory to the Framework Convention for the Protection of National Minorities, Germany has so far refused to re-implement minority rights for ethnic Poles, on the grounds that almost all areas of historically mixed German-Polish heritage (where the minority rights formerly existed) are no longer part of Germany and that the vast majority of ethnic Poles now residing in Germany are recent immigrants. Roma people have been in Germany since the Middle Ages. They were persecuted by the Nazis, and thousands of Roma living in Germany were killed by the Nazi regime. Nowadays, they are spread all over Germany, mostly living in major cities. It is difficult to estimate their exact number, as the German government counts them as "persons without migrant background" in its statistics. There are also many assimilated Sinti and Roma.
A vague figure given by the German Department of the Interior is about 70,000. In contrast to the old-established Roma population, the majority of these do not have German citizenship; they are classified as immigrants or refugees. After World War II, 14 million ethnic Germans were expelled from the eastern territories of Germany and from homelands outside the former German Empire. The accommodation and integration of these Heimatvertriebene in the remaining part of Germany, in which many cities and millions of apartments had been destroyed, was a major effort in the post-war occupation zones and later states of Germany. Since the 1960s, ethnic Germans from the People's Republic of Poland and the Soviet Union (especially from Kazakhstan, Russia, and Ukraine) have come to Germany. During the time of Perestroika, and after the dissolution of the Soviet Union, the number of immigrants increased heavily. Some of these immigrants are of mixed ancestry.
France and Italy. Germany's principal agricultural products are potatoes, wheat, barley, sugar beets, fruit, and cabbages. Despite the country's high level of industrialization, almost one-third of its territory is covered by forest. The forestry industry provides for about two-thirds of domestic consumption of wood and wood products, so Germany is a net importer of these items. German soil is relatively poor in raw materials: only lignite (brown coal) and potash salt (Kalisalz) are available in significant quantities. However, the former GDR's Wismut mining company produced a total of 230,400 tonnes of uranium between 1947 and 1990, making East Germany the fourth-largest producer of uranium ore worldwide (the largest within the USSR's sphere of control) at the time. Oil, natural gas, and other resources are, for the most part, imported from other countries. Potash salt is mined in the center of the country (Niedersachsen, Sachsen-Anhalt and Thüringen); the most important producer is K+S (formerly Kali und Salz AG). Germany's bituminous coal deposits were created more than 300 million years ago from swamps which extended from present-day southern England across the Ruhr area to Poland. Lignite deposits developed similarly, but during a later period, about 66 million years ago. Because the plant matter has not yet been completely transformed into coal, brown coal contains less energy than bituminous coal. Lignite is extracted in the extreme western and eastern parts of the country, mainly in Nordrhein-Westfalen, Sachsen and Brandenburg. Considerable amounts are burned in coal plants near the mining areas to produce electricity. Because transporting lignite over long distances is not economically feasible, the plants are located practically next to the extraction sites. Bituminous coal is mined in Nordrhein-Westfalen and Saarland.
Most power plants burning bituminous coal operate on imported material, so the plants are located not only near the mining sites but throughout the country. In 2019, the country was the world's 3rd-largest producer of selenium, 5th-largest producer of potash, 5th-largest producer of boron, 7th-largest producer of lime, 13th-largest producer of fluorspar, 14th-largest producer of feldspar, 17th-largest producer of graphite and 18th-largest producer of sulfur, in addition to being the 4th-largest producer of salt.
Industry
Industry and construction accounted for 30.7% of the gross domestic product in 2017 and employed 24.2% of the workforce. Germany excels in the production of automobiles, machinery, electrical equipment and chemicals. With the manufacture of 5.2 million vehicles in 2009, Germany was the world's fourth-largest producer and largest exporter of automobiles. German automotive companies enjoy an extremely strong position in the so-called premium segment, with a combined world market share of about 90%. Small- to medium-sized manufacturing firms (Mittelstand companies), which specialize in technologically advanced niche products and are often family-owned, form a major part of the German economy. It is estimated that about 1,500 German companies occupy a top-three position in their respective market segment worldwide; in about two-thirds of all industry sectors, German companies belong to the top three competitors. Germany is the only country among the top five arms exporters that is not a permanent member of the United Nations Security Council.
Services
In 2017 services constituted 68.6% of gross domestic product (GDP), and the sector employed 74.3% of the workforce. The subcomponents of services are financial, renting, and business activities (30.5%); trade, hotels and restaurants, and transport (18%); and other service activities (21.7%).
Germany is the seventh most visited country in the world, with a total of 407 million overnight stays during 2012, including 68.83 million nights by foreign visitors. In 2012, over 30.4 million international tourists arrived in Germany. Berlin has become the third most visited city destination in Europe. Additionally, more than 30% of Germans spend their holiday in their own country, with the biggest share going to Mecklenburg-Vorpommern. Domestic and international travel and tourism combined directly contribute over €43.2 billion to German GDP. Including indirect and induced impacts, the industry contributes 4.5% of German GDP and supports 2 million jobs (4.8% of total employment). The largest annual international trade fairs and congresses are held in several German cities such as Hannover, Frankfurt, and Berlin.
Government finances
The debt-to-GDP ratio of Germany peaked in 2010, when it stood at 80.3%, and has decreased since then. According to Eurostat, Germany's government gross debt amounted to €2,152.0 billion, or 71.9% of its GDP, in 2015. The federal government achieved a budget surplus of €12.1 billion ($13.1 billion) in 2015. Germany's credit rating from the credit rating agencies Standard & Poor's, Moody's and Fitch Ratings stood at the highest possible rating, AAA, with a stable outlook in 2016. Germany's "debt clock" (Schuldenuhr) reversed for the first time in 20 years in January 2018; as of October 2020 it was increasing again, at €10,424 per second. Economists generally see Germany's current account surplus as undesirable.
Infrastructure
Energy
Germany is the world's fifth-largest consumer of energy, and two-thirds of its primary energy was imported in 2002. In the same year, Germany was Europe's largest consumer of electricity, totaling 512.9 terawatt-hours. Government policy promotes energy conservation and the development of renewable energy sources, such as solar, wind, biomass, hydroelectric, and geothermal energy.
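The government-finance figures above can be cross-checked with simple arithmetic; a back-of-the-envelope sketch (the per-second debt-clock rate is assumed here to be in euros):

```python
# Cross-check of the 2015 Eurostat figures quoted above:
# gross debt of €2,152.0 bn is stated to equal 71.9% of GDP.
debt_bn = 2152.0          # government gross debt, billions of euros (2015)
debt_to_gdp = 0.719       # 71.9% debt-to-GDP ratio

implied_gdp_bn = debt_bn / debt_to_gdp
print(round(implied_gdp_bn))          # ~2993 billion euros of nominal GDP

# Scaling the October 2020 "debt clock" rate to a yearly figure:
per_second = 10424.0                  # euros per second (assumed unit)
per_year_bn = per_second * 60 * 60 * 24 * 365 / 1e9
print(round(per_year_bn))             # ~329 billion euros per year
```

Both results are consistency checks on the stated numbers, not official statistics.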
As a result of energy-saving measures, energy efficiency has been improving since the beginning of the 1970s. The government has set the goal of meeting half the country's energy demand from renewable sources by 2050. Renewable energy also plays a growing role in the labor market: almost 700,000 people are employed in the energy sector, about 50 percent of them in renewable energies.
In 2000, the red-green coalition under Chancellor Schröder and the German nuclear power industry agreed to phase out all nuclear power plants by 2021. The conservative coalition under Chancellor Merkel reversed this decision in January 2010, electing to keep plants open. The disaster at Japan's Fukushima nuclear plant in March 2011, however, changed the political climate fundamentally: older nuclear plants have been shut down, and Germany is seeking to have wind, solar, biogas, and other renewable energy sources play a bigger role as it looks to completely phase out nuclear power by 2022 and coal-fired power plants by 2038. Renewable energy still plays a comparatively modest role in energy consumption, though German solar and wind power industries play a leading role worldwide.
In 2009, Germany's total energy consumption (not just electricity) came from the following sources: oil 34.6%, natural gas 21.7%, lignite 11.4%, bituminous coal 11.1%, nuclear power 11.0%, hydro and wind power 1.5%, others 9.0%. In the first half of 2021, coal, natural gas and nuclear energy comprised 56% of the total electricity fed into Germany's grid. Coal led the conventional energy sources, comprising over 27% of Germany's electricity, while wind power contributed 22%.
There are three major entry points for oil pipelines: in the northeast (the Druzhba pipeline, coming from Gdańsk), the west (coming from Rotterdam) and the southeast (coming from Nelahozeves).
The oil pipelines of Germany do not constitute a proper network, and sometimes only connect two locations. Major oil refineries are located in or near the following cities: Schwedt, Spergau, Vohburg, Burghausen, Karlsruhe, Cologne, Gelsenkirchen, Lingen, Wilhelmshaven, Hamburg and Heide. Germany's network of natural gas pipelines, on the other hand, is dense and well-connected. Imported pipeline gas comes mostly from Russia, the Netherlands and the United Kingdom. Although gas imports from Russia have been historically reliable, even during the Cold War, recent price disputes between Gazprom and former Soviet states such as Ukraine have also affected Germany. As a result, high political importance is placed on the construction of the Nord Stream pipeline, running from Vyborg in Russia along the Baltic Sea to Greifswald in Germany.
Beauty of Labour (SdA), which set working conditions, and Strength through Joy (KdF), which provided sports clubs for workers.
West Germany
Beginning with the replacement of the Reichsmark with the Deutsche Mark as legal tender, a lasting period of low inflation and rapid industrial growth was overseen by the government led by Chancellor Konrad Adenauer and his minister of economics, Ludwig Erhard, raising West Germany from total wartime devastation to one of the most developed nations in modern Europe. In 1953 it was decided that Germany was to repay $1.1 billion of the aid it had received; the last repayment was made in June 1971. Apart from these factors, hard work and long hours at full capacity among the population in the 1950s, 1960s, and early 1970s, and extra labor supplied by thousands of Gastarbeiter ("guest workers"), provided a vital base for the economic upturn.
East Germany
By the early 1950s, the Soviet Union had seized reparations in the form of agricultural and industrial products and demanded further heavy reparation payments. Silesia, with the Upper Silesian Coal Basin, and Stettin, a prominent natural port, were lost to Poland. Exports from West Germany exceeded $323 billion in 1988. In the same year, East Germany exported $30.7 billion worth of goods, 65% of it to other communist states. East Germany had zero unemployment, and in 1976 its average annual GDP growth was roughly 5.9%.
Federal Republic
The German economy practically stagnated at the beginning of the 2000s. The worst growth figures were recorded in 2002 (+1.4%), 2003 (+1.0%) and 2005 (+1.4%), and unemployment was chronically high. Due to these problems, together with Germany's aging population, the welfare system came under considerable strain. This led the government to push through a wide-ranging program of belt-tightening reforms, Agenda 2010, including the labor market reforms known as Hartz I–IV.
In the later part of the 2000s, the world economy experienced high growth, from which Germany as a leading exporter also profited. Some credit the Hartz reforms with achieving high growth and declining unemployment, but others contend that they resulted in a massive decrease in standards of living and that their effects were limited and temporary. The nominal GDP of Germany contracted in the second and third quarters of 2008, putting the country in a technical recession amid a global and European recession cycle. German industrial output dropped 3.6% in September compared with August. In January 2009 the German government under Angela Merkel approved a €50 billion ($70 billion) economic stimulus plan to protect several sectors from a downturn and a subsequent rise in unemployment rates. Germany exited the recession in the second and third quarters of 2009, mostly due to rebounding manufacturing orders and exports, primarily from outside the eurozone, and relatively steady consumer demand.
Germany is a founding member of the EU, the G8 and the G20, and was the world's largest exporter from 2003 to 2008. In 2011 it remained the third-largest exporter and third-largest importer. Most of the country's exports are in engineering, especially machinery, automobiles, chemical goods and metals. Germany is a leading producer of wind turbines and solar-power technology. Annual trade fairs and congresses are held in cities throughout Germany. 2011 was a record-breaking year for the German economy: German companies exported goods worth over €1 trillion ($1.3 trillion), the highest figure in history, and the number of people in work rose to 41.6 million, the highest recorded figure. Through 2012, Germany's economy continued to be stronger relative to neighboring nations.
Data
The following table shows the main economic indicators in 1980–2020 (with IMF staff estimates for 2021–2026). Inflation below 2% is in green.
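The "technical recession" mentioned above is conventionally defined as two consecutive quarters of negative GDP growth. A minimal sketch of that rule (the quarterly figures below are illustrative, not actual 2008 data):

```python
def in_technical_recession(quarterly_growth):
    """Return True if any two consecutive quarters both show negative growth."""
    return any(a < 0 and b < 0
               for a, b in zip(quarterly_growth, quarterly_growth[1:]))

# Illustrative quarter-on-quarter growth rates in percent, Q1..Q4:
print(in_technical_recession([0.3, -0.4, -0.5, 0.1]))  # True: Q2 and Q3 contract
print(in_technical_recession([0.3, -0.4, 0.2, -0.1]))  # False: no consecutive pair
```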
Companies
Of the world's 500 largest stock-market-listed companies measured by revenue in 2010, the Fortune Global 500, 37 are headquartered in Germany, and 30 Germany-based companies are included in the DAX, the German stock market index. Well-known global brands are Mercedes-Benz, BMW, SAP, Siemens, Volkswagen, Adidas, Audi, Allianz, Porsche, Bayer, BASF, Bosch, and Nivea. Germany is recognised for its specialised small and medium enterprises, known as the Mittelstand model. SMEs account for more than 99 per cent of German companies. Around 1,000 of these companies are global market leaders in their segment and are labelled hidden champions.
From 1991 to 2010, 40,301 mergers and acquisitions involving German firms, with a total known value of €2,422 billion, were announced. The largest transactions since 1991 are the acquisition of Mannesmann by Vodafone for €204.8 billion in 1999 and the merger of Daimler-Benz with Chrysler to form DaimlerChrysler in 1998, valued at €36.3 billion. Berlin developed an international startup ecosystem and became a leading location for venture-capital-funded firms in the European Union.
The list includes the largest German companies by revenue in 2011:
Mergers and acquisitions
Since German reunification, there have been 52,258 inbound or outbound merger or acquisition deals in Germany. The most active year in terms of value was 1999, with a cumulative value of €48 billion, twice as much as the runner-up, 2006, with €24 billion (see graphic "M&A in Germany"). Here is a list of the top 10 deals (ranked by value) that include a German company. The Vodafone–Mannesmann deal is still the biggest deal in global history.
Economic region
Germany as a federation is a polycentric country and does not have a single economic center. The stock exchange is located in Frankfurt am Main, and the largest media company (Bertelsmann SE & Co.
KGaA) is headquartered in Gütersloh; the largest car manufacturers are in Wolfsburg (Volkswagen), Stuttgart (Mercedes-Benz and Porsche), and Munich (Audi and BMW).
Germany is an advocate of closer European economic and political integration. Its commercial policies are increasingly determined by agreements among European Union (EU) members and EU single market legislation. Germany introduced the common European currency, the euro, on 1 January 1999; its monetary policy is set by the European Central Bank in Frankfurt. The southern states ("Bundesländer"), especially Bayern, Baden-Württemberg, and Hessen, are economically stronger than the northern states. One of Germany's traditionally strongest (and at the same time oldest) economic regions is the Ruhr area in the west, between Duisburg and Dortmund. 27 of the country's 100 largest companies are located there. In recent years, however, the area, whose economy is based on natural resources and heavy industry, has seen a substantial rise in unemployment (2010: 8.7%).
The economies of Bayern and Baden-Württemberg, the states with the lowest unemployment (2018: 2.7% and 3.1%), on the other hand, are based on high-value products. Important sectors are automobiles, electronics, aerospace, and biomedicine, among others. Baden-Württemberg is an industrial center especially for the automobile and machine-building industries and the home of brands like Mercedes-Benz (Daimler), Porsche and Bosch.
With reunification on 3 October 1990, Germany began the major task of reconciling the economic systems of the two former republics. Interventionist economic planning has ensured gradual development in eastern Germany up to the level of former West Germany, but the standard of living and annual income remain significantly higher in the western German states.
The modernization and integration of the eastern German economy continues to be a long-term process scheduled to last until the year 2019, with annual transfers from west to east amounting to roughly $80 billion. The overall unemployment rate has consistently fallen since 2005 and reached a 20-year low in 2012. In July 2014 the country began legislating to introduce a federally mandated minimum wage, which came into effect on 1 January 2015.
German states
Wealth
The following top-10 list of German billionaires is based on an annual assessment of wealth and assets compiled and published by Forbes magazine on 1 March 2016.
$27.9 billion Beate Heister (b. Albrecht) & Karl Albrecht Jr.
$20.3 billion Theo Albrecht Jr.
$18.5 billion Susanne Klatten
$18.1 billion Georg Schaeffler
$16.4 billion Dieter Schwarz
$15.6 billion Stefan Quandt
$15.4 billion Michael Otto
$11.7 billion Heinz Hermann Thiele
$10 billion Klaus-Michael Kühne
$9.5 billion Hasso Plattner
Wolfsburg is the city in Germany with the country's highest per capita GDP, at $128,000. The following top-10 list of German cities with the highest per capita GDP is based on a study by the Cologne Institute for Economic Research on 31 July 2013.
$128,000 Wolfsburg, Lower Saxony
$114,281 Frankfurt am Main, Hesse
$108,347 Schweinfurt, Bavaria
$104,000 Ingolstadt, Bavaria
$99,389 Regensburg, Bavaria
$92,525 Düsseldorf, North Rhine-Westphalia
$92,464 Ludwigshafen am Rhein, Rhineland-Palatinate
$91,630 Erlangen, Bavaria
$91,121 Stuttgart, Baden-Württemberg
$88,692 Ulm, Baden-Württemberg
Sectors
Germany has a social market economy characterised by a highly qualified labor force, a developed infrastructure, a large capital stock, a low level of corruption, and a high level of innovation. It has the largest national economy in Europe, the fourth-largest by nominal GDP in the world, and ranked fifth by GDP (PPP) in 2015. The service sector contributes around 70% of the total GDP, industry 29.1%, and agriculture 0.9%.
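The sector shares just quoted (services around 70%, industry 29.1%, agriculture 0.9%) should sum to roughly 100% of GDP; turning them into absolute figures only requires a nominal GDP total, which is a placeholder assumption here, not a sourced number:

```python
# Sector shares of GDP as stated in the text, in percent.
shares = {"services": 70.0, "industry": 29.1, "agriculture": 0.9}

total = sum(shares.values())
assert abs(total - 100.0) < 0.5, total   # shares are consistent up to rounding

# Hypothetical nominal GDP in billions of euros -- an assumption for illustration.
gdp_bn = 3000.0
absolute = {k: round(v / 100.0 * gdp_bn, 1) for k, v in shares.items()}
print(absolute["industry"])   # 873.0 bn under the assumed total
```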
S-Bahn
Almost all major metro areas of Germany have suburban rail systems called S-Bahnen (Schnellbahnen). These usually connect larger agglomerations to their suburbs and often to other regional towns, although the Rhein-Ruhr S-Bahn connects several large cities. An S-Bahn doesn't skip stations and runs more frequently than other trains. In Berlin and Hamburg the S-Bahn provides a U-Bahn-like service and uses a third rail, whereas all other S-Bahn services rely on regular catenary power supply.
Rapid transit (U-Bahn)
Relatively few cities have a full-fledged underground U-Bahn system; S-Bahn (suburban commuter railway) systems are far more common. In some cities the distinction between U-Bahn and S-Bahn systems is blurred: some S-Bahn systems run underground, have frequencies similar to a U-Bahn, and form part of the same integrated transport network. A larger number of cities have upgraded their tramways to light rail standards; these systems are called Stadtbahn (not to be confused with S-Bahn). Cities with U-Bahn systems are:
Berlin
Hamburg
Munich
Nuremberg/Fürth
With the exception of Hamburg, all of those aforementioned cities also have a tram system, often with new lines built to light rail standards. Cities with Stadtbahn systems can be found in the article Trams in Germany.
Trams (Straßenbahn)
Germany was among the first countries to have electric street-running railways, and Berlin has one of the longest tram networks in the world. Many West German cities abandoned their tram systems in the 1960s and 1970s, while others upgraded them to "Stadtbahn" (light rail) standard, often including underground sections. In the East, most cities retained or even expanded their tram systems, and since reunification a trend towards new tram construction can be observed in most of the country. Today the only major German city without a tram or light rail system is Hamburg. Tram-train systems like the Karlsruhe model first came to prominence in Germany in the early 1990s and have been implemented or discussed in several cities, providing coverage far into the rural areas surrounding cities.
Air transport
Short distances and the extensive network of motorways and railways make airplanes uncompetitive for travel within Germany; only about 1% of all distance travelled was by plane in 2002. But due to a decline in prices following the introduction of low-fare airlines, domestic air travel has become more attractive. In 2013 Germany had the fifth-largest passenger air market in the world, with 105,016,346 passengers. However, the advent of new, faster rail lines often leads to cuts in service by the airlines, or even total abandonment of routes such as Frankfurt–Cologne, Berlin–Hannover or Berlin–Hamburg.
Airlines
see: List of airlines of Germany
Germany's largest airline is Lufthansa, which was privatised in the 1990s. Lufthansa also operates two regional subsidiaries under the Lufthansa Regional brand and a low-cost subsidiary, Eurowings, which operates independently. Lufthansa flies a dense network of domestic, European and intercontinental routes. Germany's second-largest airline was Air Berlin, which also operated a network of domestic and European destinations with a focus on leisure routes, as well as some long-haul services. Air Berlin declared bankruptcy in 2017, with the last flight under its own name in October of that year. Charter and leisure carriers include Condor, TUIfly, MHS Aviation and Sundair. Major German cargo operators are Lufthansa Cargo, European Air Transport Leipzig (a subsidiary of DHL) and AeroLogic (jointly owned by DHL and Lufthansa Cargo).
Airports
see: List of airports in Germany
Frankfurt Airport is Germany's largest airport, a major transportation hub in Europe and the world's twelfth-busiest airport. It is one of the airports with the largest number of international destinations served worldwide. Depending on whether total passengers, flights or cargo traffic is used as a measure, it ranks first, second or third in Europe alongside London Heathrow Airport and Paris Charles de Gaulle Airport.
Germany's second-biggest international airport is Munich Airport, followed by Düsseldorf Airport. There are several more scheduled passenger airports throughout Germany, mainly serving European metropolitan and leisure destinations. Intercontinental long-haul routes are operated to and from the airports in Frankfurt, Munich, Düsseldorf, Berlin-Tegel, Cologne/Bonn, Hamburg and Stuttgart.
Berlin Brandenburg Airport was expected to become the third-largest German airport by annual passengers once it opened, serving as the single airport for Berlin. Originally planned to be completed in 2011, the new airport was delayed several times due to poor construction management and technical difficulties; as of September 2014 it was still not known when it would become operational, and in 2017 it was announced that the airport wouldn't open before 2019. In the same year a non-binding referendum to keep Tegel Airport open even after the new airport opens was passed by Berlin voters. BER finally opened on 31 October 2020.
Airports — with paved runways:
total: 318
over 3,047 m: 14
2,438 to 3,047 m: 49
1,524 to 2,437 m: 60
914 to 1,523 m: 70
under 914 m: 125 (2013 est.)
Airports — with unpaved runways:
total: 221
over 3,047 m: 0
2,438 to 3,047 m: 0
1,524 to 2,437 m: 1
914 to 1,523 m: 35
under 914 m: 185 (2013 est.)
Heliports: 23 (2013 est.)
Water transport
Waterways: 7,467 km (2013); major rivers include the Rhine and Elbe. The Kiel Canal is an important connection between the Baltic Sea and the North Sea and one of the busiest waterways in the world; the Rhine–Main–Danube Canal links Rotterdam on the North Sea with the Black Sea. It passes through the highest point reachable by ocean-going vessels from the sea, and has gained importance for leisure cruises in addition to cargo traffic.
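The airport runway breakdowns listed above can be checked against the stated totals (318 paved, 221 unpaved); a quick tally:

```python
# Airports by longest-runway category, from the 2013 estimates above.
paved = {"over 3,047 m": 14, "2,438 to 3,047 m": 49, "1,524 to 2,437 m": 60,
         "914 to 1,523 m": 70, "under 914 m": 125}
unpaved = {"over 3,047 m": 0, "2,438 to 3,047 m": 0, "1,524 to 2,437 m": 1,
           "914 to 1,523 m": 35, "under 914 m": 185}

assert sum(paved.values()) == 318     # matches the stated paved total
assert sum(unpaved.values()) == 221   # matches the stated unpaved total
print(sum(paved.values()) + sum(unpaved.values()))  # 539 airports in all (2013 est.)
```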
Pipelines: oil 2,400 km (2013)
Ports and harbours: Berlin, Bonn, Brake, Bremen, Bremerhaven, Cologne, Dortmund, Dresden, Duisburg, Emden, Fürth, Hamburg, Karlsruhe, Kiel, Lübeck, Magdeburg, Mannheim, Nuremberg, Oldenburg, Rostock, Stuttgart, Wilhelmshaven
The port of Hamburg is the largest seaport in Germany, ranking third in Europe (after Rotterdam and Antwerpen) and 17th worldwide (2016) in total container traffic.
Intercity (IC) trains serve most major cities. Several extensions or upgrades to high-speed lines are under construction or planned for the near future, some of them after decades of planning. The fastest high-speed train operated by Deutsche Bahn, the InterCityExpress or ICE, connects major German and neighbouring international centres such as Zurich, Vienna, Copenhagen, Paris, Amsterdam and Brussels. The rail network throughout Germany is extensive and provides excellent service in most areas. On regular lines, at least one train every two hours will call even in the smallest of villages during the day. Nearly all larger metropolitan areas are served by S-Bahn, U-Bahn, Straßenbahn and/or bus networks.
The German government on 13 February 2018 announced plans to make public transportation free as a means to reduce road traffic and decrease air pollution to EU-mandated levels. The new policy was to be put to the test by the end of the year in the cities of Bonn, Essen, Herrenberg, Reutlingen and Mannheim. Issues remain concerning the costs of such a move, as ticket sales for public transportation constitute a major source of income for cities.
International freight trains
While Germany and most of contiguous Europe use standard gauge, differences in signalling, rules and regulations, electrification voltages, etc. create obstacles for freight operations across borders. These obstacles are slowly being overcome, with international (in- and outgoing) and transit (through) traffic being responsible for a large part of the recent uptake in rail freight volume. EU regulations have done much to harmonize standards, making cross-border operations easier. Maschen Marshalling Yard near Hamburg is the second-biggest in the world and the biggest in Europe. It serves as a freight hub distributing goods from Scandinavia to southern Europe and from Central Europe to the port of Hamburg and overseas.
Being a densely populated, prosperous country in the center of Europe, Germany carries many important transit routes. The Mannheim–Karlsruhe–Basel railway has undergone upgrades and refurbishments since the 1980s and will likely undergo further upgrades for decades to come, as it is the main route from the North Sea ports to northern Italy via the Gotthard Base Tunnel.
Many West German cities abandoned their previous tram systems in the 1960s and 1970s, while others upgraded them to "Stadtbahn" (~light rail) standard, often including underground sections. In the East, most cities retained or even expanded their tram systems, and since reunification a trend towards new tram construction can be observed in most of the country. Today the only major German city without a tram or light rail system is Hamburg. Tram-train systems like the Karlsruhe model first came to prominence in Germany in the early 1990s and have been implemented or are under discussion in several cities, providing coverage far into the rural areas surrounding cities.

Air transport
Short distances and the extensive network of motorways and railways make airplanes uncompetitive for travel within Germany. Only about 1% of all distance travelled was by plane in 2002. However, with the introduction of low-fare airlines and the resulting decline in prices, domestic air travel has become more attractive. In 2013 Germany had the fifth largest passenger air market in the world with 105,016,346 passengers. However, the advent of new, faster rail lines often leads to cuts in service by the airlines or even total abandonment of routes such as Frankfurt–Cologne, Berlin–Hannover or Berlin–Hamburg.

Airlines
see: List of airlines of Germany
Germany's largest airline is Lufthansa, which was privatised in the 1990s. Lufthansa also operates two regional subsidiaries under the Lufthansa Regional brand and a low-cost subsidiary, Eurowings, which operates independently. Lufthansa flies a dense network of domestic, European and intercontinental routes. Germany's second-largest airline was Air Berlin, which also operated a network of domestic and European destinations with a focus on leisure routes, as well as some long-haul services. Air Berlin declared bankruptcy in 2017, with the last flight under its own name in October of that year. Charter and leisure carriers include Condor, TUIfly, MHS Aviation and Sundair.
Major German cargo operators are Lufthansa Cargo, European Air Transport Leipzig (a subsidiary of DHL) and AeroLogic (jointly owned by DHL and Lufthansa Cargo).

Airports
see: List of airports in Germany
Frankfurt Airport is Germany's largest airport, a major transportation hub in Europe and the world's twelfth busiest airport. It is one of the airports with the largest number of international destinations served worldwide. Depending on whether total passengers, flights or cargo traffic is used as a measure, it ranks first, second or third in Europe alongside London Heathrow Airport and Paris-Charles de Gaulle Airport. Germany's second biggest international airport is Munich Airport, followed by Düsseldorf Airport. There are several more scheduled passenger airports throughout Germany, mainly serving European metropolitan and leisure destinations. Intercontinental long-haul routes are operated to and from the airports in Frankfurt, Munich, Düsseldorf, Berlin-Tegel, Cologne/Bonn, Hamburg and Stuttgart. Berlin Brandenburg Airport was expected to become the third largest German airport by annual passengers once it opened, serving as the single airport for Berlin. Originally planned to be completed in 2011, the new airport was delayed several times due to poor construction management and technical difficulties. As of September 2014 it was not known when the airport would become operational, and in 2017 it was announced that it would not open before 2019. In the same year a non-binding referendum to keep Tegel Airport open even after the new airport opens was passed by Berlin voters. BER eventually opened on 31 October 2020.

Airports — with paved runways:
total: 318
over 3,047 m: 14
2,438 to 3,047 m: 49
1,524 to 2,437 m: 60
914 to 1,523 m: 70
under 914 m: 125 (2013 est.)

Airports — with unpaved runways:
total: 221
over 3,047 m: 0
2,438 to 3,047 m: 0
1,524 to 2,437 m: 1
914 to 1,523 m: 35
under 914 m: 185 (2013 est.)

Heliports: 23 (2013 est.)
Water transport
Waterways: 7,467 km (2013); major rivers include the Rhine and Elbe. The Kiel Canal is an important connection between the Baltic Sea and the North Sea and one of the busiest waterways in the world; the Rhine–Main–Danube Canal links Rotterdam on the North Sea with the Black Sea. It passes through the highest point reachable from the sea by ocean-going vessels. The canal has gained importance for leisure cruises in addition to cargo traffic.

Pipelines: oil 2,400 km (2013)

Ports and harbours: Berlin, Bonn, Brake, Bremen, Bremerhaven, Cologne, Dortmund, Dresden, Duisburg, Emden, Fürth, Hamburg, Karlsruhe, Kiel, Lübeck, Magdeburg, Mannheim, Nuremberg, Oldenburg, Rostock, Stuttgart, Wilhelmshaven

The port of Hamburg is the largest sea harbour in Germany and ranks third in Europe (after Rotterdam and Antwerp) and 17th worldwide (2016) in total container traffic.

Merchant marine: total: 427 ships. Ships by type: barge carrier 2, bulk carrier 6, cargo ship 51, chemical tanker 15, container ship 298, liquefied gas carrier 6, passenger ship 4, petroleum tanker 10, refrigerated cargo 3, roll-on/roll-off ship 6 (2010 est.)

Ferries operate mostly between mainland Germany and its islands, serving both tourism and freight transport. Car ferries also operate across the Baltic Sea to the Nordic countries, Russia and the Baltic countries. Rail ferries operate across the Fehmarn Belt, from Rostock to Sweden (both carrying passenger trains) and from the Mukran port in Sassnitz on the island of Rügen to numerous Baltic Sea destinations (freight only).

See also
List of airports in Germany
Weimar Republic
Wehrmacht (1935–1945), armed forces of Nazi Germany
National People's Army (1956–1990), armed forces of the former German Democratic Republic (East Germany), prior to reunification
Bundeswehr (since 1955), current armed forces of Germany
Balkans (unlike other European powers, who first proposed a pro-Belgrade policy). This is why Serb authorities sometimes referred to "new German imperialism" as one of the main reasons for Yugoslavia's collapse. German troops participate in the multinational efforts to bring "peace and stability" to the Balkans.

Central Europe
Weimar Triangle (France, Germany and Poland); Germany continues to be active economically in the states of Central Europe and to actively support the development of democratic institutions. In the 2000s, Germany has arguably been the centrepiece of the European Union (though the importance of France cannot be overlooked in this connection).

See also
Anglo-German naval arms race
Human rights in Germany
List of diplomatic missions in Germany
List of diplomatic missions of Germany
Security issues in Germany
Sino-German cooperation (1911–1941)
Visa requirements for German citizens
and civil society groups increases the quality of conflict resolution, development cooperation and humanitarian aid for fragile states. The framework seeks to benefit from the expertise of the NGOs in exchange for these groups having a chance to influence foreign policy.

Disputes
In 2001, the discovery that the terrorist cell which carried out the attacks against the United States on 11 September 2001 was based in Hamburg sent shock waves through the country. The government of Chancellor Gerhard Schröder backed the subsequent U.S. military actions, sending Bundeswehr troops to Afghanistan to lead a joint NATO program to provide security in the country after the ousting of the Taliban. Nearly all of the public was strongly against America's 2003 invasion of Iraq and any deployment of troops. This position was shared by the SPD/Green government, which led to some friction with the United States. In August 2006, the German government disclosed a botched plot to bomb two German trains. The attack was to occur in July 2006 and involved a 21-year-old Lebanese man, identified only as Youssef Mohammed E. H. Prosecutors said Youssef and another man left suitcases stuffed with crude propane-gas bombs on the trains. As of February 2007, Germany had about 3,000 troops in the NATO-led International Security Assistance Force in Afghanistan as part of the War on Terrorism, the third largest contingent after the United States (14,000) and the United Kingdom (5,200). German forces are mostly in the more secure north of the country. However, Germany, along with some other larger European countries (with the exception of the UK and the Netherlands), has been criticised by the UK and Canada for not sharing the burden of the more intensive combat operations in southern Afghanistan.

Global initiatives
Humanitarian aid
Germany is the largest net contributor to the United Nations and has several development agencies working in Africa and the Middle East.
The development policy of the Federal Republic of Germany is an independent area of German foreign policy. It is formulated by the Federal Ministry for Economic Cooperation and Development (BMZ) and carried out by the implementing organisations. The German government sees development policy as a joint responsibility of the international community. Germany is the world's third biggest aid donor after the United States and France. It spent 0.37 per cent of its gross domestic product (GDP) on development, below the government's target of increasing aid to 0.51 per cent of GDP by 2010; the international target of 0.7% of GNP would not have been reached either.

International organizations
Germany is a member of the Council of Europe, European Union, European Space Agency, G4, G8, International Monetary Fund, NATO, OECD, Organization for Security and Co-operation in Europe, UN, World Bank Group and the World Trade Organization.

European Union
European integration has come a long way since the European Coal and Steel Community (ECSC) and the Elysée Treaty. Peaceful collaboration with its neighbours remains one of Germany's biggest political objectives, and Germany has been at the forefront of most achievements made in European integration:
Maastricht Treaty
European Defence Force
Introduction of the single currency, the euro
Most of the social issues facing European countries in general – immigration, aging populations, straining social-welfare and pension systems – are all important in Germany. Germany seeks to maintain peace through the "deepening" of integration among current European Union member states. Germany has been the largest net contributor to EU budgets for decades (in absolute terms – given Germany's comparatively large population – not per capita) and seeks to limit the growth of these net payments in the enlarged union.
European Constitution

NATO
Under the doctrine introduced by the 2003 Defense Policy Guidelines, Germany continues to give priority to the transatlantic partnership with the United States through the North Atlantic Treaty Organization. However, Germany is giving increasing attention to coordinating its policies with the European Union through the Common Foreign and Security Policy.

UN
The German Federal Government began an initiative to obtain a permanent seat in the United Nations Security Council, as part of the Reform of the United Nations. This would require approval of a two-thirds majority of the member states and approval of all five Security Council veto powers. This aspiration could be successful due to Germany's good relations with the People's Republic of China and the Russian Federation. Germany is a stable and democratic republic and a G7 country, which are also favourable attributes. The United Kingdom and France support German accession to the supreme body. The U.S. is sending mixed signals. NATO member
longest coalition talks in history, making the head of the party, Sigmar Gabriel, vice-chancellor and federal minister for economic affairs and energy. Together they held 504 of a total 631 seats (CDU/CSU 311 and SPD 193). The only two opposition parties were The Left (64 seats) and Alliance '90/The Greens (63 seats), a situation acknowledged as critical because the opposition parties did not even have enough seats to use the special controlling powers of the opposition.

2017–2021
The 19th federal elections in Germany took place on 24 September 2017. The two big parties, the conservative parliamentary group CDU/CSU and the social democratic SPD, were in a similar situation as in 2009, after the last grand coalition had ended, and both suffered severe losses, reaching their second-worst and worst results respectively in 2017. Many votes in the 2017 elections went to smaller parties and brought the right-wing populist party AfD (Alternative for Germany) into the Bundestag, which marked a major shift in German politics: it was the first far-right party to win seats in parliament since the 1950s. With Merkel's candidacy for a fourth term, the CDU/CSU reached only 33.0% of the votes but won the highest number of seats, leaving no realistic coalition option without the CDU/CSU. As all parties in the Bundestag strictly ruled out a coalition with the AfD, the only options for a majority coalition were a so-called "Jamaican" coalition (CDU/CSU, FDP, Greens; named after the party colours resembling those of the Jamaican flag) and a grand coalition with the SPD, which was at first opposed by the Social Democrats and their leader Martin Schulz.
Following this unprecedented situation, different minority coalitions or even fresh snap elections were heavily discussed for the first time in German history. At this point, Federal President Steinmeier invited the leaders of all parties for talks about a government, becoming the first president in the history of the Federal Republic to do so. Official coalition talks between CDU/CSU and SPD started in January 2018 and led to a renewal of the grand coalition on 12 March 2018, as well as the subsequent re-election of Angela Merkel as chancellor.

2021 onwards
Scheduled elections for the new Bundestag were held on 26 September 2021 during the COVID-19 pandemic. Angela Merkel did not stand for a fifth term but handed her post over after the second-longest term for a chancellor in German history. Olaf Scholz was sworn in as the new chancellor on 8 December 2021. His Social Democrats had won the largest share of the vote and formed a liberal-left coalition government with The Greens and the FDP.

Constitution
The "Basic Law for the Federal Republic of Germany" (Grundgesetz der Bundesrepublik Deutschland) is the Constitution of Germany. It was formally approved on 8 May 1949 and, with the signature of the Allies of World War II on 12 May, came into effect on 23 May as the constitution of those states of West Germany that were initially included within the Federal Republic. The 1949 Basic Law is a response to the perceived flaws of the 1919 Weimar Constitution, which failed to prevent the rise of the Nazi Party in 1933. Since 1990, in the course of the reunification process after the fall of the Berlin Wall, the Basic Law has also applied to the eastern states of the former German Democratic Republic.

Executive
Head of state
The German head of state is the federal president. As in Germany's parliamentary system of government, the federal chancellor runs the government and day-to-day politics, while the role of the federal president is mostly ceremonial.
The federal president, by their actions and public appearances, represents the state itself, its existence, its legitimacy, and unity. Their office involves an integrative role. Nearly all actions of the federal president become valid only after a countersignature of a government member. The president is not obliged by the Constitution to refrain from political views. He or she is expected to give direction to general political and societal debates, but not in a way that links them to party politics. Most German presidents were active politicians and party members prior to taking office, which means that they have to change their political style when becoming president. The office's official residence is Bellevue Palace. Under Article 59 (1) of the Basic Law, the federal president represents the Federal Republic of Germany in matters of international law, concludes treaties with foreign states on its behalf and accredits diplomats. All federal laws must be signed by the president before they can come into effect; he or she does not have a veto, but the conditions for refusing to sign a law on the basis of unconstitutionality are the subject of debate. The office is currently held by Frank-Walter Steinmeier (since 2017). The federal president does have a role in the political system, especially at the establishment of a new government and the dissolution of the Bundestag (parliament). This role is usually nominal but can become significant in case of political instability. Additionally, a federal president together with the Federal Council can support the government in a "legislative emergency state" to enable laws against the will of the Bundestag (Article 81 of the Basic Law). However, so far no federal president has ever had to use these "reserve powers".

Head of government
The Bundeskanzler (federal chancellor) heads the Bundesregierung (federal government) and thus the executive branch of the federal government.
They are elected by and responsible to the Bundestag, Germany's parliament. The other members of the government are the federal ministers; they are chosen by the chancellor. Germany, like the United Kingdom, can thus be classified as a parliamentary system. The office is currently held by Olaf Scholz (since 2021). The chancellor cannot be removed from office during a four-year term unless the Bundestag has agreed on a successor. This constructive vote of no confidence is intended to avoid a situation similar to that of the Weimar Republic, in which the executive did not have enough support in the legislature to govern effectively, but the legislature was too divided to name a successor. The current system also prevents the chancellor from calling a snap election. Except in the periods 1969–1972 and 1976–1982, when the Social Democratic Party of Chancellors Brandt and Schmidt came in second in the elections, the chancellor has always been the candidate of the largest party, usually supported by a coalition of two parties with a majority in the parliament. The chancellor appoints one of the federal ministers as their deputy, who holds the unofficial title of Vice Chancellor. The office is currently held by Robert Habeck (since 2021).

Cabinet
The German Cabinet (Bundeskabinett or Bundesregierung) is the chief executive body of the Federal Republic of Germany. It consists of the chancellor and the cabinet ministers. The fundamentals of the cabinet's organization are set down in articles 62–69 of the Basic Law. The current cabinet is the Scholz cabinet (since 2021).
Agencies
Agencies of the German government include:
Federal Intelligence Service (Bundesnachrichtendienst)
Federal Bureau of Aircraft Accident Investigation (Bundesstelle für Flugunfalluntersuchung)
Federal Aviation Office (Luftfahrt-Bundesamt)
Federal Bureau for Maritime Casualty Investigation (Bundesstelle für Seeunfalluntersuchung)
Federal Maritime and Hydrographic Agency (Bundesamt für Seeschifffahrt und Hydrographie)
Federal Railway Accident Investigation Board (Eisenbahn-Unfalluntersuchungsstelle des Bundes)
Federal Railway Authority (Eisenbahn-Bundesamt)

Legislature
Federal legislative power is divided between the Bundestag and the Bundesrat. The Bundestag is directly elected by the German people, while the Bundesrat represents the governments of the regional states (Länder). The federal legislature has powers of exclusive jurisdiction and concurrent jurisdiction with the states in areas specified in the constitution. The Bundestag is more powerful than the Bundesrat and only needs the latter's consent for proposed legislation related to revenue shared by the federal and state governments and the imposition of responsibilities on the states. In practice, however, the agreement of the Bundesrat in the legislative process is often required, since federal legislation frequently has to be executed by state or local agencies. In the event of disagreement between the Bundestag and the Bundesrat, either side can appeal to the Mediation Committee, a conference committee-like body of 16 Bundesrat and 16 Bundestag members, to find a compromise.

Bundestag
The Bundestag (Federal Diet) is elected for a four-year term and consists of 598 or more members elected by means of mixed-member proportional representation, which Germans call "personalised proportional representation". 299 members represent single-seat constituencies and are elected by a first-past-the-post electoral system.
Parties that obtain fewer constituency seats than their national share of the vote are allotted seats from party lists to make up the difference. In contrast, parties that obtain more constituency seats than their national share of the vote are allowed to keep these so-called overhang seats. In the parliament that was elected in 2009, there were 24 overhang seats.
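The interplay between list seats and overhang seats can be sketched in a few lines of Python. This is a simplified illustration with hypothetical vote numbers; the real Bundestag allocation also involves the Sainte-Laguë method, per-state party lists and (since 2013) leveling seats:

```python
# Simplified sketch of mixed-member proportional allocation with
# overhang seats (illustrative only; not the actual Bundestag formula).

def allocate(second_votes, constituency_seats, total_seats):
    """Return seats per party: proportional share of the party vote,
    but a party always keeps all constituency seats it won directly."""
    total_votes = sum(second_votes.values())
    result = {}
    for party, votes in second_votes.items():
        proportional = round(total_seats * votes / total_votes)
        direct = constituency_seats.get(party, 0)
        # Directly won seats in excess of the proportional share
        # are kept as overhang seats.
        result[party] = max(proportional, direct)
    return result

# Hypothetical example: party A wins more constituencies than its
# proportional entitlement, so parliament grows beyond 100 seats.
seats = allocate({"A": 40, "B": 35, "C": 25},
                 {"A": 45, "B": 30, "C": 10}, total_seats=100)
print(seats)  # {'A': 45, 'B': 35, 'C': 25} -> 5 overhang seats for A
```

Note how the overhang enlarges the chamber: the nominal 100 seats become 105, mirroring how the Bundestag can exceed its nominal 598 members.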
diagonals of a cyclic quadrilateral: Brahmagupta's theorem: If a cyclic quadrilateral has diagonals that are perpendicular to each other, then the perpendicular line drawn from the point of intersection of the diagonals to any side of the quadrilateral always bisects the opposite side. Chapter 12 also included a formula for the area of a cyclic quadrilateral (a generalization of Heron's formula), as well as a complete description of rational triangles (i.e. triangles with rational sides and rational areas). Brahmagupta's formula: The area, A, of a cyclic quadrilateral with sides of lengths a, b, c, d, respectively, is given by A = √((s − a)(s − b)(s − c)(s − d)), where s, the semiperimeter, is given by s = (a + b + c + d)/2. Brahmagupta's theorem on rational triangles: A triangle with rational sides and rational area has sides of the form a = u²/v + v, b = u²/w + w, c = u²/v + u²/w − (v + w), for some rational numbers u, v, and w. Chinese geometry The first definitive work (or at least the oldest extant) on geometry in China was the Mo Jing, the Mohist canon of the early philosopher Mozi (470-390 BC). It was compiled years after his death by his followers around the year 330 BC. Although the Mo Jing is the oldest extant book on geometry in China, there is the possibility that even older written material existed. However, due to the infamous Burning of the Books in a political maneuver by the Qin Dynasty ruler Qin Shihuang (r. 221-210 BC), multitudes of written literature created before his time were purged. In addition, the Mo Jing presents geometrical concepts in mathematics that are perhaps too advanced not to have had a previous geometrical base or mathematical background to work upon. The Mo Jing described various aspects of many fields associated with physical science, and provided a small wealth of information on mathematics as well. It provided an 'atomic' definition of the geometric point, stating that a line is separated into parts, and the part which has no remaining parts (i.e. cannot be divided into smaller parts) and thus forms the extreme end of a line is a point.
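Brahmagupta's formula quoted above is easy to check numerically, and setting one side to zero recovers Heron's formula for a triangle. A minimal sketch (the function name is mine, not historical notation):

```python
import math

def brahmagupta_area(a, b, c, d):
    # area of a cyclic quadrilateral with side lengths a, b, c, d
    s = (a + b + c + d) / 2                       # semiperimeter
    return math.sqrt((s - a) * (s - b) * (s - c) * (s - d))

# a 3 x 4 rectangle is cyclic, so the formula must give area 12
print(brahmagupta_area(3, 4, 3, 4))   # 12.0
# degenerate side d = 0 reduces to Heron's formula: 3-4-5 triangle, area 6
print(brahmagupta_area(3, 4, 5, 0))   # 6.0
```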
Much like Euclid's first and third definitions and Plato's 'beginning of a line', the Mo Jing stated that "a point may stand at the end (of a line) or at its beginning like a head-presentation in childbirth. (As to its invisibility) there is nothing similar to it." Similar to the atomists of Democritus, the Mo Jing stated that a point is the smallest unit, and cannot be cut in half, since 'nothing' cannot be halved. It stated that two lines of equal length will always finish at the same place, while providing definitions for the comparison of lengths and for parallels, along with principles of space and bounded space. It also described the fact that planes without the quality of thickness cannot be piled up since they cannot mutually touch. The book provided definitions for circumference, diameter, and radius, along with the definition of volume. The Han Dynasty (202 BC-220 AD) period of China witnessed a new flourishing of mathematics. One of the oldest Chinese mathematical texts to present geometric progressions was the Suàn shù shū of 186 BC, during the Western Han era. The mathematician, inventor, and astronomer Zhang Heng (78-139 AD) used geometrical formulas to solve mathematical problems. Although rough estimates for pi (π) were given in the Zhou Li (compiled in the 2nd century BC), it was Zhang Heng who was the first to make a concerted effort at creating a more accurate formula for pi. Zhang Heng approximated pi as 730/232 (or approx 3.1466), although he used another formula of pi in finding a spherical volume, using the square root of 10 (or approx 3.162) instead. Zu Chongzhi (429-500 AD) improved the accuracy of the approximation of pi to between 3.1415926 and 3.1415927, with 355⁄113 (密率, Milü, detailed approximation) and 22⁄7 (约率, Yuelü, rough approximation) being other notable approximations. In comparison to later works, the formula for pi given by the French mathematician Franciscus Vieta (1540-1603) fell halfway between Zu's approximations.
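The approximations above can be compared numerically, and bounds of Zu Chongzhi's kind arise from side-doubling of an inscribed polygon (the half-angle recurrence below is the standard modern formulation of that idea, not a quotation from the sources):

```python
import math

# compare the historical values against pi
for name, value in [
    ("Zhang Heng, 730/232", 730 / 232),
    ("sqrt(10)", math.sqrt(10)),
    ("Yuelu, 22/7", 22 / 7),
    ("Milu, 355/113", 355 / 113),
]:
    print(f"{name:>20}: {value:.7f}  error {abs(value - math.pi):.1e}")

def polygon_pi(doublings):
    # half-perimeter of a regular polygon inscribed in a unit circle,
    # starting from a hexagon (side length 1) and doubling the side count
    side, n = 1.0, 6
    for _ in range(doublings):
        side = math.sqrt(2 - math.sqrt(4 - side * side))  # half-angle relation
        n *= 2
    return n * side / 2

# 11 doublings give a 12288-sided polygon, enough for Zu's seven digits
print(polygon_pi(11))  # 3.1415926...
```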
The Nine Chapters on the Mathematical Art The Nine Chapters on the Mathematical Art, the title of which first appeared by 179 AD on a bronze inscription, was edited and commented on by the 3rd-century mathematician Liu Hui from the Kingdom of Cao Wei. This book included many problems where geometry was applied, such as finding surface areas for squares and circles, the volumes of solids in various three-dimensional shapes, and included the use of the Pythagorean theorem. The book provided an illustrated proof of the Pythagorean theorem, contained a written dialogue between the earlier Duke of Zhou and Shang Gao on the properties of the right-angled triangle and the Pythagorean theorem, while also referring to the astronomical gnomon, the circle and square, as well as measurements of heights and distances. The editor Liu Hui listed pi as 3.141014 by using a 192-sided polygon, and then calculated pi as 3.14159 using a 3072-sided polygon. This was more accurate than the value of his contemporary Wang Fan, a mathematician and astronomer from Eastern Wu, who rendered pi as 3.1555 by using 142⁄45. Liu Hui also wrote of mathematical surveying to calculate distance measurements of depth, height, width, and surface area. In terms of solid geometry, he figured out that a wedge with rectangular base and both sides sloping could be broken down into a pyramid and a tetrahedral wedge. He also figured out that a wedge with trapezoid base and both sides sloping could be made to give two tetrahedral wedges separated by a pyramid. Furthermore, Liu Hui described Cavalieri's principle on volume, as well as Gaussian elimination. The Nine Chapters also listed the following geometrical formulas that were known by the time of the Former Han Dynasty (202 BCE–9 CE).
Areas for the square, rectangle, circle, isosceles triangle, rhomboid, trapezoid, double trapezium, segment of a circle, and annulus ('ring' between two concentric circles). Volumes for the parallelepiped with two square surfaces, parallelepiped with no square surfaces, pyramid, frustum of pyramid with square base, frustum of pyramid with rectangular base of unequal sides, cube, prism, wedge with rectangular base and both sides sloping, wedge with trapezoid base and both sides sloping, tetrahedral wedge, frustum of a wedge of the second type (used for applications in engineering), cylinder, cone with circular base, frustum of a cone, and sphere. Continuing the geometrical legacy of ancient China, there were many later figures to come, including the famed astronomer and mathematician Shen Kuo (1031-1095 CE), Yang Hui (1238-1298) who discovered Pascal's Triangle, Xu Guangqi (1562-1633), and many others. Islamic Golden Age By the beginning of the 9th century, the "Islamic Golden Age" flourished, the establishment of the House of Wisdom in Baghdad marking a separate tradition of science in the medieval Islamic world, building not only on Hellenistic but also on Indian sources. Although the Islamic mathematicians are most famed for their work on algebra, number theory and number systems, they also made considerable contributions to geometry, trigonometry and mathematical astronomy, and were responsible for the development of algebraic geometry. Al-Mahani (born 820) conceived the idea of reducing geometrical problems such as duplicating the cube to problems in algebra. Al-Karaji (born 953) completely freed algebra from geometrical operations and replaced them with the arithmetical type of operations which are at the core of algebra today.
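Al-Mahani's reduction of a construction problem to algebra can be made concrete: duplicating a unit cube is exactly solving the cubic x³ = 2. A small sketch (bisection is my illustrative choice here, not a medieval method):

```python
def bisect_root(f, lo, hi, tol=1e-12):
    # simple bisection: assumes f(lo) < 0 < f(hi); halve the bracket
    # until it is smaller than the tolerance
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# duplicating the cube: the new edge x satisfies x**3 = 2
edge = bisect_root(lambda x: x ** 3 - 2, 1.0, 2.0)
print(edge)  # 1.2599210..., i.e. 2 ** (1/3)
```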
Thābit ibn Qurra (known as Thebit in Latin) (born 836) contributed to a number of areas in mathematics, where he played an important role in preparing the way for such important mathematical discoveries as the extension of the concept of number to (positive) real numbers, integral calculus, theorems in spherical trigonometry, analytic geometry, and non-Euclidean geometry. In astronomy Thabit was one of the first reformers of the Ptolemaic system, and in mechanics he was a founder of statics. An important geometrical aspect of Thabit's work was his book on the composition of ratios. In this book, Thabit deals with arithmetical operations applied to ratios of geometrical quantities. The Greeks had dealt with geometric quantities but had not thought of them in the same way as numbers to which the usual rules of arithmetic could be applied. By introducing arithmetical operations on quantities previously regarded as geometric and non-numerical, Thabit started a trend which led eventually to the generalisation of the number concept. In some respects, Thabit is critical of the ideas of Plato and Aristotle, particularly regarding motion. It would seem that here his ideas are based on an acceptance of using arguments concerning motion in his geometrical arguments. Another important contribution Thabit made to geometry was his generalization of the Pythagorean theorem, which he extended from special right triangles to all triangles in general, along with a general proof. Ibrahim ibn Sinan ibn Thabit (born 908), who introduced a method of integration more general than that of Archimedes, and al-Quhi (born 940) were leading figures in a revival and continuation of Greek higher geometry in the Islamic world. These mathematicians, and in particular Ibn al-Haytham, studied optics and investigated the optical properties of mirrors made from conic sections. Astronomy, time-keeping and geography provided other motivations for geometrical and trigonometrical research. 
For example, Ibrahim ibn Sinan and his grandfather Thabit ibn Qurra both studied curves required in the construction of sundials. Abu'l-Wafa and Abu Nasr Mansur both applied spherical geometry to astronomy. A 2007 paper in the journal Science suggested that girih tiles possessed properties consistent with self-similar fractal quasicrystalline tilings such as the Penrose tilings. Renaissance The transmission of the Greek Classics to medieval Europe via the Arabic literature of the 9th to 10th century "Islamic Golden Age" began in the 10th century and culminated in the Latin translations of the 12th century. A copy of Ptolemy's Almagest was brought back to Sicily by Henry Aristippus (d. 1162), as a gift from the Emperor to King William I (r. 1154–1166). An anonymous student at Salerno travelled to Sicily and translated the Almagest as well as several works by Euclid from Greek to Latin. Although the Sicilians generally translated directly from the Greek, when Greek texts were not available, they would translate from Arabic. Eugenius of Palermo (d. 1202) translated Ptolemy's Optics into Latin, drawing on his knowledge of all three languages in the task.
The rigorous deductive methods of geometry found in Euclid's Elements of Geometry were relearned, and further development of geometry in the styles of both Euclid (Euclidean geometry) and Khayyam (algebraic geometry) continued, resulting in an abundance of new theorems and concepts, many of them very profound and elegant. Advances in the treatment of perspective were made in Renaissance art of the 14th to 15th century which went beyond what had been achieved in antiquity. In Renaissance architecture of the Quattrocento, concepts of architectural order were explored and rules were formulated. A prime example is the Basilica di San Lorenzo in Florence by Filippo Brunelleschi (1377–1446). In c. 1413 Filippo Brunelleschi demonstrated the geometrical method of perspective, used today by artists, by painting the outlines of various Florentine buildings onto a mirror. Soon after, nearly every artist in Florence and in Italy used geometrical perspective in their paintings, notably Masolino da Panicale and Donatello. Melozzo da Forlì first used the technique of upward foreshortening (in Rome, Loreto, Forlì and others), and was celebrated for that. Not only was perspective a way of showing depth, it was also a new method of composing a painting. Paintings began to show a single, unified scene, rather than a combination of several. As shown by the quick proliferation of accurate perspective paintings in Florence, Brunelleschi likely understood (with help from his friend the mathematician Toscanelli), but did not publish, the mathematics behind perspective. Decades later, his friend Leon Battista Alberti wrote De pictura (1435/1436), a treatise on proper methods of showing distance in painting based on Euclidean geometry. Alberti was also trained in the science of optics through the school of Padua and under the influence of Biagio Pelacani da Parma who studied Alhazen's Optics. Piero della Francesca elaborated on De pictura in his De Prospectiva Pingendi in the 1470s.
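The geometrical perspective these treatises formalize amounts to central projection: a point is mapped to where its sight line crosses the picture plane. A minimal sketch (placing the eye at the origin and the plane at distance d are my illustrative assumptions):

```python
def project(point, d=1.0):
    # central projection onto a picture plane at depth d in front of the eye
    x, y, z = point   # z is the distance of the point from the viewer
    return (x * d / z, y * d / z)

# equally tall posts at doubling depths shrink geometrically on the canvas
for z in (2.0, 4.0, 8.0):
    print(project((1.0, 1.0, z)))  # (0.5, 0.5), (0.25, 0.25), (0.125, 0.125)
```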
Alberti had limited himself to figures on the ground plane and giving an overall basis for perspective. Della Francesca fleshed it out, explicitly covering solids in any area of the picture plane. Della Francesca also started the now common practice of using illustrated figures to explain the mathematical concepts, making his treatise easier to understand than Alberti's. Della Francesca was also the first to accurately draw the Platonic solids as they would appear in perspective. Perspective remained, for a while, the domain of Florence. Jan van Eyck,
enrolled at Yale College, where he took part in an accelerated program that enabled him to graduate in two and a half years rather than the usual four. He was a member of the Delta Kappa Epsilon fraternity and was elected its president. He also captained the Yale baseball team and played in the first two College World Series as a left-handed first baseman. Like his father, he was a member of the Yale cheerleading squad and was initiated into the Skull and Bones secret society. He graduated Phi Beta Kappa in 1948 with a Bachelor of Arts degree, majoring in economics and minoring in sociology. Business career (1948–1963) After graduating from Yale, Bush moved his young family to West Texas. Biographer Jon Meacham writes that Bush's relocation to Texas allowed him to move out of the "daily shadow of his Wall Street father and Grandfather Walker, two dominant figures in the financial world", but would still allow Bush to "call on their connections if he needed to raise capital." His first position in Texas was as an oil field equipment salesman for Dresser Industries, which was led by family friend Neil Mallon. While working for Dresser, Bush lived in various places with his family: Odessa, Texas; Ventura, Bakersfield and Compton, California; and Midland, Texas. In 1952, he volunteered for the successful presidential campaign of Republican candidate Dwight D. Eisenhower. That same year, his father won election to represent Connecticut in the United States Senate as a member of the Republican Party. With support from Mallon and Bush's uncle, George Herbert Walker Jr., Bush and John Overbey launched the Bush-Overbey Oil Development Company in 1951. In 1953 he co-founded the Zapata Petroleum Corporation, an oil company that drilled in the Permian Basin in Texas. In 1954, he was named president of the Zapata Offshore Company, a subsidiary which specialized in offshore drilling.
Shortly after the subsidiary became independent in 1959, Bush moved the company and his family from Midland to Houston. There, he befriended James Baker, a prominent attorney who later became an important political ally. Bush remained involved with Zapata until the mid-1960s, when he sold his stock in the company for approximately $1 million. In 1988, The Nation published an article alleging that Bush worked as an operative of the Central Intelligence Agency (CIA) during the 1960s; Bush denied this claim. Early political career (1963–1971) Entry into politics By the early 1960s, Bush was widely regarded as an appealing political candidate, and some leading Democrats attempted to convince Bush to become a Democrat. He declined to leave the Republican Party, later citing his belief that the national Democratic Party favored "big, centralized government". The Democratic Party had historically dominated Texas, but Republicans scored their first major victory in the state with John G. Tower's victory in a 1961 special election to the United States Senate. Motivated by Tower's victory, and hoping to prevent the far-right John Birch Society from coming to power, Bush ran for the chairmanship of the Harris County Republican Party, winning election in February 1963. Like most other Texas Republicans, Bush supported conservative Senator Barry Goldwater over the more centrist Nelson Rockefeller in the 1964 Republican Party presidential primaries. In 1964, Bush sought to unseat liberal Democrat Ralph W. Yarborough in Texas's U.S. Senate election. Bolstered by superior fundraising, Bush won the Republican primary by defeating former gubernatorial nominee Jack Cox in a run-off election. In the general election, Bush attacked Yarborough's vote for the Civil Rights Act of 1964, which banned racial and gender discrimination in public institutions and in many privately owned businesses. 
Bush argued that the act unconstitutionally expanded the powers of the federal government, but he was privately uncomfortable with the racial politics of opposing the act. He lost the election 56 percent to 44 percent, though he did run well ahead of Barry Goldwater, the Republican presidential nominee. Despite the loss, the New York Times reported that Bush was "rated by political friend and foe alike as the Republicans' best prospect in Texas because of his attractive personal qualities and the strong campaign he put up for the Senate". U.S. House of Representatives In 1966, Bush ran for the United States House of Representatives in Texas's 7th congressional district, a newly redistricted seat in the Greater Houston area. Initial polling showed him trailing his Democratic opponent, Harris County District Attorney Frank Briscoe, but he ultimately won the race with 57 percent of the vote. In an effort to woo potential candidates in the South and Southwest, House Republicans secured Bush an appointment to the powerful United States House Committee on Ways and Means, making Bush the first freshman to serve on the committee since 1904. His voting record in the House was generally conservative. He supported the Nixon administration's Vietnam policies, but broke with Republicans on the issue of birth control, which he supported. He also voted for the Civil Rights Act of 1968, although it was generally unpopular in his district. In 1968, Bush joined several other Republicans in issuing the party's Response to the State of the Union address; Bush's part of the address focused on a call for fiscal responsibility. Though most other Texas Republicans supported Ronald Reagan in the 1968 Republican Party presidential primaries, Bush endorsed Richard Nixon, who went on to win the party's nomination. Nixon considered selecting Bush as his running mate in the 1968 presidential election, but he ultimately chose Spiro Agnew instead. 
Bush won re-election to the House unopposed, while Nixon defeated Hubert Humphrey in the presidential election. In 1970, with President Nixon's support, Bush gave up his seat in the House to run for the Senate against Yarborough. Bush easily won the Republican primary, but Yarborough was defeated by the more conservative Lloyd Bentsen in the Democratic primary. Ultimately, Bentsen defeated Bush, taking 53.5 percent of the vote. Nixon and Ford administrations (1971–1977) Ambassador to the United Nations After the 1970 Senate election, Bush accepted a position as a senior adviser to the president, but he convinced Nixon to instead appoint him as the U.S. Ambassador to the United Nations. The position represented Bush's first foray into foreign policy, as well as his first major experiences with the Soviet Union and China, the two major U.S. rivals in the Cold War. During Bush's tenure, the Nixon administration pursued a policy of détente, seeking to ease tensions with both the Soviet Union and China. Bush's ambassadorship was marked by a defeat on the China question, as the United Nations General Assembly voted, in Resolution 2758, to expel the Republic of China and replace it with the People's Republic of China in October 1971. In the 1971 crisis in Pakistan, Bush supported an Indian motion at the UN General Assembly to condemn the Pakistani government of Yahya Khan for waging genocide in East Pakistan (modern Bangladesh), referring to the "tradition which we have supported that the human rights question transcended domestic jurisdiction and should be freely debated". Bush's support for India at the UN put him into conflict with Nixon who was supporting Pakistan, partly because Yahya Khan was a useful intermediary in his attempts to reach out to China and partly because the president was fond of Yahya Khan. 
Chairman of the Republican National Committee After Nixon won a landslide victory in the 1972 presidential election, he appointed Bush as chair of the Republican National Committee (RNC). In that position, he was charged with fundraising, candidate recruitment, and making appearances on behalf of the party in the media. When Agnew was being investigated for corruption, Bush assisted, at the request of Nixon and Agnew, in pressuring John Glenn Beall Jr., the U.S. Senator from Maryland, to restrain his brother, George Beall, the U.S. Attorney in Maryland, who was supervising the investigation into Agnew. Attorney Beall ignored the pressure. During Bush's tenure at the RNC, the Watergate scandal emerged into public view; the scandal originated from the June 1972 break-in at the headquarters of the Democratic National Committee, but also involved later efforts to cover up the break-in by Nixon and other members of the White House. Bush initially defended Nixon steadfastly, but as Nixon's complicity became clear he focused more on defending the Republican Party. Following the resignation of Vice President Agnew in 1973 for a scandal unrelated to Watergate, Bush was considered for the position of vice president, but the appointment instead went to Gerald Ford. After the public release of an audio recording that confirmed that Nixon had plotted to use the CIA to cover up the Watergate break-in, Bush joined other party leaders in urging Nixon to resign. When Nixon resigned on August 9, 1974, Bush noted in his diary that "There was an aura of sadness, like somebody died... The [resignation] speech was vintage Nixon—a kick or two at the press—enormous strains. One couldn't help but look at the family and the whole thing and think of his accomplishments and then think of the shame... [President Gerald Ford's swearing-in offered] indeed a new spirit, a new lift." Head of U.S.
Liaison Office in China Upon his accession to the presidency, Ford strongly considered Bush, Donald Rumsfeld, and Nelson Rockefeller for the vacant position of vice president. Ford ultimately chose Rockefeller, partly because of the publication of a news report claiming that Bush's 1970 campaign had benefited from a secret fund set up by Nixon; Bush was later cleared of any suspicion by a special prosecutor. Bush accepted appointment as Chief of the U.S. Liaison Office in the People's Republic of China, making him the de facto ambassador to China. According to biographer Jon Meacham, Bush's time in China convinced him that American engagement abroad was needed to ensure global stability, and that the United States "needed to be visible but not pushy, muscular but not domineering." Director of Central Intelligence In January 1976, Ford brought Bush back to Washington to become the Director of Central Intelligence (DCI), placing him in charge of the CIA. In the aftermath of the Watergate scandal and the Vietnam War, the CIA's reputation had been damaged for its role in various covert operations, and Bush was tasked with restoring the agency's morale and public reputation. During Bush's year in charge of the CIA, the U.S. national security apparatus actively supported Operation Condor operations and right-wing military dictatorships in Latin America. Meanwhile, Ford decided to drop Rockefeller from the ticket for the 1976 presidential election; he considered Bush as his running mate, but ultimately chose Bob Dole. In his capacity as DCI, Bush gave national security briefings to Jimmy Carter both as a presidential candidate and as president-elect. 1980 presidential election Bush's tenure at the CIA ended after Carter narrowly defeated Ford in the 1976 presidential election. Out of public office for the first time since the 1960s, Bush became chairman of the executive committee of the First International Bank in Houston.
He also spent a year as a part-time professor of Administrative Science at Rice University's Jones School of Business, continued his membership in the Council on Foreign Relations, and joined the Trilateral Commission. Meanwhile, he began to lay the groundwork for his candidacy in the 1980 Republican Party presidential primaries. In the 1980 Republican primary campaign, Bush faced Ronald Reagan, who was widely regarded as the front-runner, as well as other contenders like Senator Bob Dole, Senator Howard Baker, Texas Governor John Connally, Congressman Phil Crane, and Congressman John B. Anderson. Bush's campaign cast him as a youthful, "thinking man's candidate" who would emulate the pragmatic conservatism of President Eisenhower. In the midst of the Soviet–Afghan War, which brought an end to a period of détente, and the Iran hostage crisis, in which 52 Americans were taken hostage, the campaign highlighted Bush's foreign policy experience. At the outset of the race, Bush focused heavily on winning the January 21 Iowa caucuses, making 31 visits to the state. He won a close victory in Iowa with 31.5% to Reagan's 29.4%. After the win, Bush stated that his campaign was full of momentum, or "the Big Mo", and Reagan reorganized his campaign. Partly in response to the Bush campaign's frequent questioning of Reagan's age (Reagan turned 69 in 1980), the Reagan campaign stepped up attacks on Bush, painting him as an elitist who was not truly committed to conservatism. Prior to the New Hampshire primary, Bush and Reagan agreed to a two-person debate, organized by The Nashua Telegraph but paid for by the Reagan campaign. Days before the debate, Reagan announced that he would invite four other candidates to the debate; Bush, who had hoped that the one-on-one debate would allow him to emerge as the main alternative to Reagan in the primaries, refused to debate the other candidates. 
All six candidates took the stage, but Bush refused to speak in the presence of the other candidates. Ultimately, the other four candidates left the stage and the debate continued, but Bush's refusal to debate anyone other than Reagan badly damaged his campaign in New Hampshire. He ended up decisively losing New Hampshire's primary to Reagan, winning just 23 percent of the vote. Bush revitalized his campaign with a victory in Massachusetts, but lost the next several primaries. As Reagan built up a commanding delegate lead, Bush refused to end his campaign, but the other candidates dropped out of the race. Criticizing his more conservative rival's policy proposals, Bush famously labeled Reagan's supply-side-influenced plans for massive tax cuts as "voodoo economics". Though he favored lower taxes, Bush feared that dramatic reductions in taxation would lead to deficits and, in turn, cause inflation. After Reagan clinched a majority of delegates in late May, Bush reluctantly dropped out of the race. At the 1980 Republican National Convention, Reagan made the last-minute decision to select Bush as his vice presidential nominee after negotiations with Ford regarding a Reagan–Ford ticket collapsed. Though Reagan had resented many of the Bush campaign's attacks during the primary campaign, and several conservative leaders had actively opposed Bush's nomination, Reagan ultimately decided that Bush's popularity with moderate Republicans made him the best and safest pick. Bush, who had believed his political career might be over following the primaries, eagerly accepted the position and threw himself into campaigning for the Reagan–Bush ticket. The 1980 general election campaign between Reagan and Carter was conducted amid a multitude of domestic concerns and the ongoing Iran hostage crisis, and Reagan sought to focus the race on Carter's handling of the economy.
Though the race was widely regarded as a close contest for most of the campaign, Reagan ultimately won over the large majority of undecided voters. Reagan took 50.7 percent of the popular vote and 489 of the 538 electoral votes, while Carter won 41% of the popular vote and John Anderson, running as an independent candidate, won 6.6% of the popular vote. Vice Presidency (1981–1989) As vice president, Bush generally maintained a low profile, recognizing the constitutional limits of the office; he avoided decision-making or criticizing Reagan in any way. This approach helped him earn Reagan's trust, easing tensions left over from their earlier rivalry. Bush also generally enjoyed a good relationship with Reagan staffers, including his close friend Jim Baker, who served as Reagan's initial chief of staff. His understanding of the vice presidency was heavily influenced by Vice President Walter Mondale, who enjoyed a strong relationship with President Carter in part because of his ability to avoid confrontations with senior staff and Cabinet members, and by Vice President Nelson Rockefeller's difficult relationship with some members of the White House staff during the Ford administration. The Bushes attended a large number of public and ceremonial events in their positions, including many state funerals, which became a common joke for comedians. As the President of the Senate, Bush also stayed in contact with members of Congress and kept the president informed on occurrences on Capitol Hill. First term On March 30, 1981, while Bush was in Texas, Reagan was shot and seriously wounded by John Hinckley Jr. Bush immediately flew back to Washington D.C.; when his plane landed, his aides advised him to proceed directly to the White House by helicopter in order to show that the government was still functioning. Bush rejected the idea, as he feared that such a dramatic scene risked giving the impression that he sought to usurp Reagan's powers and prerogatives. 
During Reagan's short period of incapacity, Bush presided over Cabinet meetings, met with congressional leaders and foreign leaders, and briefed reporters, but he consistently rejected the possibility of invoking the Twenty-fifth Amendment. Bush's handling of the attempted assassination and its aftermath made a positive impression on Reagan, who recovered and returned to work within two weeks of the shooting. From then on, the two men would have regular Thursday lunches in the Oval Office. Bush was assigned by Reagan to chair two special task forces, one on deregulation and one on international drug smuggling. Both were popular issues with conservatives, and Bush, largely a moderate, began courting them through his work. The deregulation task force reviewed hundreds of rules, making specific recommendations on which ones to amend or revise, in order to curb the size of the federal government. The Reagan administration's deregulation push had a strong impact on broadcasting, finance, resource extraction, and other economic activities, and the administration eliminated numerous government positions. Bush also oversaw the administration's national security crisis management organization, which had traditionally been the responsibility of the National Security Advisor. In 1983, Bush toured Western Europe as part of the Reagan administration's ultimately successful efforts to convince skeptical NATO allies to support the deployment of Pershing II missiles. Reagan's approval ratings fell after his first year in office, but they bounced back when the United States began to emerge from recession in 1983. Former Vice President Walter Mondale was nominated by the Democratic Party in the 1984 presidential election. Down in the polls, Mondale selected Congresswoman Geraldine Ferraro as his running mate in hopes of galvanizing support for his campaign, thus making Ferraro the first female major party vice presidential nominee in U.S. history. 
She and Bush squared off in a single televised vice presidential debate. Public opinion polling consistently showed a Reagan lead in the 1984 campaign, and Mondale was unable to shake up the race. In the end, Reagan won re-election, winning 49 of 50 states and receiving 59% of the popular vote to Mondale's 41%. Second term Mikhail Gorbachev came to power in the Soviet Union in 1985. Rejecting the ideological rigidity of his three elderly, ailing predecessors, Gorbachev insisted on urgently needed economic and political reforms called "glasnost" (openness) and "perestroika" (restructuring). At the 1987 Washington Summit, Gorbachev and Reagan signed the Intermediate-Range Nuclear Forces Treaty, which committed both signatories to the total abolition of their respective short-range and medium-range missile stockpiles. The treaty marked the beginning of a new era of trade, openness, and cooperation between the two powers. President Reagan and Secretary of State George Shultz took the lead in these negotiations, but Bush sat in on many meetings. Bush did not agree with many of the Reagan policies, but he did tell Gorbachev that he would seek to continue improving relations if he succeeded Reagan. On July 13, 1985, Bush became the first vice president to serve as acting president when Reagan underwent surgery to remove polyps from his colon; Bush served as acting president for approximately eight hours. In 1986, the Reagan administration was shaken by a scandal when it was revealed that administration officials had secretly arranged arms sales to Iran during the Iran–Iraq War. The officials had used the proceeds to fund the Contra rebels in their fight against the leftist Sandinista government in Nicaragua. Democrats in Congress had passed a law providing that appropriated funds could not be used to help the Contras; instead, the administration used non-appropriated funds from the sales.
When news of the affair broke to the media, Bush stated that he had been "out of the loop" and unaware of the diversion of funds. Biographer Jon Meacham writes that "no evidence was ever produced proving Bush was aware of the diversion to the contras," but he criticizes Bush's "out of the loop" characterization, writing that the "record is clear that Bush was aware that the United States, in contravention of its own stated policy, was trading arms for hostages". The Iran–Contra scandal, as it became known, did serious damage to the Reagan presidency, raising questions about Reagan's competency. Congress established the Tower Commission to investigate the scandal, and, at Reagan's request, a panel of federal judges appointed Lawrence Walsh as a special prosecutor charged with investigating the Iran–Contra scandal. The investigations continued after Reagan left office and, though Bush was never charged with a crime, the Iran–Contra scandal would remain a political liability for him. On July 3, 1988, the guided missile cruiser USS Vincennes accidentally shot down Iran Air Flight 655, killing all 290 people aboard. Bush, then vice president, defended his country at the UN by arguing that the U.S. attack had been a wartime incident and that the crew of the Vincennes had acted appropriately given the situation. 1988 presidential election Bush began planning for a presidential run after the 1984 election, and he officially entered the 1988 Republican Party presidential primaries in October 1987. He put together a campaign led by Reagan staffer Lee Atwater, which also included his son, George W. Bush, and media consultant Roger Ailes. Though he had moved to the right during his time as vice president, endorsing a Human Life Amendment and repudiating his earlier comments on "voodoo economics," Bush still faced opposition from many conservatives in the Republican Party.
His major rivals for the Republican nomination were Senate Minority Leader Bob Dole of Kansas, Congressman Jack Kemp of New York, and Christian televangelist Pat Robertson. Reagan did not publicly endorse any candidate, but he privately expressed support for Bush. Though considered the early front-runner for the nomination, Bush came in third in the Iowa caucus, behind Dole and Robertson. Much as Reagan had done in 1980, Bush reorganized his staff and concentrated on the New Hampshire primary. With help from Governor John H. Sununu and an effective campaign attacking Dole for raising taxes, Bush overcame an initial polling deficit and won New Hampshire with 39 percent of the vote. After Bush won South Carolina and 16 of the 17 states holding a primary on Super Tuesday, his competitors dropped out of the race. Bush, occasionally criticized for his lack of eloquence when compared to Reagan, delivered a well-received speech at the Republican convention. Known as the "thousand points of light" speech, it described Bush's vision of America: he endorsed the Pledge of Allegiance, prayer in schools, capital punishment, and gun rights. Bush also pledged that he would not raise taxes, stating: "Congress will push me to raise taxes, and I'll say no, and they'll push, and I'll say no, and they'll push again. And all I can say to them is: read my lips. No new taxes." Bush selected little-known Senator Dan Quayle of Indiana as his running mate. Though Quayle had compiled an unremarkable record in Congress, he was popular among many conservatives, and the campaign hoped that Quayle's youth would appeal to younger voters. Meanwhile, the Democratic Party nominated Governor Michael Dukakis, who was known for presiding over an economic turnaround in Massachusetts. Leading in the general election polls against Bush, Dukakis ran an ineffective, low-risk campaign. 
The Bush campaign attacked Dukakis as an unpatriotic liberal extremist and seized on the Willie Horton case, in which a convicted felon from Massachusetts raped a woman while on a prison furlough, a program Dukakis supported as governor. The Bush campaign charged that Dukakis presided over a "revolving door" that allowed dangerous convicted felons to leave prison. Dukakis damaged his own campaign with a widely mocked ride in an M1 Abrams tank and a poor performance at the second presidential debate. Bush also attacked Dukakis for opposing a law that would require all students to recite the Pledge of Allegiance. The election is widely considered to have had a high level of negative campaigning, though political scientist John Geer has argued that the share of negative ads was in line with previous presidential elections. Bush defeated Dukakis by a margin of 426 to 111 in the Electoral College, and he took 53.4 percent of the national popular vote. Bush ran well in all the major regions of the country, but especially in the South. He became the fourth sitting vice president to be elected president, the first to do so since Martin Van Buren in 1836, and the first person to succeed a president of his own party through election since Herbert Hoover in 1929. In the concurrent congressional elections, Democrats retained control of both houses of Congress. Presidency (1989–1993) Bush was inaugurated on January 20, 1989, succeeding Ronald Reagan. In his inaugural address, Bush said: Bush's first major appointment was that of James Baker as Secretary of State. Leadership of the Department of Defense went to Dick Cheney, who had previously served as Gerald Ford's chief of staff and would later serve as vice president under his son George W. Bush. Jack Kemp joined the administration as Secretary of Housing and Urban Development, while Elizabeth Dole, the wife of Bob Dole and a former Secretary of Transportation, became the Secretary of Labor under Bush.
Bush retained several Reagan officials, including Secretary of the Treasury Nicholas F. Brady, Attorney General Dick Thornburgh, and Secretary of Education Lauro Cavazos. New Hampshire Governor John Sununu, a strong supporter of Bush during the 1988 campaign, became chief of staff. Brent Scowcroft was appointed as the National Security Advisor, a role he had also held under Ford. Foreign affairs End of the Cold War During the first year of his tenure, Bush put a pause on Reagan's détente policy toward the USSR. Bush and his advisers were initially divided on Gorbachev; some administration officials saw him as a democratic reformer, but others suspected him of trying to make the minimum changes necessary to restore the Soviet Union to a competitive position with the United States. In 1989, all the Communist governments collapsed in Eastern Europe. Gorbachev declined to send in the Soviet military, effectively abandoning the Brezhnev Doctrine. The U.S. was not directly involved in these upheavals, but the Bush administration avoided gloating over the demise of the Eastern Bloc to avoid undermining further democratic reforms. Bush and Gorbachev met at the Malta Summit in December 1989. Though many on the right remained wary of Gorbachev, Bush came away with the belief that Gorbachev would negotiate in good faith. For the
remainder of his term, Bush sought cooperative relations with Gorbachev, believing that he was the key to peace. The primary issue at the Malta Summit was the potential reunification of Germany. While Britain and France were wary of a reunified Germany, Bush joined West German Chancellor Helmut Kohl in pushing for German reunification. Bush believed that a reunified Germany would serve American interests. After extensive negotiations, Gorbachev agreed to allow a reunified Germany to be a part of NATO, and Germany officially reunified in October 1990 after paying billions of marks to Moscow. Gorbachev used force to suppress nationalist movements within the Soviet Union itself. A crisis in Lithuania left Bush in a difficult position, as he needed Gorbachev's cooperation in the reunification of Germany and feared that the collapse of the Soviet Union could leave nuclear arms in dangerous hands. The Bush administration mildly protested Gorbachev's suppression of Lithuania's independence movement, but took no action to directly intervene.
Bush warned independence movements of the disorder that could come with secession from the Soviet Union; in a 1991 address that critics labeled the "Chicken Kiev speech", he cautioned against "suicidal nationalism". In July 1991, Bush and Gorbachev signed the Strategic Arms Reduction Treaty (START I), in which both countries agreed to cut their strategic nuclear weapons by 30 percent. In August 1991, hard-line Communists launched a coup against Gorbachev; while the coup quickly fell apart, it broke the remaining power of Gorbachev and the central Soviet government. Later that month, Gorbachev resigned as general secretary of the Communist Party, and Russian president Boris Yeltsin ordered the seizure of Soviet property. Gorbachev clung to power as the President of the Soviet Union until December 1991, when the Soviet Union dissolved. Fifteen states emerged from the Soviet Union, and of those states, Russia was the largest and most populous. Bush and Yeltsin met in February 1992, declaring a new era of "friendship and partnership". In January 1993, Bush and Yeltsin agreed to START II, which provided for further nuclear arms reductions on top of the original START treaty. The collapse of the Soviet Union prompted reflections on the future of the world following the end of the Cold War; one political scientist, Francis Fukuyama, speculated that humanity had reached the "end of history" in that liberal, capitalist democracy had permanently triumphed over Communism and fascism. Meanwhile, the collapse of the Soviet Union and other Communist governments led to post-Soviet conflicts in Central Europe, Eastern Europe, Central Asia, and Africa that would continue long after Bush left office. Invasion of Panama Through the late 1980s, the U.S. provided aid to Manuel Noriega, the anti-Communist leader of Panama.
Noriega had long-standing ties to United States intelligence agencies, including during Bush's tenure as Director of Central Intelligence, and was also deeply involved in drug trafficking. In May 1989, Noriega annulled the results of a democratic presidential election in which Guillermo Endara had been elected. Bush objected to the annulment of the election and worried about the status of the Panama Canal with Noriega still in office. Bush dispatched 2,000 soldiers to the country, where they began conducting regular military exercises in violation of prior treaties. After a U.S. serviceman was shot by Panamanian forces in December 1989, Bush ordered the United States invasion of Panama, known as "Operation Just Cause". The invasion was the first large-scale American military operation in more than 40 years that was not related to the Cold War. American forces quickly took control of the Panama Canal Zone and Panama City. Noriega surrendered on January 3, 1990, and was quickly transported to a prison in the United States. Twenty-three Americans died in the operation, while another 394 were wounded. Noriega was convicted and imprisoned on racketeering and drug trafficking charges in April 1992. Historian Stewart Brewer argues that the invasion "represented a new era in American foreign policy" because Bush did not justify the invasion under the Monroe Doctrine or the threat of Communism, but rather on the grounds that it was in the best interests of the United States. Gulf War Faced with massive debts and low oil prices in the aftermath of the Iran–Iraq War, Iraqi leader Saddam Hussein decided to conquer Kuwait, a small, oil-rich country situated on Iraq's southern border. After Iraq invaded Kuwait in August 1990, Bush imposed economic sanctions on Iraq and assembled a multi-national coalition opposed to the invasion.
The administration feared that a failure to respond to the invasion would embolden Hussein to attack Saudi Arabia or Israel, and wanted to discourage other countries from similar aggression. Bush also wanted to ensure continued access to oil, as Iraq and Kuwait collectively accounted for 20 percent of the world's oil production, and Saudi Arabia produced another 26 percent of the world's oil supply. At Bush's insistence, in November 1990, the United Nations Security Council approved a resolution authorizing the use of force if Iraq did not withdraw from Kuwait by January 15, 1991. Gorbachev's support, as well as China's abstention, helped ensure passage of the UN resolution. Bush convinced Britain, France, and other nations to commit soldiers to an operation against Iraq, and he won important financial backing from Germany, Japan, South Korea, Saudi Arabia, and the United Arab Emirates. In January 1991, Bush asked Congress to approve a joint resolution authorizing a war against Iraq. Bush believed that the UN resolution had already provided him with the necessary authorization to launch a military operation against Iraq, but he wanted to show that the nation was united behind a military action. Despite the opposition of a majority of Democrats in both the House and the Senate, Congress approved the Authorization for Use of Military Force Against Iraq Resolution of 1991. After the January 15 deadline passed without an Iraqi withdrawal from Kuwait, U.S. and coalition forces conducted a bombing campaign that devastated Iraq's power grid and communications network, and resulted in the desertion of about 100,000 Iraqi soldiers. In retaliation, Iraq launched Scud missiles at Israel and Saudi Arabia, but most of the missiles did little damage. On February 23, coalition forces began a ground invasion into Kuwait, evicting Iraqi forces by the end of February 27. 
About 300 Americans, as well as approximately 65 soldiers from other coalition nations, died during the military action. A ceasefire was arranged on March 3, and the UN passed a resolution establishing a peacekeeping force in a demilitarized zone between Kuwait and Iraq. A March 1991 Gallup poll showed that Bush had an approval rating of 89 percent, the highest presidential approval rating in the history of Gallup polling. After 1991, the UN maintained economic sanctions against Iraq, and the United Nations Special Commission was assigned to ensure that Iraq did not revive its weapons of mass destruction program. NAFTA In 1987, the U.S. and Canada had reached a free trade agreement that eliminated many tariffs between the two countries. President Reagan had intended it as the first step towards a larger trade agreement to eliminate most tariffs among the United States, Canada, and Mexico. The Bush administration, along with the Progressive Conservative Canadian Prime Minister Brian Mulroney, spearheaded the negotiations of the North American Free Trade Agreement (NAFTA) with Mexico. In addition to lowering tariffs, the proposed treaty would affect patents, copyrights, and trademarks. In 1991, Bush sought fast track authority, which grants the president the power to submit an international trade agreement to Congress without the possibility of amendment. Despite congressional opposition led by House Majority Leader Dick Gephardt, both houses of Congress voted to grant Bush fast track authority. NAFTA was signed in December 1992, after Bush lost re-election, but President Clinton won ratification of NAFTA in 1993. NAFTA remains controversial for its impact on wages, jobs, and overall economic growth. Domestic affairs Economy and fiscal issues The U.S. economy had generally performed well since emerging from recession in late 1982, but it slipped into a mild recession in 1990. The unemployment rate rose from 5.9 percent in 1989 to a high of 7.8 percent in mid-1992.
Large federal deficits, spawned during the Reagan years, rose from $152.1 billion in 1989 to $220 billion for 1990; the $220 billion deficit represented a threefold increase since 1980. As the public became increasingly concerned about the economy and other domestic affairs, Bush's well-received handling of foreign affairs became less of an issue for most voters. Bush's top domestic priority was to bring an end to federal budget deficits, which he saw as a liability for the country's long-term economic health and standing in the world. As he was opposed to major defense spending cuts and had pledged not to raise taxes, the president had major difficulties in balancing the budget. Bush and congressional leaders agreed to avoid major changes to the budget for fiscal year 1990, which began in October 1989. However, both sides knew that spending cuts or new taxes would be necessary in the following year's budget in order to avoid the draconian automatic domestic spending cuts required by the Gramm–Rudman–Hollings Balanced Budget Act of 1987. Bush and other leaders also wanted to cut deficits because Federal Reserve Chair Alan Greenspan refused to lower interest rates, and thus stimulate economic growth, unless the federal budget deficit was reduced. In a statement released in late June 1990, Bush said that he would be open to a deficit reduction program which included spending cuts, incentives for economic growth, budget process reform, and tax increases. To fiscal conservatives in the Republican Party, Bush's statement represented a betrayal, and they heavily criticized him for compromising so early in the negotiations. In September 1990, Bush and Congressional Democrats announced a compromise to cut funding for mandatory and discretionary programs while also raising revenue, partly through a higher gas tax. The compromise additionally included a "pay as you go" provision that required that new programs be paid for at the time of implementation.
House Minority Whip Newt Gingrich led the conservative opposition to the bill, strongly opposing any form of tax increase. Some liberals also criticized the budget cuts in the compromise, and in October, the House rejected the deal, resulting in a brief government shutdown. Without the strong backing of the Republican Party, Bush agreed to another compromise bill, this one more favorable to Democrats. The Omnibus Budget Reconciliation Act of 1990 (OBRA-90), enacted on October 27, 1990, dropped much of the gasoline tax increase in favor of higher income taxes on top earners. It included cuts to domestic spending, but the cuts were not as deep as those that had been proposed in the original compromise. Bush's decision to sign the bill damaged his standing with conservatives and the general public, but it also laid the groundwork for the budget surpluses of the late 1990s. Discrimination The disabled had not received legal protections under the landmark Civil Rights Act of 1964, and many faced discrimination and segregation by the time Bush took office. In 1988, Lowell P. Weicker Jr. and Tony Coelho had introduced the Americans with Disabilities Act, which barred employment discrimination against qualified individuals with disabilities. The bill had passed the Senate but not the House, and it was reintroduced in 1989. Though some conservatives opposed the bill due to its costs and potential burdens on businesses, Bush strongly supported it, partly because his son, Neil, had struggled with dyslexia. After the bill passed both houses of Congress, Bush signed the Americans with Disabilities Act of 1990 into law in July 1990. The act required employers and public accommodations to make "reasonable accommodations" for the disabled, while providing an exception when such accommodations imposed an "undue hardship". Senator Ted Kennedy later led the congressional passage of a separate civil rights bill designed to facilitate launching employment discrimination lawsuits. 
In vetoing the bill, Bush argued that it would lead to racial quotas in hiring. In November 1991, Bush signed the Civil Rights Act of 1991, which was largely similar to the bill he had vetoed in the previous year. In August 1990, Bush signed the Ryan White CARE Act, the largest federally funded program dedicated to assisting persons living with HIV/AIDS. Throughout his presidency, the AIDS epidemic grew dramatically in the U.S. and around the world, and Bush often found himself at odds with AIDS activist groups who criticized him for not placing a high priority on HIV/AIDS research and funding. Frustrated by the administration's lack of urgency on the issue, ACT UP dumped the ashes of HIV/AIDS victims on the White House lawn during a viewing of the AIDS Quilt in 1992. By that time, HIV had become the leading cause of death in the U.S. for men aged 25–44. Environment In June 1989, the Bush administration proposed a bill to amend the Clean Air Act. Working with Senate Majority Leader George J. Mitchell, the administration won passage of the amendments over the opposition of business-aligned members of Congress who feared the impact of tougher regulations. The legislation sought to curb acid rain and smog by requiring decreased emissions of chemicals such as sulfur dioxide, and was the first major update to the Clean Air Act since 1977. Bush also signed the Oil Pollution Act of 1990 in response to the Exxon Valdez oil spill. However, the League of Conservation Voters criticized some of Bush's other environmental actions, including his opposition to stricter auto-mileage standards. Points of Light President Bush devoted attention to voluntary service as a means of solving some of America's most serious social problems. He often used the "thousand points of light" theme to describe the power of citizens to solve community problems.
In his 1989 inaugural address, President Bush said, "I have spoken of a thousand points of light, of all the community organizations that are spread like stars throughout the Nation, doing good." During his presidency, Bush honored numerous volunteers with the Daily Point of Light Award, a tradition that was continued by his presidential successors. In 1990, the Points of Light Foundation was created as a nonprofit organization in Washington to promote this spirit of volunteerism. In 2007, the Points of Light Foundation merged with the Hands On Network to create a new organization, Points of Light. Judicial appointments Bush appointed two justices to the Supreme Court of the United States. In 1990, Bush appointed a largely unknown state appellate judge, David Souter, to replace liberal icon William Brennan. Souter was easily confirmed and served until 2009, but joined the liberal bloc of the court, disappointing Bush. In 1991, Bush nominated conservative federal judge Clarence Thomas to succeed Thurgood Marshall, a long-time liberal stalwart. Thomas, the former head of the Equal Employment Opportunity Commission (EEOC), faced heavy opposition in the Senate, as well as from pro-choice groups and the NAACP. His nomination faced another difficulty when Anita Hill accused Thomas of having sexually harassed her during his time as the chair of EEOC. Thomas won confirmation in a narrow 52–48 vote; 43 Republicans and 9 Democrats voted to confirm Thomas's nomination, while 46 Democrats and 2 Republicans voted against confirmation. Thomas became one of the most conservative justices of his era. Other issues Bush's education platform consisted mainly of offering federal support for a variety of innovations, such as open enrollment, incentive pay for outstanding teachers, and rewards for schools that improve performance with underprivileged children. 
Though Bush did not pass a major educational reform package during his presidency, his ideas influenced later reform efforts, including Goals 2000 and the No Child Left Behind Act. Bush signed the Immigration Act of 1990, which led to a 40 percent increase in legal immigration to the United States. The act more than doubled the number of visas given to immigrants on the basis of job skills. In the wake of the savings and loan crisis, Bush proposed a $50 billion package to rescue the savings and loan industry, and also proposed the creation of the Office of Thrift Supervision to regulate the industry. Congress passed the Financial Institutions Reform, Recovery, and Enforcement Act of 1989, which incorporated most of Bush's proposals. Public image Bush was widely seen as a "pragmatic caretaker" president who lacked a unified and compelling long-term theme in his efforts. Indeed, Bush's sound bite in which he referred to the issue of overarching purpose as "the vision thing" has become a metonym applied to other political figures accused of similar difficulties. His ability to gain broad international support for the Gulf War and the war's result were seen as both a diplomatic and military triumph, rousing bipartisan approval, though his decision to withdraw without removing Saddam Hussein left mixed feelings, and attention returned to the domestic front and a souring economy. A New York Times article mistakenly depicted Bush as being surprised to see a supermarket barcode reader; the report of his reaction exacerbated the notion that he was "out of touch". Amid the early 1990s recession, his image shifted from "conquering hero" to "politician befuddled by economic matters". At the elite level, a number of commentators and political experts deplored the state of American politics in 1991–1992 and reported that voters were angry. Many analysts blamed the poor quality of national election campaigns.
1992 presidential campaign Bush announced his reelection bid in early 1992; with a coalition victory in the Persian Gulf War and high approval ratings, Bush's reelection initially looked likely. As a result, many leading Democrats, including Mario Cuomo, Dick Gephardt, and Al Gore, declined to seek their party's presidential nomination. However, Bush's tax increase had angered many conservatives, who believed that Bush had strayed from the conservative principles of Ronald Reagan. He faced a challenge from conservative political columnist Pat Buchanan in the 1992 Republican primaries. Bush fended off Buchanan's challenge and won his party's nomination at the 1992 Republican National Convention, but the convention adopted a socially conservative platform strongly influenced by the Christian right. Meanwhile, the Democrats nominated Governor Bill Clinton of Arkansas. A moderate who was affiliated with the Democratic Leadership Council (DLC), Clinton favored welfare reform, deficit reduction, and a tax cut for the middle class. In early 1992, the race took an unexpected twist when Texas billionaire H. Ross Perot launched a third party bid, claiming that neither Republicans nor Democrats could eliminate the deficit and make government more efficient. His message appealed to voters across the political spectrum disappointed with both parties' perceived fiscal irresponsibility. Perot also attacked NAFTA, which he claimed would lead to major job losses. National polling taken in mid-1992 showed Perot in the lead, but Clinton experienced a surge through effective campaigning and the selection of Senator Al Gore, a popular and relatively young Southerner, as his running mate. Clinton won the election, taking 43 percent of the popular vote and 370 electoral votes, while Bush won 37.5 percent of the popular vote and 168 electoral votes. Perot won 19% of the popular vote, one of the highest totals for a third-party candidate in U.S. 
history, drawing equally from both major candidates, according to exit polls. Clinton performed well in the Northeast, the Midwest, and the West Coast, while also waging the strongest Democratic campaign in the South since the 1976 election. Several factors were important in Bush's defeat. The ailing economy, still recovering from recession, may have been the main factor in Bush's loss, as 7 in 10 voters said on election day that the economy was either "not so good" or "poor". On the eve of the 1992 election, the unemployment rate stood at 7.8%, which was the highest it had been since 1984. The president was also damaged by his alienation of many conservatives in his party. Bush blamed Perot in part for his defeat, though exit polls showed that Perot drew his voters about equally from Clinton and Bush. Despite his defeat, Bush left office with a 56 percent job approval rating in January 1993. Like many of his predecessors, Bush issued a series of pardons during his last days in office. In December 1992, he granted executive clemency to six former senior government officials implicated in the Iran-Contra scandal, most prominently former Secretary of Defense Caspar Weinberger. The charges against the six were that they lied to or withheld information from Congress. The pardons effectively brought an end to the Iran-Contra scandal. According to Seymour Martin Lipset, the 1992 election had several unique characteristics. Voters felt that economic conditions were worse than they actually were, which harmed Bush. A rare event was the presence of a strong third-party candidate. Liberals launched a backlash against 12 years of a conservative White House. The chief factor was Clinton uniting his party and winning over a number of heterogeneous groups. Post-presidency (1993–2018) Appearances After leaving office, Bush and his wife built a retirement house in the community of West Oaks, Houston.
He established a presidential office within the Park Laureate Building on Memorial Drive in Houston. He also frequently spent time at his vacation home in Kennebunkport, took annual cruises in Greece, went on fishing trips in Florida, and visited the Bohemian Club in Northern California. He declined to serve on corporate boards, but delivered numerous paid speeches and served as an adviser to The Carlyle Group, a private equity firm. He never published his memoirs, but he and Brent Scowcroft co-wrote A World Transformed, a 1998 work on foreign policy. Portions of his letters and his diary were later published as The China Diary of George H. W. Bush and All The Best, George Bush. During a 1993 visit to Kuwait, Bush was targeted in an assassination plot directed by the Iraqi Intelligence Service. President Clinton retaliated by ordering the firing of 23 cruise missiles at Iraqi Intelligence Service headquarters in Baghdad. Bush did not publicly comment on the assassination attempt or the missile strike, but privately spoke with Clinton shortly before the strike took place. In the 1994 gubernatorial elections, his sons George W. and Jeb concurrently ran for Governor of Texas and Governor of Florida. Concerning their political careers, he advised them both that "[a]t some point both of you may want to say 'Well, I don't agree with my Dad on that point' or 'Frankly I think Dad was wrong on that.' Do it. Chart your own course, not just on the issues but on defining yourselves". George W. won his race against Ann Richards while Jeb lost to Lawton Chiles. After the results came in, the elder Bush told ABC, "I have very mixed emotions. Proud father, is the way I would sum it all up." Jeb would again run for governor of Florida in 1998 and win at the same time that his brother George W. won re-election in Texas. It marked the second time in United States history that a pair of brothers served simultaneously as governors.
Bush supported his son's candidacy in the 2000 presidential election, but did not actively campaign in the election and did not deliver a speech at the 2000 Republican National Convention. George W. Bush defeated Al Gore in the 2000 election and was re-elected in 2004. Bush and his son thus became the second father–son pair to each serve as President of the United States, following John Adams and John Quincy Adams. Through previous administrations, the elder Bush had universally been known as "George Bush" or "President Bush", but following his son's election the need to distinguish between them has
Association of the Great Public Schools of New South Wales, an association of private boys' schools, Australia Great Public Schools Association of Queensland Inc., an association of nine Australian schools Grosse Pointe South High School, a public high school in Grosse Pointe, Michigan, US Greenville Public Schools (a.k.a. Greenville Public School District), a school district in Greenville, Mississippi, US The School of Global Policy and Strategy, an institute of international studies at the University of California, San Diego Medicine Goodpasture syndrome, a rare autoimmune
disease Gray platelet syndrome, a rare congenital autosomal recessive bleeding disorder Other uses Seymour Airport (IATA code), Galápagos Islands, Ecuador Fareed Zakaria GPS (Global Public Square), a CNN television show Geometrical Products Specification, an international standard for geometric dimensioning and tolerancing Genealogical Proof Standard (see also Cluster genealogy) "GPS" (song), a song by Maluma "Var är jag", renamed to "GPS", a song by Basshunter from
the distance between the observer and the object, and "thus, we cannot conceive of mechanist material bodies which are extended but not (in themselves) colored". What is perceived can be the same type of quality yet appear completely opposite from different positions and perspectives; what we perceive can differ even when the same types of things consist of contrary qualities. Secondary qualities aid in people's conception of primary qualities in an object, as when the color of an object leads people to recognize the object itself. More specifically, the color red can be perceived in apples, strawberries, and tomatoes, yet we would not know what these might look like without their color. We would also be unaware of what the color red looked like if red paint, or any object that has a perceived red color, failed to exist. From this, we can see that colors cannot exist on their own and can solely represent a group of perceived objects. Therefore, both primary and secondary qualities are mind-dependent: they cannot exist without our minds. George Berkeley was a philosopher opposed to rationalism and "classical" empiricism. He was a "subjective idealist" or "empirical idealist", who believed that reality is constructed entirely of immaterial, conscious minds and their ideas; everything that exists is somehow dependent on the subject perceiving it, except the subject themselves. He denied the existence of abstract objects that many other philosophers, notably Plato, believed to exist. According to Berkeley, an abstract object would be one that "does not exist in space or time and which is therefore entirely non-physical and non-mental"; however, this argument contradicts his relativity argument. If esse est percipi (Latin: "to be is to be perceived") is true, then the objects in Berkeley's relativity argument can either exist or not.
Berkeley believed that only minds' perceptions and the Spirit that perceives exist in reality; what people perceive every day is only the idea of an object's existence, while the objects themselves are not perceived. Berkeley also discussed how, at times, material things cannot be perceived by an individual, and one's own mind cannot fully comprehend them. However, there also exists an "omnipresent, eternal mind" that Berkeley believed to consist of God and the Spirit, both omniscient and all-perceiving. According to Berkeley, God is the entity who controls everything, yet Berkeley also argued that "abstract object[s] do not exist in space or time". In other words, as Warnock argues, Berkeley "had recognized that he could not square with his own talk of spirits, of our minds and of God; for these are perceivers and not among objects of perception. Thus he says, rather weakly and without elucidation, that in addition to our ideas we also have notions—we know what it means to speak of spirits and their operations." However, the relativity argument violates the idea of immaterialism. Berkeley's immaterialism holds that esse est percipi (aut percipere), which in English is "to be is to be perceived (or to perceive)". That is, only what is perceived or perceives is real, and without our perception or God's, nothing can be real. Yet if the relativity argument, also Berkeley's, holds that the perception of an object depends on the perceiver's position, then what is perceived can either be real or not, because a perception does not show the whole picture, and the whole picture cannot be perceived. Berkeley also believes that "when one perceives mediately, one perceives one idea by means of perceiving another". From this it follows that if what is perceived at first differs, what is perceived after that can differ as well.
In the heat perception experiment described above, one hand perceives the water to be hot and the other hand perceives the water to be cold, due to relativity. If the idea "to be is to be perceived" is applied, the water should be both cold and hot, because each perception is perceived by a different hand. However, the water cannot be cold and hot at the same time, for that is self-contradictory; this shows that what is perceived is not always true, because it can sometimes break the law of noncontradiction. In this case, "it would be arbitrary anthropocentrism to claim that humans have special access to the true qualities of objects". The truth for different people can be different, and humans are limited in their access to absolute truth due to relativity. Summing up, either nothing can be absolutely true due to relativity, or the two arguments, "to be is to be perceived" and the relativity argument, do not always work together. New theory of vision In his Essay Towards a New Theory of Vision, Berkeley frequently criticised the views of the Optic Writers, a title that seems to include Molyneux, Wallis, Malebranche and Descartes. In sections 1–51, Berkeley argued against the classical scholars of optics by holding that spatial depth, as the distance that separates the perceiver from the perceived object, is itself invisible. That is, we do not see space directly or deduce its form logically using the laws of optics. Space for Berkeley is no more than a contingent expectation that visual and tactile sensations will follow one another in regular sequences that we come to expect through habit. Berkeley goes on to argue that visual cues, such as the perceived extension or 'confusion' of an object, can only be used to indirectly judge distance, because the viewer learns to associate visual cues with tactile sensations. Berkeley gives the following analogy regarding indirect distance perception: one perceives distance indirectly just as one perceives a person's embarrassment indirectly.
When looking at an embarrassed person, we infer indirectly that the person is embarrassed by observing the red color on the person's face. We know through experience that a red face tends to signal embarrassment, as we have learned to associate the two. The question concerning the visibility of space was central to the Renaissance perspective tradition and its reliance on classical optics in the development of pictorial representations of spatial depth. This matter had been debated by scholars since the 11th century, when the Arab polymath and mathematician Alhazen (al-Hasan Ibn al-Haytham) affirmed in experimental contexts the visibility of space. This issue, which was raised in Berkeley's theory of vision, was treated at length in the Phenomenology of Perception of Maurice Merleau-Ponty, in the context of confirming the visual perception of spatial depth (la profondeur), and by way of refuting Berkeley's thesis. Berkeley wrote about the perception of size in addition to that of distance. He is frequently misquoted as believing in size–distance invariance—a view held by the Optic Writers. This idea is that we scale the image size according to distance in a geometrical manner. The error may have become commonplace because the eminent historian and psychologist E. G. Boring perpetuated it. In fact, Berkeley argued that the same cues that evoke distance also evoke size, and that we do not first see size and then calculate distance. It is worth quoting Berkeley's words on this issue (Section 53): What inclines men to this mistake (beside the humour of making one see by geometry) is, that the same perceptions or ideas which suggest distance, do also suggest magnitude ...
I say they do not first suggest distance, and then leave it to the judgement to use that as a medium, whereby to collect the magnitude; but they have as close and immediate a connexion with the magnitude as with the distance; and suggest magnitude as independently of distance, as they do distance independently of magnitude. Berkeley claimed that his visual theories were “vindicated” by a 1728 report regarding the recovery of vision in a 13-year-old boy operated for congenital cataracts by surgeon William Cheselden. In 2021, the name of Cheselden's patient was published for the first time: Daniel Dolins. Berkeley knew the Dolins family, had numerous social links to Cheselden, including the poet Alexander Pope, and Princess Caroline, to whom Cheselden's patient was presented. The report misspelled Cheselden's name, used language typical of Berkeley, and may even have been ghost-written by Berkeley. Unfortunately, Dolins was never able to see well enough to read, and there is no evidence that the surgery improved Dolins' vision at any point prior to his death at age 30. Philosophy of physics "Berkeley's works display his keen interest in natural philosophy [...] from his earliest writings (Arithmetica, 1707) to his latest (Siris, 1744). Moreover, much of his philosophy is shaped fundamentally by his engagement with the science of his time." The profundity of this interest can be judged from numerous entries in Berkeley's Philosophical Commentaries (1707–1708), e.g. "Mem. to Examine & accurately discuss the scholium of the 8th Definition of Mr Newton's Principia." (#316) Berkeley argued that forces and gravity, as defined by Newton, constituted "occult qualities" that "expressed nothing distinctly". He held that those who posited "something unknown in a body of which they have no idea and which they call the principle of motion, are in fact simply stating that the principle of motion is unknown." 
Therefore, those who "affirm that active force, action, and the principle of motion are really in bodies are adopting an opinion not based on experience." Forces and gravity existed nowhere in the phenomenal world. On the other hand, if they resided in the category of "soul" or "incorporeal thing", they "do not properly belong to physics" as a matter. Berkeley thus concluded that forces lay beyond any kind of empirical observation and could not be a part of proper science. He proposed his theory of signs as a means to explain motion and matter without reference to the "occult qualities" of force and gravity. Berkeley's razor Berkeley's razor is a rule of reasoning proposed by the philosopher Karl Popper in his study of Berkeley's key scientific work De Motu. Berkeley's razor is considered by Popper to be similar to Ockham's razor but "more powerful". It represents an extreme, empiricist view of scientific observation that states that the scientific method provides us with no true insight into the nature of the world. Rather, the scientific method gives us a variety of partial explanations about regularities that hold in the world and that are gained through experiment. The nature of the world, according to Berkeley, is only approached through proper metaphysical speculation and reasoning. Popper summarises Berkeley's razor as follows: A general practical result—which I propose to call "Berkeley's razor"—of [Berkeley's] analysis of physics allows us a priori to eliminate from physical science all essentialist explanations. If they have a mathematical and predictive content they may be admitted qua mathematical hypotheses (while their essentialist interpretation is eliminated). If not they may be ruled out altogether. This razor is sharper than Ockham's: all entities are ruled out except those which are perceived.
In another essay of the same book titled "Three Views Concerning Human Knowledge", Popper argues that Berkeley is to be considered as an instrumentalist philosopher, along with Robert Bellarmine, Pierre Duhem and Ernst Mach. According to this approach, scientific theories have the status of serviceable fictions, useful inventions aimed at explaining facts, and without any pretension to be true. Popper contrasts instrumentalism with the above-mentioned essentialism and his own "critical rationalism". Philosophy of mathematics In addition to his contributions to philosophy, Berkeley was also very influential in the development of mathematics, although in a rather indirect sense. "Berkeley was concerned with mathematics and its philosophical interpretation from the earliest stages of his intellectual life." Berkeley's "Philosophical Commentaries" (1707–1708) bear witness to his interest in mathematics: Axiom. No reasoning about things whereof we have no idea. Therefore no reasoning about Infinitesimals. (#354) Take away the signs from Arithmetic & Algebra, & pray what remains? (#767) These are sciences purely Verbal, & entirely useless but for Practise in Societys of Men. No speculative knowledge, no comparison of Ideas in them. (#768) In 1707, Berkeley published two treatises on mathematics. In 1734, he published The Analyst, subtitled A DISCOURSE Addressed to an Infidel Mathematician, a critique of calculus. Florian Cajori called this treatise "the most spectacular event of the century in the history of British mathematics." However, a recent study suggests that Berkeley misunderstood Leibnizian calculus. The mathematician in question is believed to have been either Edmond Halley, or Isaac Newton himself—though if the latter, the discourse was addressed posthumously, as Newton died in 1727.
The Analyst represented a direct attack on the foundations and principles of calculus and, in particular, the notion of fluxion or infinitesimal change, which Newton and Leibniz used to develop the calculus. In his critique, Berkeley coined the phrase "ghosts of departed quantities", familiar to students of calculus. Ian Stewart's book From Here to Infinity captures the gist of his criticism. Berkeley regarded his criticism of calculus as part of his broader campaign against the religious implications of Newtonian mechanics, as a defence of traditional Christianity against deism, which tends to distance God from His worshipers. Specifically, he observed that both Newtonian and Leibnizian calculus employed infinitesimals sometimes as positive, nonzero quantities and other times as a number explicitly equal to zero. Berkeley's key point in "The Analyst" was that Newton's calculus (and the laws of motion based in calculus) lacked rigorous theoretical foundations. He claimed that In every other Science Men prove their Conclusions by their Principles, and not their Principles by the Conclusions. But if in yours you should allow your selves this unnatural way of proceeding, the Consequence would be that you must take up with Induction, and bid adieu to Demonstration. And if you submit to this, your Authority will no longer lead the way in Points of Reason and Science. Berkeley did not doubt that calculus produced real world truth; simple physics experiments could verify that Newton's method did what it claimed to do. "The cause of Fluxions cannot be defended by reason", but the results could be defended by empirical observation, Berkeley's preferred method of acquiring knowledge at any rate. Berkeley, however, found it paradoxical that "Mathematicians should deduce true Propositions from false Principles, be right in Conclusion, and yet err in the Premises." In The Analyst he endeavoured to show "how Error may bring forth Truth, though it cannot bring forth Science".
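Berkeley's complaint about the double treatment of infinitesimals can be illustrated by the standard fluxional computation of the derivative of x² (a modern reconstruction for illustration, not Berkeley's or Newton's own notation): the increment o must be nonzero for the division to be legitimate, yet it is afterwards set to zero to obtain the result.

```latex
\underbrace{\frac{(x+o)^2 - x^2}{o} \;=\; \frac{2xo + o^2}{o} \;=\; 2x + o}_{\text{requires } o \neq 0}
\qquad\longrightarrow\qquad
\underbrace{2x}_{\text{obtained by setting } o = 0}
```

The discarded increment o, treated as nonzero during the calculation and as zero at its end, is exactly the sort of quantity Berkeley derided with the phrase "ghosts of departed quantities".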
Newton's science, therefore, could not on purely scientific grounds justify its conclusions, and the mechanical, deistic model of the universe could not be rationally justified. The difficulties raised by Berkeley were still present in the work of Cauchy whose approach to calculus was a combination of infinitesimals and a notion of limit, and were eventually sidestepped by Weierstrass by means of his (ε, δ) approach, which eliminated infinitesimals altogether. More recently, Abraham Robinson restored infinitesimal methods in his 1966 book Non-standard analysis by showing that they can be used rigorously. Moral philosophy The tract A Discourse on Passive Obedience (1712) is considered Berkeley's major contribution to moral and political philosophy. In A Discourse on Passive Obedience, Berkeley defends the thesis that people have "a moral duty to observe the negative precepts (prohibitions) of the law, including the duty not to resist the execution of punishment." However, Berkeley does make exceptions to this sweeping moral statement, stating that we need not observe precepts of "usurpers or even madmen" and that people can obey different supreme authorities if there is more than one claimant to the highest authority. Berkeley defends this thesis with a deductive proof stemming from the laws of nature. First, he establishes that because God is perfectly good, the end to which he commands humans must also be good, and that end must not benefit just one person, but the entire human race. Because these commands—or laws—if practiced, would lead to the general fitness of humankind, it follows that they can be discovered by the right reason—for example, the law to never resist supreme power can be derived from reason because this law is "the only thing that stands between us and total disorder". Thus, these laws can be called the laws of nature, because they are derived from God—the creator of nature himself. "These laws of nature
earning his doctorate in divinity, and once again chose to remain at Trinity College Dublin, lecturing this time in Divinity and in Hebrew. In 1721/2 he was made Dean of Dromore and, in 1724, Dean of Derry. In 1723, following her violent quarrel with Jonathan Swift, who had been her intimate friend for many years, Esther Vanhomrigh (for whom Swift had created the nickname "Vanessa") named Berkeley her co-heir along with the barrister Robert Marshall; her choice of legatees caused a good deal of surprise since she did not know either of them well, although Berkeley as a very young man had known her father. Swift said generously that he did not grudge Berkeley his inheritance, much of which vanished in a lawsuit in any event. A story that Berkeley and Marshall disregarded a condition of the inheritance that they must publish the correspondence between Swift and Vanessa is probably untrue. In 1725, he began the project of founding a college in Bermuda for training ministers and missionaries in the colony, in pursuit of which he gave up his deanery with its income of £1100. Marriage and America In 1728, he married Anne Forster, daughter of John Forster, Chief Justice of the Irish Common Pleas, and his first wife Rebecca Monck. He then went to America on a salary of £100 per annum. He landed near Newport, Rhode Island, where he bought a plantation at Middletown, the famous "Whitehall". Berkeley purchased several enslaved Africans to work on the plantation. It has been claimed that "he introduced Palladianism into America by borrowing a design from [William] Kent's Designs of Inigo Jones for the door-case of his house in Rhode Island, Whitehall." He also brought to New England John Smibert, the Scottish artist he "discovered" in Italy, who is generally regarded as the founding father of American portrait painting. Meanwhile, he drew up plans for the ideal city he planned to build on Bermuda. He lived at the plantation while he waited for funds for his college to arrive.
The funds, however, were not forthcoming. "With the withdrawal from London of his own persuasive energies, opposition gathered force; and the Prime Minister, Walpole grew steadily more sceptical and lukewarm. At last it became clear that the essential Parliamentary grant would be not forthcoming" and in 1732 he left America and returned to London. He and Anne had four children who survived infancy: Henry, George, William and Julia, and at least two other children who died in infancy. William's death in 1751 was a great cause of grief to his father. Episcopate in Ireland Berkeley was nominated to be the Bishop of Cloyne in the Church of Ireland on 18 January 1734. He was consecrated as such on 19 May 1734. He was the Bishop of Cloyne until his death on 14 January 1753, although he died at Oxford (see below). Humanitarian work While living in London's Saville Street, he took part in efforts to create a home for the city's abandoned children. The Foundling Hospital was founded by Royal Charter in 1739, and Berkeley is listed as one of its original governors. Last works His last two publications were Siris: A Chain of Philosophical Reflexions and Inquiries Concerning the Virtues of Tarwater, And divers other Subjects connected together and arising one from another (1744) and Further Thoughts on Tar-water (1752). Pine tar is an effective antiseptic and disinfectant when applied to cuts on the skin, but Berkeley argued for the use of pine tar as a broad panacea for diseases. His 1744 work on tar-water sold more copies than any of his other books during Berkeley's lifetime. He remained at Cloyne until 1752, when he retired. With his wife and daughter Julia he went to Oxford to live with his son George and supervise his education. He died soon afterward and was buried in Christ Church Cathedral, Oxford. His affectionate disposition and genial manners made him much loved and held in warm regard by many of his contemporaries. 
Anne outlived her husband by many years, and died in 1786. Contributions to philosophy The use of the concepts of "spirit" and "idea" is central in Berkeley's philosophy. As used by him, these concepts are difficult to translate into modern terminology. His concept of "spirit" is close to the concept of "conscious subject" or of "mind", and the concept of "idea" is close to the concept of "sensation" or "state of mind" or "conscious experience". Thus Berkeley denied the existence of matter as a metaphysical substance, but did not deny the existence of physical objects such as apples or mountains ("I do not argue against the existence of any one thing that we can apprehend, either by sense or reflection. That the things I see with mine eyes and touch with my hands do exist, really exist, I make not the least question. The only thing whose existence we deny, is that which philosophers call matter or corporeal substance. And in doing of this, there is no damage done to the rest of mankind, who, I dare say, will never miss it.", Principles #35). This basic claim of Berkeley's thought, his "idealism", is sometimes and somewhat derisively called "immaterialism" or, occasionally, subjective idealism. In Principles #3, he wrote, using a combination of Latin and English, esse is percipi (to be is to be perceived), most often if slightly inaccurately attributed to Berkeley as the pure Latin phrase esse est percipi. The phrase appears associated with him in authoritative philosophical sources, e.g., "Berkeley holds that there are no such mind-independent things, that, in the famous phrase, esse est percipi (aut percipere)—to be is to be perceived (or to perceive)." Hence, human knowledge is reduced to two elements: that of spirits and of ideas (Principles #86). In contrast to ideas, a spirit cannot be perceived. A person's spirit, which perceives ideas, is to be comprehended intuitively by inward feeling or reflection (Principles #89). 
For Berkeley, we have no direct 'idea' of spirits, albeit we have good reason to believe in the existence of other spirits, for their existence explains the purposeful regularities we find in experience ("It is plain that we cannot know the existence of other spirits otherwise than by their operations, or the ideas by them excited in us", Dialogues #145). This is the solution that Berkeley offers to the problem of other minds. Finally, the order and purposefulness of the whole of our experience of the world and especially of nature overwhelms us into believing in the existence of an extremely powerful and intelligent spirit that causes that order. According to Berkeley, reflection on the attributes of that external spirit leads us to identify it with God. Thus a material thing such as an apple consists of a collection of ideas (shape, color, taste, physical properties, etc.) which are caused in the spirits of humans by the spirit of God. Theology A convinced adherent of Christianity, Berkeley believed God to be present as an immediate cause of all our experiences. Here is Berkeley's proof of the existence of God: As T. I. Oizerman explained: Berkeley believed that God is not the distant engineer of Newtonian machinery that in the fullness of time led to the growth of a tree in the university quadrangle. Rather, the perception of the tree is an idea that God's mind has produced in the mind, and the tree continues to exist in the quadrangle when "nobody" is there, simply because God is an infinite mind that perceives all. The philosophy of David Hume concerning causality and objectivity is an elaboration of another aspect of Berkeley's philosophy. A.A. Luce, the most eminent Berkeley scholar of the 20th century, constantly stressed the continuity of Berkeley's philosophy. 
The fact that Berkeley returned to his major works throughout his life, issuing revised editions with only minor changes, also counts against any theory that attributes to him a significant volte-face. Relativity arguments John Locke (Berkeley's intellectual predecessor) states that we define an object by its primary and secondary qualities. He takes heat as an example of a secondary quality. If you put one hand in a bucket of cold water, and the other hand in a bucket of warm water, then put both hands in a bucket of lukewarm water, one of your hands is going to tell you that the water is cold and the other that the water is hot. Locke says that since two different objects (your two hands) perceive the water to be hot and cold, then the heat is not a quality of the water. While Locke used this argument to distinguish primary from secondary qualities, Berkeley extends it to cover primary qualities in the same way. For example, he says that size is not a quality of an object because the size of the object depends on the distance between the observer and the object, or the size of the observer. Since an object is a different size to different observers, then size is not a quality of the object. Berkeley rejects shape with a similar argument and then asks: if neither primary qualities nor secondary qualities are of the object, then how can we say that there is anything more than the qualities we observe? Relativity is the idea that there is no objective, universal truth; it is a state of dependence in which the existence of one object is solely dependent on that of another. According to Locke, characteristics of primary qualities are mind-independent, such as shape, size, etc., whereas secondary qualities are mind-dependent, for example, taste and color. George Berkeley rejected John Locke's distinction between primary and secondary qualities because Berkeley believed that "we cannot abstract the primary qualities (e.g. shape) from secondary ones (e.g. color)".
Berkeley argued that perception is dependent on the distance between the observer and the object, and "thus, we cannot conceive of mechanist material bodies which are extended but not (in themselves) colored". What is perceived may be the same type of quality yet appear completely opposite because of differing positions and perceptions; what we perceive can differ even when the same kind of thing presents contrary qualities. Secondary qualities aid in people's conception of primary qualities in an object, as when the color of an object leads people to recognize the object itself. More specifically, the color red can be perceived in apples, strawberries, and tomatoes, yet we would not know what these might look like without their color. We would also be unaware of what the color red looked like if red paint, or any object that has a perceived red color, failed to exist. From this, we can see that colors cannot exist on their own and can solely represent a group of perceived objects. Therefore, both primary and secondary qualities are mind-dependent: they cannot exist without our minds. George Berkeley was a philosopher who opposed rationalism and "classical" empiricism. He was a "subjective idealist" or "empirical idealist", who believed that reality is constructed entirely of immaterial, conscious minds and their ideas; everything that exists is somehow dependent on the subject perceiving it, except the subject themselves. He rejected the existence of abstract objects that many other philosophers believed to exist, notably Plato. According to Berkeley, "an abstract object does not exist in space or time and which is therefore entirely non-physical and non-mental"; however, this argument contradicts his relativity argument. If "esse est percipi" (Latin: to be is to be perceived) is true, then the objects in the relativity argument made by Berkeley can either exist or not.
Berkeley believed that only the minds' perceptions and the Spirit that perceives are what exist in reality; what people perceive every day is only the idea of an object's existence, but the objects themselves are not perceived. Berkeley also discussed how, at times, materials cannot be perceived by oneself, and the mind of oneself cannot understand the objects. However, there also exists an "omnipresent, eternal mind" that Berkeley believed to consist of God and the Spirit, both omniscient and all-perceiving. According to Berkeley, God is the entity who controls everything, yet Berkeley also argued that "abstract object[s] do not exist in space or time". In other words, as Warnock argues, Berkeley "had recognized that he could not square with his own talk of spirits, of our minds and of God; for these are perceivers and not among objects of perception. Thus he says, rather weakly and without elucidation, that in addition to our ideas we also have notions—we know what it means to speak of spirits and their operations." However, the relativity argument violates the idea of immaterialism. Berkeley's immaterialism argues that "esse est percipi (aut percipere)", which in English is to be is to be perceived (or to perceive). That is to say, only what is perceived or what perceives is real, and without our perception or God's nothing can be real. Yet if the relativity argument, also Berkeley's, holds that the perception of an object depends on the observer's position, then what is perceived may or may not be real, because a single perception does not show the whole picture, and the whole picture cannot be perceived. Berkeley also believes that "when one perceives mediately, one perceives one idea by means of perceiving another". From this it follows that if initial perceptions differ, the perceptions mediated by them can differ as well.
the middle child of seven of Daniel Moore, a medical doctor, and Henrietta Sturge. His grandfather was the author George Moore. His eldest brother was Thomas Sturge Moore, a poet, writer and engraver. He was educated at Dulwich College and in 1892 went up to Trinity College, Cambridge, to read classics and moral sciences. He became a Fellow of Trinity in 1898 and went on to hold the University of Cambridge chair of Mental Philosophy and Logic from 1925 to 1939. Moore is best known today for defending ethical non-naturalism, his emphasis on common sense in philosophical method, and the paradox that bears his name. He was admired by, and influential among, other philosophers, as well as the Bloomsbury Group, but unlike his colleague and admirer Russell, who for some years thought he fulfilled his "ideal of genius", he is mostly unknown today outside of academic philosophy. Moore's essays are known for their clarity and circumspection of writing style and their methodical and patient approach to philosophical problems. He was critical of modern philosophy for its lack of progress, which he saw as a stark contrast to the dramatic advances in the natural sciences since the Renaissance. Among Moore's most famous works are his Principia Ethica, and his essays, "The Refutation of Idealism", "A Defence of Common Sense", and "A Proof of the External World". Moore was an important and admired member of the secretive Cambridge Apostles, a discussion group drawn from the British intellectual elite. At the time another member, the 22-year-old Bertrand Russell, wrote, "I almost worship him as if he were a god. I have never felt such an extravagant admiration for anybody," and would later write that "for some years he fulfilled my ideal of genius. He was in those days beautiful and slim, with a look almost of inspiration as deeply passionate as Spinoza's".
From 1918 to 1919 Moore chaired the Aristotelian Society, a group committed to systematic study of philosophy, its historical development and its methods and problems. G. E. Moore died at the Evelyn Nursing Home on 24 October 1958. He was cremated at Cambridge Crematorium on 28 October 1958 and his ashes interred at the Parish of the Ascension Burial Ground in the city. His wife Dorothy Ely (1892–1977) was buried there. Together they had two sons, the poet Nicholas Moore and the composer Timothy Moore. Philosophy Ethics His influential work Principia Ethica is one of the main inspirations of the movement against ethical naturalism (see ethical non-naturalism) and is partly responsible for the twentieth-century concern with meta-ethics. The naturalistic fallacy Moore asserted that philosophical arguments can suffer from a confusion between the use of a term in a particular argument and the definition of that term (in all arguments). He named this confusion the naturalistic fallacy. For example, an ethical argument may claim that if a thing has certain properties, then that thing is 'good.' A hedonist may argue that 'pleasant' things are 'good' things. Other theorists may argue that 'complex' things are 'good' things. Moore contends that, even if such arguments are correct, they do not provide definitions for the term 'good'. The property of 'goodness' cannot be defined. It can only be shown and grasped. Any attempt to define it (X is good if it has property Y) will simply shift the problem (Why is Y-ness good in the first place?). Open-question argument Moore's argument for the indefinability of 'good' (and thus for the fallaciousness in the "naturalistic fallacy") is often called the open-question argument; it is presented in §13 of Principia Ethica. The argument hinges on the nature of statements such as "Anything that is pleasant is also good" and the possibility of asking questions such as "Is it good that x is pleasant?". 
According to Moore, these questions are open and these statements are significant; and they will remain so no matter what is substituted for "pleasure". Moore concludes from this that any analysis of value is bound to fail. In other words, if value could be analysed, then such questions and statements would be trivial and obvious. Since they are anything but trivial and obvious, value must be indefinable. Critics of Moore's arguments sometimes claim that he is appealing to general puzzles concerning analysis (cf. the paradox of analysis), rather than revealing anything special about value. The argument clearly depends on the assumption that if 'good' were definable, it would be an analytic truth about 'good', an assumption that many contemporary moral realists like Richard Boyd and Peter Railton reject. Other responses appeal to the Fregean distinction between sense and reference, allowing that value concepts are special and sui generis, but insisting that value properties are nothing but natural properties (this strategy is similar to that taken by non-reductive materialists in philosophy of mind). Good as indefinable Moore contended that goodness cannot be analysed in terms of any other property. In Principia Ethica, he writes: It may be true that all things which are good are also something else, just as it is true that all things which are yellow produce a certain kind of vibration in the light. And it is a fact, that Ethics aims at discovering what are those other properties belonging to all things which are good. But far too many philosophers have thought that when they named those other properties they were actually defining good; that these properties, in fact, were simply not "other," but absolutely and entirely the same with goodness. (Principia, § 10 ¶ 3) Therefore, we cannot define 'good' by explaining it in other words. We can only point to a thing or an action and say "That is good." 
Similarly, we cannot describe to a person born totally blind exactly what yellow is. We can only show a sighted person a piece of yellow paper or a yellow scrap of cloth and say "That is yellow." Good as a non-natural property In addition to categorising 'good' as indefinable, Moore also emphasized that it is a non-natural property. This means that it cannot be empirically or scientifically tested or verified; it is not within the bounds of "natural science". Moral knowledge Moore argued that, once arguments based on the naturalistic fallacy had been discarded, questions of intrinsic goodness could be settled only by appeal to what he (following Sidgwick) called "moral intuitions": self-evident propositions which recommend themselves to moral reflection, but which are not susceptible to either direct proof or disproof (Principia, § 45). As a result of his view, he has often been described by later writers as an advocate of ethical intuitionism. Moore, however, wished to distinguish his view from the views usually described as "Intuitionist" when Principia Ethica was written: Moore distinguished his view from the view of deontological intuitionists, who held that "intuitions" could determine questions about what actions are right or required by duty. Moore, as a consequentialist, argued that "duties" and moral rules could be determined by investigating the effects of particular actions or kinds of actions (Principia, § 89), and so were matters for empirical investigation rather than direct objects of intuition (Principia, § 90). On Moore's view, "intuitions" revealed not the rightness or wrongness of specific
actions, but only what things were good in themselves, as ends to be pursued. Right action, duty and virtue Moore holds that right actions are those producing the most good.
The difficulty with this is that the consequences of most actions are too vast for us to properly take into account, especially the long-term consequences. Because of this, Moore suggests that the definition of duty is limited to what generally produces better results than probable alternatives in the comparatively near future. Whether a given rule of action turns out to be a duty depends to some extent on the conditions of the corresponding society, but duties agree mostly with what common sense recommends. Virtues, like honesty, can in turn be defined as permanent dispositions to perform duties. Proof of an external world One of the most important parts of Moore's philosophical development was his break from the idealism that dominated British philosophy (as represented in the works of
sides. Those definitions can be expressed as one genus and two differentiae:

one genus:
the genus for both a triangle and a quadrilateral: "A plane figure"

two differentiae:
the differentia for a triangle: "that has 3 straight bounding sides."
the differentia for a quadrilateral: "that has 4 straight bounding sides."

The use of genus and differentia in constructing definitions goes back at least as far as Aristotle (384–322 BCE).

Differentiation and Abstraction

The process of producing new definitions by extending existing definitions is commonly known as differentiation (and also as derivation). The reverse process, by which just part of an existing definition is used itself as a new definition, is called abstraction; the new definition is called an abstraction and it is said to have been abstracted away from the existing definition. For instance, consider the following:

a square: a quadrilateral that has interior angles which are all right angles, and that has bounding sides which all have the same length.

A part of that definition may be singled out (using parentheses here):

a square: (a quadrilateral that has interior angles which are all right angles), and that has bounding sides which all have the same length.

and with that part, an abstraction may be formed:

a rectangle: a quadrilateral that has interior angles which are all right angles.

Then, the definition of a square may be recast with that abstraction as its genus:

a square: a rectangle that has bounding sides which all have the same length.

Similarly, the definition of a square may be rearranged and another portion singled out:

a square: (a quadrilateral that has bounding sides which all have the same length), and that has interior angles which are all right angles.

leading to the following abstraction:

a rhombus: a quadrilateral that has bounding sides which all have the same length.

Then, the definition of a square may be recast with that abstraction as its genus:

a square: a rhombus that has interior angles which are all right angles.

In fact, the definition of a square may be recast in terms of both of the abstractions, where one acts as the genus and the other acts as the differentia:

a square: a rectangle that is a rhombus.
a square: a rhombus that is a rectangle.

Hence, abstraction is crucial in simplifying definitions.

Multiplicity

When multiple definitions could serve equally well, then all such definitions apply simultaneously. Thus, a square is a member of both the genus [a] rectangle and the genus [a] rhombus. In such a case, it is notationally convenient to consolidate the definitions into one definition that is expressed with multiple genera (and possibly no differentia, as in the following):

a square: a rectangle and a rhombus.

or completely equivalently:

a square: a rhombus and a rectangle.

More generally, a collection of equivalent definitions (each of which is expressed with one unique genus) can be recast as one definition that is expressed with n genera. Thus, the following:

a Definition: a Genus₁ that is a Genus₂ and that is a Genus₃
and that is a… and that is a Genusₙ₋₁ and that is a Genusₙ, which has some non-genus Differentia.
a Definition: a Genus₂ that is a Genus₁ and that is a Genus₃ and that is a… and that is a Genusₙ₋₁ and that is a Genusₙ, which has some non-genus Differentia.
a Definition: a Genus₃ that is a Genus₁ and that is a Genus₂ and that is a… and that is a Genusₙ₋₁ and that is a Genusₙ, which has some non-genus Differentia.
…
a Definition: a Genusₙ₋₁ that is a Genus₁ and that is a Genus₂ and that is a Genus₃ and that is a… and that is a Genusₙ, which has some non-genus
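The genus–differentia relationships above lend themselves to a direct translation into code: each definition is its genus's predicate conjoined with a differentia, and a consolidated multi-genus definition like the square is simply the conjunction of its genera. A minimal Python sketch (the record layout and function names here are illustrative assumptions, not taken from the source):

```python
# Genus-differentia definitions modelled as predicates over simple
# shape records: definition = genus predicate AND differentia.

def quadrilateral(s):
    # genus: a plane figure; differentia: has 4 straight bounding sides
    return len(s["sides"]) == 4

def rectangle(s):
    # genus: a quadrilateral; differentia: interior angles all right angles
    return quadrilateral(s) and all(a == 90 for a in s["angles"])

def rhombus(s):
    # genus: a quadrilateral; differentia: bounding sides all the same length
    return quadrilateral(s) and len(set(s["sides"])) == 1

def square(s):
    # two genera consolidated, no further differentia:
    # "a square: a rectangle and a rhombus."
    return rectangle(s) and rhombus(s)

sq   = {"sides": [2, 2, 2, 2], "angles": [90, 90, 90, 90]}
rect = {"sides": [2, 3, 2, 3], "angles": [90, 90, 90, 90]}
rhom = {"sides": [2, 2, 2, 2], "angles": [60, 120, 60, 120]}

print(square(sq), square(rect), square(rhom))  # True False False
```

Note how the multiplicity discussed above falls out for free: `sq` satisfies both the genus `rectangle` and the genus `rhombus` simultaneously, while the two abstractions were each obtained by dropping one conjunct from the definition of a square.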
in to limit the maximum number of shots fired in fully automatic mode, with the most common limits being two or three rounds per trigger pull. The presence of selective-fire modes on firearms allows more efficient use of ammunition for specific tactical needs, either precision-aimed or suppressive fire. This capability is most commonly found on military weapons of the 20th and 21st centuries, most notably assault rifles. History The first primitive firearms were invented about 1250 AD in China, when the man-portable fire lance (a bamboo or metal tube that could shoot ignited gunpowder) was combined with projectiles such as scrap metal, broken porcelain, or darts/arrows. An early depiction of a firearm is a sculpture from a cave in Sichuan, China. The sculpture dates to the 12th century and represents a figure carrying a vase-shaped bombard, with flames and a cannonball coming out of it. The oldest surviving gun, a hand cannon made of bronze, has been dated to 1288 because it was discovered at a site in modern-day Acheng District, Heilongjiang, China, where the Yuan Shi records that battles were fought at that time. The firearm had a 6.9-inch barrel with a 1-inch diameter, a 2.6-inch chamber for the gunpowder, and a socket for the firearm's handle. It is 13.4 inches long and weighs 7.8 pounds without the handle, which would have been made of wood. The Arabs and Mamluks had firearms in the late 13th century. Europeans obtained firearms in the 14th century. The Koreans adopted firearms from the Chinese in the 14th century. The Iranians (first Aq Qoyunlu and Safavids) and Indians (first Mughals) all got them no later than the 15th century, from the Ottoman Turks. The people of the Nusantara archipelago of Southeast Asia used the long arquebus at least by the last quarter of the 15th century.
Even though the knowledge of making gunpowder-based weapons in the Nusantara archipelago had existed since the failed Mongol invasion of Java (1293), and the predecessor of firearms, the pole gun (bedil tombak), was recorded as being used by Java in 1413, the knowledge of making "true" firearms came much later, after the middle of the 15th century. It was brought by the Islamic nations of West Asia, most probably the Arabs. The precise year of introduction is unknown, but it may safely be concluded to be no earlier than 1460. Before the arrival of the Portuguese in Southeast Asia, the natives already possessed firearms, the Java arquebus. The technology of firearms in Southeast Asia improved further after the Portuguese capture of Malacca (1511). Starting in 1513, the traditions of German-Bohemian gun-making merged with Turkish gun-making traditions. This resulted in the Indo-Portuguese tradition of matchlocks. Indian craftsmen modified the design by introducing a very short, almost pistol-like buttstock held against the cheek, not the shoulder, when aiming. They also reduced the caliber and made the gun lighter and more balanced. This was a hit with the Portuguese, who did a lot of fighting aboard ship and on river craft and valued a more compact gun. The Malaccan gunfounders, compared as being on the same level as those of Germany, quickly adapted these new firearms, and thus a new type of arquebus, the istinggar, appeared. The Japanese did not acquire firearms until the 16th century, and then from the Portuguese rather than from the Chinese. Developments in firearms accelerated during the 19th and 20th centuries. Breech-loading became more or less a universal standard for the reloading of most hand-held firearms and continues to be so, with some notable exceptions (such as mortars). Instead of loading individual rounds into weapons, magazines holding multiple munitions were adopted; these aided rapid reloading.
Automatic and semi-automatic firing mechanisms meant that a single soldier could fire many more rounds in a minute than a vintage weapon could fire over the course of a battle. Polymers and alloys in firearm construction made weaponry progressively lighter and thus easier to deploy. Ammunition changed over the centuries from simple metallic ball-shaped projectiles that rattled down the barrel to bullets and cartridges manufactured to high precision. Especially in the past century particular attention has focused on accuracy and sighting to make firearms altogether far more accurate than ever before. More than any single factor though, firearms have proliferated due to the advent of mass production—enabling arms-manufacturers to produce large quantities of weaponry to a consistent standard. Velocities of bullets increased with the use of a "jacket" of metals such as copper or copper alloys that covered a lead core and allowed the bullet to glide down the barrel more easily than exposed lead. Such bullets are designated as "full metal jacket" (FMJ). Such FMJ bullets are less likely to fragment on impact and are more likely to traverse through a target while imparting less energy. Hence, FMJ bullets impart less tissue damage than non-jacketed bullets that expand. This led to their adoption for military use by countries adhering to the Hague Convention of 1899. That said, the basic principle behind firearm operation remains unchanged to this day. A musket of several centuries ago is still similar in principle to a modern-day assault-rifle—using the expansion of gases to propel projectiles over long distances—albeit less accurately and rapidly. Evolution Early models Fire lances The Chinese fire lance from the 10th century was the direct predecessor to the modern concept of the firearm. It was not a gun itself, but an addition to soldiers' spears. 
Originally it consisted of paper or bamboo barrels which would contain incendiary gunpowder that could be lit one time and which would project flames at the enemy. Sometimes Chinese troops would place small projectiles within the barrel that would also be projected when the gunpowder was lit, but most of the explosive force would create flames. Later, the barrel was changed to be made of metal, so that a more explosive gunpowder could be used and put more force into the propulsion of projectiles. Hand cannons The original predecessors of all firearms, the Chinese fire lance and hand cannon, were loaded with gunpowder and the shot (initially lead shot, later replaced by cast iron) through the muzzle, while a fuse was placed at the rear. This fuse was lit, causing the gunpowder to ignite and propel the projectiles. In military use, the standard hand cannon was tremendously powerful, while also being somewhat erratic due to the relative inability of the gunner to aim the weapon, or to control the ballistic properties of the projectile. Recoil could be absorbed by bracing the barrel against the ground using a wooden support, the forerunner of the stock. Neither the quality nor the amount of gunpowder, nor the consistency of projectile dimensions, was controlled, with resulting inaccuracy in firing due to windage, variance in gunpowder composition, and the difference in diameter between the bore and the shot. Hand cannons were replaced by lighter carriage-mounted artillery pieces, and ultimately by the arquebus. In the 1420s gunpowder was used to propel missiles from hand-held tubes during the Hussite revolt in Bohemia. Muskets Muzzle-loading muskets (smooth-bored long guns) were among the first firearms developed. The firearm was loaded through the muzzle with gunpowder, optionally with some wadding, and then with a bullet (usually a solid lead ball, but musketeers could shoot stones when they ran out of bullets).
Greatly improved muzzleloaders (usually rifled instead of smooth-bored) are manufactured today and have many enthusiasts, many of whom hunt large and small game with their guns. Muzzleloaders have to be manually reloaded after each shot; a skilled archer could fire multiple arrows faster than most early muskets could be reloaded and fired, although by the mid-18th century, when muzzleloaders became the standard small armament of the military, a well-drilled soldier could fire six rounds in a minute using prepared cartridges in his musket. Before then, the effectiveness of muzzleloaders was hindered both by the low reloading speed and, before the firing mechanism was perfected, by the very high risk posed by the firearm to the person attempting to fire it. One interesting solution to the reloading problem was the "Roman Candle Gun" with superposed loads. This was a muzzleloader in which multiple charges and balls were loaded one on top of the other, with a small hole in each ball to allow the subsequent charge to be ignited after the one ahead of it was ignited. It was neither a very reliable nor a popular firearm, but it enabled a form of "automatic" fire long before the advent of the machine gun. Loading techniques Most early firearms were muzzle-loading. This form of loading has several disadvantages, such as a slow rate of fire and having to expose oneself to enemy fire to reload, as the weapon had to be pointed upright so the powder could be poured through the muzzle into the breech, followed by ramming the projectile into the breech. As effective methods of sealing the breech were developed, along with sturdy, weatherproof, self-contained metallic cartridges, muzzle-loaders were replaced by single-shot breech loaders. Eventually single-shot weapons were replaced by the following repeater-type weapons. Internal magazines Many firearms made from the late 19th century through the 1950s used internal magazines to load the cartridge into the chamber of the weapon.
The most notable and revolutionary weapons of this period appeared during the U.S. Civil War of 1861–1865: the Spencer and Henry repeating rifles. Both used fixed tubular magazines, the former having the magazine in the buttstock and the latter under the barrel, which allowed a larger capacity. Later weapons used fixed box magazines that could not be removed from the weapon without disassembling the weapon itself. Fixed magazines permitted the use of larger cartridges and eliminated the hazard of having the bullet of one cartridge abutting the primer or rim of another cartridge. These magazines are loaded while they are in the weapon, often using a stripper clip. A clip is used to transfer cartridges into the magazine. Some notable weapons that use internal magazines include the Mosin–Nagant, the Mauser Kar 98k, the Springfield M1903, the M1 Garand, and the SKS. Firearms that have internal magazines are usually, but not always, rifles. Some exceptions include the Mauser C96 pistol, which uses an internal magazine, and the Breda 30, an Italian light machine gun. Detachable magazines Many modern firearms use what are called detachable or box magazines as their method of chambering a cartridge. Detachable magazines can be removed from the weapon without disassembling the firearm, usually by pushing a magazine release. Belt-fed weapons A belt or ammunition belt, a device used to retain and feed cartridges into a firearm, is commonly used with machine guns. Belts were originally composed of canvas or cloth with pockets spaced evenly to allow the belt to be mechanically fed into the gun. These designs were prone to malfunctions due to the effects of oil and other contaminants altering the belt. Later belt designs used permanently connected metal links to retain the cartridges during feeding. These belts were more tolerant of exposure to solvents and oil. Notable weapons that use belts include the M240, the M249, the M134 Minigun, and the PK machine gun.
Firing mechanisms Matchlock Matchlocks were the first and simplest firearm firing mechanisms developed. In the matchlock mechanism, the powder in the gun barrel was ignited by a piece of burning cord called a "match". The match was wedged into one end of an S-shaped piece of steel. When the trigger (often actually a lever) was pulled, the match was brought into the open end of a "touch hole" at the base of the gun barrel, which contained a very small quantity of gunpowder, igniting the main charge of gunpowder in the gun barrel. The match usually had to be relit after each firing. The main parts of the matchlock firing mechanism are the pan, match, arm, and trigger. A benefit of the pan and arm swivel being moved to the side of the gun was that it gave a clear line of fire. An advantage of the matchlock firing mechanism is that it did not misfire. However, it also came with some disadvantages. One disadvantage involved weather: in rain the match could not be kept lit to fire the weapon. Another issue with the match was that it could give away the position of soldiers because of its glow, sound, and smell. While European pistols were equipped with wheellock and flintlock mechanisms, Asian pistols used matchlock mechanisms. Wheellock The wheellock action, a successor to the matchlock, predated the flintlock. Despite its many faults, the wheellock was a significant improvement over the matchlock in terms of both convenience and safety, since it eliminated the need to keep a smoldering match in proximity to loose gunpowder. It operated using a small wheel (much like that on a cigarette lighter) which was wound up with a key before use and which, when the trigger was pulled, spun against a flint, creating the shower of sparks that ignited the powder in the touch hole. Supposedly invented by Leonardo da Vinci (1452–1519), the Italian Renaissance man, the wheellock action was an innovation that was not widely adopted due to the high cost of the clockwork mechanism.
Flintlock The flintlock action represented a major innovation in firearm design. The spark used to ignite the gunpowder in the touch hole came from a sharpened piece of flint clamped in the jaws of a "cock" which, when released by the trigger, struck a piece of steel called the "frizzen" to generate the necessary sparks. (The spring-loaded arm that holds a piece of flint or pyrite is referred to as a cock because of its resemblance to a rooster.) The cock had to be manually reset after each firing, and the flint had to be replaced periodically due to wear from striking the frizzen. (See also flintlock mechanism, snaphance, Miquelet lock.) The flintlock was widely used during the 17th, 18th, and 19th centuries in both muskets and rifles. Percussion cap Percussion caps (caplock mechanisms), coming into wide service in the early 19th century, offered a dramatic improvement over flintlocks. With the percussion-cap mechanism, the small primer charge of gunpowder used in all preceding firearms was replaced by a completely self-contained explosive charge contained in a small brass "cap". The cap was fastened to the touch hole of the gun (extended to form a "nipple") and ignited by the impact of the gun's "hammer". (The hammer is roughly the same as the cock found on flintlocks except that it does not clamp onto anything.) In the case of percussion caps the hammer was hollow on the end to fit around the cap in order to keep the cap from fragmenting and injuring the shooter. Once struck, the flame from the cap in turn ignited the main charge of gunpowder, as with the flintlock, but there was no longer any need to charge the touch hole with gunpowder, and even better, the touch hole was no longer exposed to the elements. 
As a result, the percussion-cap mechanism was considerably safer, far more weatherproof, and vastly more reliable (cloth-bound cartridges containing a pre-measured charge of gunpowder and a ball had been in regular military service for many years, but the exposed gunpowder in the entry to the touch hole had long been a source of misfires). All muzzleloaders manufactured since the second half of the 19th century use percussion caps except those built as replicas of the flintlock or earlier firearms. Cartridges Frenchman Louis-Nicolas Flobert invented the first rimfire metallic cartridge in 1845. His cartridge consisted of a percussion cap with a bullet attached to the top. Flobert then made what he called "parlor guns" for this cartridge, as these rifles and pistols were designed to be shot in indoor shooting-parlors in large homes. These 6mm Flobert cartridges do not contain any powder, the only propellant substance contained in the cartridge is the percussion cap. In English-speaking countries, the 6mm Flobert cartridge corresponds to .22 BB Cap and .22 CB Cap ammunition. These cartridges have a relatively low muzzle-velocity of around 700 ft/s (210 m/s). Cartridges represented a major innovation: firearms ammunition, previously delivered as separate bullets and powder, was combined in a single metallic (usually brass) cartridge containing a percussion cap, powder, and a bullet in one weatherproof package. The main technical advantage of the brass cartridge-case was the effective and reliable sealing of high-pressure gasses at the breech, as the gas pressure forces the cartridge case to expand outward, pressing it firmly against the inside of the gun-barrel chamber. This prevents the leakage of hot gas which could injure the shooter. The brass cartridge also opened the way for modern repeating arms, by uniting the bullet, gunpowder and primer into one assembly that could be fed reliably into the breech by a mechanical action in the firearm. 
Before this, a "cartridge" was simply a pre-measured quantity of gunpowder together with a ball in a small cloth bag (or rolled paper cylinder), which also acted as wadding for the charge and ball. This early form of cartridge had to be rammed into the muzzleloader's barrel, and either a small charge of gunpowder in the touch hole or an external percussion cap mounted on the touch hole ignited the gunpowder in the cartridge. Cartridges with built-in percussion caps (called "primers") continue to this day to be the standard in firearms. In cartridge-firing firearms, a hammer (or a firing-pin struck by the hammer) strikes the cartridge primer, which then ignites the gunpowder within. The primer charge is at the base of the cartridge, either within the rim (a "rimfire" cartridge) or in a small percussion cap embedded in the center of the base (a "centerfire" cartridge). As a rule, centerfire cartridges are more powerful than rimfire cartridges, operating at considerably higher pressures than rimfire cartridges. Centerfire cartridges are also safer, as a dropped rimfire cartridge has the potential to discharge if its rim strikes the ground with sufficient force to ignite the primer. This is practically impossible with most centerfire cartridges. Nearly all contemporary firearms load cartridges directly into their breech. Some additionally or exclusively load from a magazine that holds multiple cartridges. A magazine is defined as a part of the firearm which exists to store ammunition and to assist in its feeding by the action into the breech (such as through the rotation of a revolver's cylinder or by spring-loaded platforms in most pistol and rifle designs). Some magazines, such as that of most centerfire hunting-rifles and all revolvers, are internal to and inseparable from the firearm, and are loaded by using a "clip". 
A clip (a term often mistakenly used for a detachable "magazine") is a device that holds the ammunition by the rim of the case and is designed to assist the shooter in reloading the firearm's magazine. Examples include revolver speedloaders, the stripper clip used to aid loading rifles such as the Lee–Enfield or
manufacturing tolerances, most of which have been improved over time. Machine guns are often mounted on vehicles or helicopters and have been used since World War I as offensive firearms in fighter aircraft and tanks (e.g. for air combat or suppressing fire for ground troop support). The definition of a machine gun is different in U.S. law. The National Firearms Act and Firearm Owners Protection Act define a "machine gun" in the United States Code Title 26, Subtitle E, Chapter 53, Subchapter B, Part 1, § 5845 as: "... any firearm which shoots ... automatically more than one shot, without manual reloading, by a single function of the trigger". "Machine gun" is therefore largely synonymous with "automatic weapon" in U.S. civilian parlance, covering all automatic firearms. Sniper rifles The definition of a sniper rifle is disputed among military, police and civilian observers alike; however, most generally define a "sniper rifle" as a high-powered, semi-automatic or bolt-action precision rifle with an accurate range greater than that of a standard rifle. These are often purpose-built for their applications. For example, a police sniper rifle may differ in specifications from a military rifle. Police snipers generally do not engage targets at extreme range, but rather targets at medium range. They may also have multiple targets within the shorter range, and thus a semi-automatic model is preferred to a bolt action. They may also be more compact than mil-spec rifles, as police marksmen may need more portability. On the other hand, a military rifle is more likely to use a higher-powered cartridge to defeat body armor or medium-light cover. Military rifles are somewhat more commonly bolt-action, as these are simpler to build and maintain; with fewer moving parts overall, they are also much more reliable under adverse conditions. They may also mount a more powerful scope to acquire targets further away.
Sniper units did not become prominent until World War I, when the Germans demonstrated their usefulness on the battlefield. Since then, they have become irrevocably embedded in warfare. Examples of sniper rifles include the Accuracy International AWM, Sako TRG-42 and the CheyTac M200. Examples of specialized sniper cartridges include the .338 Lapua Magnum, .300 Winchester Magnum, and .408 CheyTac rounds. Submachine guns A submachine gun is a magazine-fed firearm, usually smaller than other automatic firearms, that fires pistol-caliber ammunition; for this reason certain submachine guns can also be referred to as machine pistols, especially when referring to handgun-sized designs such as the Škorpion vz. 61 and Glock 18. Well-known examples are the Israeli Uzi and Heckler & Koch MP5, which use the 9×19mm Parabellum cartridge, and the American Thompson submachine gun, which fires .45 ACP. Because of their small size and limited projectile penetration compared to high-power rifle rounds, submachine guns are commonly favored by military, paramilitary and police forces for close-quarters engagements such as inside buildings, in urban areas or in trench complexes. Submachine guns were originally about the size of carbines. Because they fire pistol ammunition, they have limited long-range use, but in close combat they can be fired in fully automatic mode in a controllable manner, owing to the lighter recoil of the pistol ammunition. They are also extremely inexpensive and simple to build in time of war, enabling a nation to quickly arm its military. In the latter half of the 20th century, submachine guns were miniaturized to the point of being only slightly larger than some large handguns. The most widely used submachine gun at the end of the 20th century was the Heckler & Koch MP5.
The MP5 is actually designated as a "machine pistol" by Heckler & Koch (MP5 stands for Maschinenpistole 5, or Machine Pistol 5), although some reserve this designation for even smaller submachine guns such as the MAC-10 and Glock 18, which are about the size and shape of pistols. Automatic rifles An automatic rifle is a magazine-fed firearm, wielded by a single infantryman, that is chambered for rifle cartridges and capable of automatic fire. The M1918 Browning Automatic Rifle was the first U.S. infantry weapon of this type, and was generally used for suppressive or support fire in the role now usually filled by the light machine gun. Other early automatic rifles include the Fedorov Avtomat and the Huot Automatic Rifle. Later, German forces fielded the Sturmgewehr 44 during World War II, a light automatic rifle firing a reduced power "intermediate cartridge". This design was to become the basis for the "assault rifle" subclass of automatic weapons, as contrasted with "battle rifles", which generally fire a traditional "full-power" rifle cartridge. Assault rifles In World War II, Germany introduced the StG 44, and brought to the forefront of firearm technology what eventually became the class of firearm most widely adopted by the military, the assault rifle. An assault rifle is usually slightly smaller than a battle rifle such as the American M14, but the chief differences defining an assault rifle are select-fire capability and the use of a rifle round of lesser power, known as an intermediate cartridge. Soviet engineer Mikhail Kalashnikov quickly adapted the German concept, using a less-powerful 7.62×39mm cartridge derived from the standard 7.62×54mmR Russian battle rifle round, to produce the AK-47, which has become the world's most widely used assault rifle. 
Soon after World War II, the Automatic Kalashnikov AK-47 assault rifle began to be fielded by the Soviet Union and its allies in the Eastern Bloc, as well as by nations such as China, North Korea, and North Vietnam. In the United States, the assault rifle design was later in coming; the replacement for the M1 Garand of WWII was another John Garand design chambered for the new 7.62×51mm NATO cartridge: the select-fire M14, which was used by the U.S. military until the 1960s. The significant recoil of the M14 when fired in full-automatic mode was seen as a problem, as it reduced accuracy, and in the 1960s it was replaced by Eugene Stoner's AR-15, which also marked a switch from the powerful .30 caliber cartridges used by the U.S. military up until early in the Vietnam War to the much less powerful but far lighter and lighter-recoiling .223 caliber (5.56mm) intermediate cartridge. The military later designated the AR-15 as the "M16". The civilian version of the M16 continues to be known as the AR-15 and looks exactly like the military version, although to conform to ATF regulations in the U.S., it lacks the mechanism that permits fully automatic fire. Variants of both the M16 and the AK-47 are still in wide international use today, though other automatic rifle designs have since been introduced. A smaller version of the M16A2, the M4 carbine, is widely used by U.S. and NATO tank and vehicle crews, airborne troops, support staff, and in other scenarios where space is limited. The IMI Galil, an Israeli-designed weapon based on the action of the AK-47, is in use by Israel, Italy, Burma, the Philippines, Peru, and Colombia. Swiss Arms of Switzerland produces the SIG SG 550 assault rifle used by France, Chile, and Spain among others, and Steyr Mannlicher produces the AUG, a bullpup rifle in use in Austria, Australia, New Zealand, Ireland, and Saudi Arabia among other nations. Modern designs call for compact weapons retaining firepower.
The bullpup design, by mounting the magazine behind the trigger, unifies the accuracy and firepower of the traditional assault rifle with the compact size of the submachine gun (though submachine guns are still used); examples are the French FAMAS and the British SA80. Personal defense weapons A recently developed class of firearm is the personal defense weapon or PDW, which is in simplest terms a submachine gun designed to fire ammunition with ballistic performance similar to rifle cartridges. While a submachine gun is desirable for its compact size and ammunition capacity, its pistol cartridges lack the penetrating capability of a rifle round. Conversely, rifle bullets can pierce light armor and are easier to shoot accurately, but even a carbine such as the Colt M4 is larger and/or longer than a submachine gun, making it harder to maneuver in close quarters. The solution many firearms manufacturers have presented is a weapon resembling a submachine gun in size and general configuration, but which fires a higher-powered armor-penetrating round (often specially designed for the weapon), thus combining the advantages of a carbine and submachine gun. This has also earned PDWs an infrequently used nickname: submachine carbines. The FN P90 and Heckler & Koch MP7 are the best-known examples of PDWs. Battle rifles Battle rifles are another subtype of rifle, usually defined as selective-fire rifles that use full-power rifle cartridges, examples of which include the 7.62×51mm NATO, 7.92×57mm Mauser, and 7.62×54mmR. These serve similar purposes to assault rifles, as both are usually employed by ground infantry. However, some prefer battle rifles for their more powerful cartridge, despite the added recoil. Some semi-automatic sniper rifles are configured from battle rifles. Function Firearms are also categorized by their functioning cycle or "action", which describes their loading, firing, and unloading cycle.
Manual Manual actions are the earliest evolution of the firearm, and there are many types. These can be divided into two basic categories: single shot and repeating. A single-shot firearm can only be fired once per equipped barrel before it must be reloaded or charged via an external mechanism or series of steps. A repeating firearm can be fired multiple times, but can only be fired once with each pull of the trigger; between trigger pulls, the firearm's action must be reloaded or charged via an internal mechanism. Lever action A lever-action gun has a lever that is pulled down and then back up to eject the spent cartridge and load a new round. Pump action Pump-action weapons are primarily shotguns. In a pump action, the user slides a grip (the fore-end) back and forth, which ejects the spent round and chambers a new one. Semi-automatic A semi-automatic, self-loading, or "auto loader" firearm is one that performs all steps necessary to prepare it for firing again after a single discharge, until cartridges are no longer available in the weapon's feed device or magazine. Auto loaders fire one round with each pull of the trigger. Some people confuse the term with "fully automatic" firearms. (See next.) While some semi-automatic rifles may resemble military-style firearms, they are not automatic weapons, which continue to fire for as long as the trigger is depressed. Automatic An automatic firearm, or "fully automatic", "fully auto", or "full auto", is generally defined as one that continues to load and fire cartridges from its magazine as long as the trigger is depressed (and until the magazine is depleted of available ammunition). The first weapon generally considered in this category is the Gatling gun, originally a carriage-mounted, crank-operated firearm with multiple rotating barrels that was fielded in the American Civil War.
The modern trigger-actuated machine gun began with various designs developed in the late 19th century and fielded in World War I, such as the Maxim gun, Lewis Gun, and MG 08 "Spandau". Most automatic weapons are classed as long guns (as the ammunition used is of similar type as for rifles, and the recoil of the weapon's rapid fire is better controlled with two hands), but handgun-sized automatic weapons also exist, generally in the "submachine gun" or "machine pistol" class. Selective fire Selective fire, or "select fire", means the capability of a weapon's fire control to be adjusted among semi-automatic, fully automatic, and burst firing modes. The modes are chosen by means of a selector, which varies depending on the weapon's design. Some selective-fire weapons have burst-fire mechanisms built in to limit the maximum number of shots fired in fully automatic mode, with the most common limits being two or three rounds per trigger pull. The presence of selective-fire modes on firearms allows more efficient use of ammunition for specific tactical needs, either precision-aimed or suppressive fire. This capability is most commonly found on military weapons of the 20th and 21st centuries, most notably assault rifles. History The first primitive firearms were invented about 1250 AD in China, when the man-portable fire lance (a bamboo or metal tube that could shoot ignited gunpowder) was combined with projectiles such as scrap metal, broken porcelain, or darts/arrows. An early depiction of a firearm is a sculpture from a cave in Sichuan, China. The sculpture dates to the 12th century and represents a figure carrying a vase-shaped bombard, with flames and a cannonball coming out of it. The oldest surviving gun, a hand cannon made of bronze, has been dated to 1288 because it was discovered at a site in modern-day Acheng District, Heilongjiang, China, where the Yuan Shi records that battles were fought at that time.
The firearm had a 6.9-inch barrel of 1-inch diameter, a 2.6-inch chamber for the gunpowder and a socket for the firearm's handle. It is 13.4 inches long and weighs 7.8 pounds without the handle, which would have been made of wood. The Arabs and Mamluks had firearms in the late 13th century. Europeans obtained firearms in the 14th century. The Koreans adopted firearms from the Chinese in the 14th century. The Iranians (first the Aq Qoyunlu and Safavids) and Indians (first the Mughals) all acquired them no later than the 15th century, from the Ottoman Turks. The people of the Nusantara archipelago of Southeast Asia used the long arquebus at least by the last quarter of the 15th century. Even though the knowledge of making gunpowder-based weapons in the Nusantara archipelago had been known after the failed Mongol invasion of Java (1293), and the predecessor of firearms, the pole gun (bedil tombak), was recorded as being used by Java in 1413, the knowledge of making "true" firearms came much later, after the middle of the 15th century. It was brought by the Islamic nations of West Asia, most probably the Arabs. The precise year of introduction is unknown, but it may safely be concluded to be no earlier than 1460. Before the arrival of the Portuguese in Southeast Asia, the natives already possessed firearms, the Java arquebus. The technology of firearms in Southeast Asia further improved after the Portuguese capture of Malacca (1511). Starting in 1513, the traditions of German-Bohemian gun-making merged with Turkish gun-making traditions. This resulted in the Indo-Portuguese tradition of matchlocks. Indian craftsmen modified the design by introducing a very short, almost pistol-like buttstock held against the cheek, not the shoulder, when aiming. They also reduced the caliber and made the gun lighter and more balanced. This was a hit with the Portuguese, who did much of their fighting aboard ships and river craft and valued a more compact gun.
The Malaccan gunfounders, said to be on a par with those of Germany, quickly adapted these new firearms, and thus a new type of arquebus, the istinggar, appeared. The Japanese did not acquire firearms until the 16th century, and then from the Portuguese rather than from the Chinese. Developments in firearms accelerated during the 19th and 20th centuries. Breech-loading became more or less a universal standard for the reloading of most hand-held firearms and continues to be so, with some notable exceptions (such as mortars). Instead of loading individual rounds into weapons, magazines holding multiple munitions were adopted, which aided rapid reloading. Automatic and semi-automatic firing mechanisms meant that a single soldier could fire many more rounds in a minute than a vintage weapon could fire over the course of a battle. Polymers and alloys in firearm construction made weaponry progressively lighter and thus easier to deploy. Ammunition changed over the centuries from simple metallic ball-shaped projectiles that rattled down the barrel to bullets and cartridges manufactured to high precision. Especially in the past century, particular attention has focused on accuracy and sighting, making firearms altogether far more accurate than ever before. More than any single factor, though, firearms have proliferated due to the advent of mass production, enabling arms manufacturers to produce large quantities of weaponry to a consistent standard. Velocities of bullets increased with the use of a "jacket" of metals such as copper or copper alloys that covered a lead core and allowed the bullet to glide down the barrel more easily than exposed lead. Such bullets are designated "full metal jacket" (FMJ). FMJ bullets are less likely to fragment on impact and are more likely to pass through a target while imparting less energy. Hence, FMJ bullets cause less tissue damage than non-jacketed bullets that expand.
This led to their adoption for military use by countries adhering to the Hague Convention of 1899. That said, the basic principle behind firearm operation remains unchanged to this day. A musket of several centuries ago is still similar in principle to a modern-day assault rifle, using the expansion of gases to propel projectiles over long distances, albeit less accurately and rapidly. Evolution Early models Fire lances The Chinese fire lance from the 10th century was the direct predecessor to the modern concept of the firearm. It was not a gun itself, but an addition to soldiers' spears. Originally it consisted of paper or bamboo barrels containing incendiary gunpowder that could be lit once to project flames at the enemy. Sometimes Chinese troops would place small projectiles within the barrel that would also be projected when the gunpowder was lit, but most of the explosive force would create flames. Later, the barrel was changed to metal, so that a more explosive gunpowder could be used and put more force into the propulsion of projectiles. Hand cannons The original predecessors of all firearms, the Chinese fire lance and hand cannon, were loaded with gunpowder and the shot (initially lead shot,
covertly collect information about the British in New York. Washington had disregarded incidents of disloyalty by Benedict Arnold, who had distinguished himself in many battles. During mid-1780, Arnold began supplying British spymaster John André with sensitive information intended to compromise Washington and capture West Point, a key American defensive position on the Hudson River. Historians have suggested possible reasons for Arnold's treachery: his anger at losing promotions to junior officers and repeated slights from Congress. He was also deeply in debt, profiteering from the war, and disappointed by Washington's lack of support during his eventual court-martial. Arnold repeatedly asked for command of West Point, and Washington finally agreed in August. Arnold met André on September 21, giving him plans to take over the garrison. Militia forces captured André and discovered the plans, but Arnold escaped to New York. Washington recalled the commanders positioned under Arnold at key points around the fort to prevent any complicity, but he did not suspect Arnold's wife Peggy. Washington assumed personal command at West Point and reorganized its defenses. André's trial for espionage ended in a death sentence, and Washington offered to return him to the British in exchange for Arnold, but Clinton refused. André was hanged on October 2, 1780, to deter other spies, despite his last request to face a firing squad. Southern theater and Yorktown In late 1778, General Clinton shipped 3,000 troops from New York to Georgia and launched a Southern invasion against Savannah, reinforced by 2,000 British and Loyalist troops. They repelled an attack by Patriots and French naval forces, which bolstered the British war effort. In mid-1779, Washington attacked Iroquois warriors of the Six Nations to force Britain's Indian allies out of New York, from which they had assaulted New England towns.
In response, Indian warriors joined with Loyalist rangers led by Walter Butler and killed more than 200 frontiersmen in June, laying waste to the Wyoming Valley in Pennsylvania. Washington retaliated by ordering General John Sullivan to lead an expedition to effect "the total destruction and devastation" of Iroquois villages and take their women and children hostage. Those who managed to escape fled to Canada. Washington's troops went into quarters at Morristown, New Jersey during the winter of 1779–1780 and suffered their worst winter of the war, with temperatures well below freezing. New York Harbor was frozen over, snow and ice covered the ground for weeks, and the troops again lacked provisions. Clinton assembled 12,500 troops and attacked Charlestown, South Carolina in January 1780, defeating General Benjamin Lincoln who had only 5,100 Continental troops. The British went on to occupy the South Carolina Piedmont in June, with no Patriot resistance. Clinton returned to New York and left 8,000 troops commanded by General Charles Cornwallis. Congress replaced Lincoln with Horatio Gates; he failed in South Carolina and was replaced by Washington's choice of Nathanael Greene, but the British already had the South in their grasp. Washington was reinvigorated, however, when Lafayette returned from France with more ships, men, and supplies, and 5,000 veteran French troops led by Marshal Rochambeau arrived at Newport, Rhode Island in July 1780. French naval forces then landed, led by Admiral de Grasse, and Washington encouraged Rochambeau to move his fleet south to launch a joint land and naval attack on Arnold's troops. Washington's army went into winter quarters at New Windsor, New York in December 1780, and Washington urged Congress and state officials to expedite provisions in hopes that the army would not "continue to struggle under the same difficulties they have hitherto endured".
On March 1, 1781, Congress ratified the Articles of Confederation, but the government that took effect on March 2 did not have the power to levy taxes, and it loosely held the states together. General Clinton sent Benedict Arnold, now a British brigadier general with 1,700 troops, to Virginia to capture Portsmouth and conduct raids on Patriot forces from there; Washington responded by sending Lafayette south to counter Arnold's efforts. Washington initially hoped to bring the fight to New York, drawing off British forces from Virginia and ending the war there, but Rochambeau advised Grasse that Cornwallis in Virginia was the better target. Grasse's fleet arrived off the Virginia coast, and Washington saw the advantage. He made a feint towards Clinton in New York, then headed south to Virginia. The Siege of Yorktown was a decisive Allied victory by the combined forces of the Continental Army commanded by General Washington, the French Army commanded by General Comte de Rochambeau, and the French Navy commanded by Admiral de Grasse, in the defeat of Cornwallis' British forces. On August 19, the march to Yorktown led by Washington and Rochambeau began, which is now known as the "celebrated march". Washington was in command of an army of 7,800 Frenchmen, 3,100 militia, and 8,000 Continentals. Not well experienced in siege warfare, Washington often deferred to the judgment of General Rochambeau and used his advice about how to proceed; however, Rochambeau never challenged Washington's authority as the battle's commanding officer. By late September, Patriot-French forces surrounded Yorktown, trapped the British army, and prevented British reinforcements from Clinton in the North, while the French navy emerged victorious at the Battle of the Chesapeake. The final American offensive was begun with a shot fired by Washington.
The siege ended with a British surrender on October 19, 1781; over 7,000 British soldiers were made prisoners of war, in the last major land battle of the American Revolutionary War. Washington negotiated the terms of surrender for two days, and the official signing ceremony took place on October 19; Cornwallis claimed illness and was absent, sending General Charles O'Hara as his proxy. As a gesture of goodwill, Washington held a dinner for the American, French, and British generals, all of whom fraternized on friendly terms and identified with one another as members of the same professional military caste. After the surrender at Yorktown, a situation developed that threatened relations between the newly independent America and Britain. Following a series of retributive executions between Patriots and Loyalists, Washington, on May 18, 1782, wrote in a letter to General Moses Hazen that a British captain would be executed in retaliation for the execution of Joshua Huddy, a popular Patriot leader, who was hanged at the direction of the Loyalist Richard Lippincott. Washington wanted Lippincott himself to be executed but was rebuffed. Subsequently, Charles Asgill was chosen instead, by a drawing of lots from a hat. This was a violation of the 14th article of the Yorktown Articles of Capitulation, which protected prisoners of war from acts of retaliation. Later, Washington's feelings on the matter changed, and in a letter of November 13, 1782, to Asgill, he acknowledged Asgill's letter and situation, expressing his desire not to see any harm come to him. After much consideration between the Continental Congress, Alexander Hamilton, Washington, and appeals from the French Crown, Asgill was finally released, and Washington issued him a pass that allowed his passage to New York. Demobilization and resignation When peace negotiations began in April 1782, both the British and French began gradually evacuating their forces.
The American treasury was empty, and unpaid, mutinous soldiers forced the adjournment of Congress; Washington dispelled unrest by suppressing the Newburgh Conspiracy in March 1783, and Congress promised officers a five-year bonus. Washington submitted an account of $450,000 in expenses which he had advanced to the army. The account was settled, though it was allegedly vague about large sums and included expenses his wife had incurred through visits to his headquarters. The following month, a Congressional committee led by Alexander Hamilton began adapting the army for peacetime. In August 1783, Washington gave the Army's perspective to the committee in his Sentiments on a Peace Establishment. He advised Congress to keep a standing army, create a "national militia" of separate state units, and establish a navy and a national military academy. The Treaty of Paris was signed on September 3, 1783, and Great Britain officially recognized the independence of the United States. Washington then disbanded his army, giving a farewell address to his soldiers on November 2. During this time, Washington oversaw the evacuation of British forces in New York and was greeted by parades and celebrations. There he announced that Colonel Henry Knox had been promoted to commander-in-chief. Washington and Governor George Clinton took formal possession of the city on November 25. In early December 1783, Washington bade farewell to his officers at Fraunces Tavern and resigned as commander-in-chief soon thereafter, refuting Loyalist predictions that he would not relinquish his military command. In a final appearance in uniform, he gave a statement to the Congress: "I consider it an indispensable duty to close this last solemn act of my official life, by commending the interests of our dearest country to the protection of Almighty God, and those who have the superintendence of them, to his holy keeping."
Washington's resignation was acclaimed at home and abroad and showed a skeptical world that the new republic would not degenerate into chaos. The same month, Washington was appointed president-general of the Society of the Cincinnati, a newly established hereditary fraternity of Revolutionary War officers. He served in this capacity for the remainder of his life. Early republic (1783–1789) Return to Mount Vernon Washington longed to return home, having spent just ten days at Mount Vernon during the years of war. He arrived on Christmas Eve, delighted to be "free of the bustle of a camp and the busy scenes of public life". He was a celebrity and was fêted during a visit to his mother at Fredericksburg in February 1784, and he received a constant stream of visitors wishing to pay their respects to him at Mount Vernon. Washington reactivated his interests in the Great Dismal Swamp and Potomac canal projects begun before the war, though neither paid him any dividends, and he undertook a 34-day, 680-mile (1090 km) trip to check on his land holdings in the Ohio Country. He oversaw the completion of the remodeling work at Mount Vernon, which transformed his residence into the mansion that survives to this day, although his financial situation was not strong. Creditors paid him in depreciated wartime currency, and he owed significant amounts in taxes and wages. Mount Vernon had made no profit during his absence, and he saw persistently poor crop yields due to pestilence and poor weather. His estate recorded its eleventh year running at a deficit in 1787, and there was little prospect of improvement. Washington undertook a new landscaping plan and succeeded in cultivating a range of fast-growing trees and shrubs that were native to North America. He also began breeding mules after having been gifted a Spanish jack by King Charles III of Spain in 1784.
There were few mules in the United States at that time, and he believed that properly bred mules would revolutionize agriculture and transportation. Constitutional Convention of 1787 Before returning to private life in June 1783, Washington called for a strong union. Though he was concerned that he might be criticized for meddling in civil matters, he sent a circular letter to all the states, maintaining that the Articles of Confederation were no more than "a rope of sand" linking the states. He believed the nation was on the verge of "anarchy and confusion", was vulnerable to foreign intervention, and that a national constitution would unify the states under a strong central government. When Shays' Rebellion erupted in Massachusetts on August 29, 1786, over taxation, Washington was further convinced that a national constitution was needed. Some nationalists feared that the new republic had descended into lawlessness, and they met together on September 11, 1786, at Annapolis to ask Congress to revise the Articles of Confederation. One of their biggest efforts, however, was getting Washington to attend. Congress agreed to a Constitutional Convention to be held in Philadelphia in spring 1787, and each state was to send delegates. On December 4, 1786, Washington was chosen to lead the Virginia delegation, but he declined on December 21. He had concerns about the legality of the convention and consulted James Madison, Henry Knox, and others. They persuaded him to attend it, however, as his presence might induce reluctant states to send delegates and smooth the way for the ratification process. On March 28, Washington told Governor Edmund Randolph that he would attend the convention but made it clear that he had been urged to attend. Washington arrived in Philadelphia on May 9, 1787, though a quorum was not attained until Friday, May 25. Benjamin Franklin nominated Washington to preside over the convention, and he was unanimously elected to serve as president general.
The convention's state-mandated purpose was to revise the Articles of Confederation with "all such alterations and further provisions" required to improve them, and the new government would be established when the resulting document was "duly confirmed by the several states". Governor Edmund Randolph of Virginia introduced Madison's Virginia Plan on May 27, the third day of the convention. It called for an entirely new constitution and a sovereign national government, which Washington highly recommended. Washington wrote Alexander Hamilton on July 10: "I almost despair of seeing a favorable issue to the proceedings of our convention and do therefore repent having had any agency in the business." Nevertheless, he lent his prestige to the goodwill and work of the other delegates. He unsuccessfully lobbied many to support ratification of the Constitution, such as anti-federalist Patrick Henry; Washington told him "the adoption of it under the present circumstances of the Union is in my opinion desirable" and declared the alternative would be anarchy. Washington and Madison then spent four days at Mount Vernon evaluating the new government's transition. Chancellor of William & Mary In 1788, the Board of Visitors of the College of William & Mary decided to re-establish the position of Chancellor, and elected Washington to the office on January 18. The College Rector Samuel Griffin wrote to Washington inviting him to the post, and in a letter dated April 30, 1788, Washington accepted the position of the 14th Chancellor of the College of William & Mary. He continued to serve in the post through his presidency until his death on December 14, 1799. First presidential election The delegates to the Convention anticipated a Washington presidency and left it to him to define the office once elected. The state electors under the Constitution voted for the president on February 4, 1789, and Washington suspected that most republicans had not voted for him. 
The mandated March 4 date passed without a Congressional quorum to count the votes, but a quorum was reached on April 5. The votes were tallied the next day, and Congressional Secretary Charles Thomson was sent to Mount Vernon to tell Washington he had been elected president. Washington won the majority of every state's electoral votes; John Adams received the next highest number of votes and therefore became vice president. Washington had "anxious and painful sensations" about leaving the "domestic felicity" of Mount Vernon, but departed for New York City on April 16 to be inaugurated. Presidency (1789–1797) Washington was inaugurated on April 30, 1789, taking the oath of office at Federal Hall in New York City. His coach was led by militia and a marching band and followed by statesmen and foreign dignitaries in an inaugural parade, with a crowd of 10,000. Chancellor Robert R. Livingston administered the oath, using a Bible provided by the Masons, after which the militia fired a 13-gun salute. Washington read a speech in the Senate Chamber, asking "that Almighty Being who rules over the universe, who presides in the councils of nations—and whose providential aids can supply every human defect, consecrate the liberties and happiness of the people of the United States". Though he wished to serve without a salary, Congress insisted adamantly that he accept it, later providing Washington $25,000 per year to defray costs of the presidency. Washington wrote to James Madison: "As the first of everything in our situation will serve to establish a precedent, it is devoutly wished on my part that these precedents be fixed on true principles." To that end, he preferred the title "Mr. President" over more majestic names proposed by the Senate, including "His Excellency" and "His Highness the President". His executive precedents included the inaugural address, messages to Congress, and the cabinet form of the executive branch.
Washington had planned to resign after his first term, but the political strife in the nation convinced him he should remain in office. He was an able administrator and a judge of talent and character, and he regularly talked with department heads to get their advice. He tolerated opposing views, despite fears that a democratic system would lead to political violence, and he conducted a smooth transition of power to his successor. He remained non-partisan throughout his presidency and opposed the divisiveness of political parties, but he favored a strong central government, was sympathetic to a Federalist form of government, and leery of the Republican opposition. Washington dealt with major problems. The old Confederation lacked the powers to handle its workload and had weak leadership, no executive, a small bureaucracy of clerks, a large debt, worthless paper money, and no power to establish taxes. He had the task of assembling an executive department and relied on Tobias Lear for advice selecting its officers. Great Britain refused to relinquish its forts in the American West, and Barbary pirates preyed on American merchant ships in the Mediterranean at a time when the United States did not even have a navy. Cabinet and executive departments Congress created executive departments in 1789, including the State Department in July, the Department of War in August, and the Treasury Department in September. Washington appointed fellow Virginian Edmund Randolph as Attorney General, Samuel Osgood as Postmaster General, Thomas Jefferson as Secretary of State, and Henry Knox as Secretary of War. Finally, he appointed Alexander Hamilton as Secretary of the Treasury. Washington's cabinet became a consulting and advisory body, not mandated by the Constitution. Washington's cabinet members formed rival parties with sharply opposing views, most fiercely illustrated between Hamilton and Jefferson. 
Washington restricted cabinet discussions to topics of his choosing, without participating in the debate. He occasionally requested cabinet opinions in writing and expected department heads to agreeably carry out his decisions. Domestic issues Washington was apolitical and opposed the formation of parties, suspecting that conflict would undermine republicanism. He exercised great restraint in using his veto power, writing that "I give my Signature to many Bills with which my Judgment is at variance…." His closest advisors formed two factions, portending the First Party System. Secretary of the Treasury Alexander Hamilton formed the Federalist Party to promote national credit and a financially powerful nation. Secretary of State Thomas Jefferson opposed Hamilton's agenda and founded the Jeffersonian Republicans. Washington favored Hamilton's agenda, however, and it ultimately went into effect—resulting in bitter controversy. Washington proclaimed November 26 as a day of Thanksgiving to encourage national unity, declaring: "It is the duty of all nations to acknowledge the providence of Almighty God, to obey His will, to be grateful for His benefits, and humbly to implore His protection and favor." He spent that day fasting and visiting debtors in prison to provide them with food and beer. African Americans In response to two antislavery petitions that were presented to Congress in 1790, slaveholders in Georgia and South Carolina objected and threatened to "blow the trumpet of civil war". Washington and Congress responded with a series of racist measures: naturalized citizenship was denied to black immigrants; blacks were barred from serving in state militias; the Southwest Territory that would soon become the state of Tennessee was permitted to maintain slavery; and two more slave states were admitted (Kentucky in 1792, and Tennessee in 1796).
On February 12, 1793, Washington signed into law the Fugitive Slave Act, which overrode state laws and courts, allowing agents to cross state lines to capture and return escaped slaves. Many free blacks in the north decried the law, believing it would allow bounty hunting and the kidnappings of blacks. The Fugitive Slave Act gave effect to the Constitution's Fugitive Slave Clause, and the Act was passed overwhelmingly in Congress (e.g. the vote was 48 to 7 in the House). On the anti-slavery side of the ledger, in 1789 Washington signed a reenactment of the Northwest Ordinance which had freed all slaves brought after 1787 into a vast expanse of federal territory north of the Ohio River, except for slaves escaping from slave states. That 1787 law lapsed when the new U.S. Constitution was ratified in 1789. The Slave Trade Act of 1794, which sharply limited American involvement in the Atlantic slave trade, was also signed by Washington. Congress also acted on February 18, 1791, to admit the free state of Vermont into the Union as the 14th state as of March 4, 1791. National Bank Washington's first term was largely devoted to economic concerns, in which Hamilton had devised various plans to address matters. The establishment of public credit became a primary challenge for the federal government. Hamilton submitted a report to a deadlocked Congress, and he, Madison, and Jefferson reached the Compromise of 1790 in which Jefferson agreed to Hamilton's debt proposals in exchange for moving the nation's capital temporarily to Philadelphia and then south near Georgetown on the Potomac River. The terms were legislated in the Funding Act of 1790 and the Residence Act, both of which Washington signed into law. Congress authorized the assumption and payment of the nation's debts, with funding provided by customs duties and excise taxes. Hamilton created controversy among Cabinet members by advocating establishing the First Bank of the United States.
Madison and Jefferson objected, but the bank easily passed Congress. Jefferson and Randolph insisted that the new bank was beyond the authority granted by the Constitution; Hamilton countered that it was permitted under the government's implied powers. Washington sided with Hamilton and signed the legislation on February 25, and the rift became openly hostile between Hamilton and Jefferson. The nation's first financial crisis occurred in March 1792. Hamilton's Federalists exploited large loans to gain control of U.S. debt securities, causing a run on the national bank; the markets returned to normal by mid-April. Jefferson believed Hamilton was part of the scheme, despite Hamilton's efforts to ameliorate the damage, and Washington again found himself in the middle of a feud. Jefferson–Hamilton feud Jefferson and Hamilton adopted diametrically opposed political principles. Hamilton believed in a strong national government requiring a national bank and foreign loans to function, while Jefferson believed the states and the farm element should primarily direct the government; he also resented the idea of banks and foreign loans. To Washington's dismay, the two men persistently entered into disputes and infighting. Hamilton demanded that Jefferson resign if he could not support Washington, and Jefferson told Washington that Hamilton's fiscal system would lead to the overthrow of the Republic. Washington urged them to call a truce for the nation's sake, but they ignored him. Washington reversed his decision to retire after his first term to minimize party strife, but the feud continued after his re-election. Jefferson's political actions, his support of Freneau's National Gazette, and his attempt to undermine Hamilton nearly led Washington to dismiss him from the cabinet; Jefferson ultimately resigned his position in December 1793, and Washington forsook him from that time on. The feud led to the well-defined Federalist and Republican parties, and party affiliation became necessary for election to Congress by 1794.
Washington remained aloof from congressional attacks on Hamilton, but he did not publicly protect him, either. The Hamilton–Reynolds sex scandal opened Hamilton to disgrace, but Washington continued to hold him in "very high esteem" as the dominant force in establishing federal law and government. Whiskey Rebellion In March 1791, at Hamilton's urging, with support from Madison, Congress imposed an excise tax on distilled spirits to help curtail the national debt, which took effect in July. Grain farmers strongly protested in Pennsylvania's frontier districts; they argued that they were unrepresented and were shouldering too much of the debt, comparing their situation to excessive British taxation before the Revolutionary War. On August 2, Washington assembled his cabinet to discuss how to deal with the situation. Unlike Washington, who had reservations about using force, Hamilton had long waited for such a situation and was eager to suppress the rebellion by using federal authority and force. Not wanting to involve the federal government if possible, Washington called on Pennsylvania state officials to take the initiative, but they declined to take military action. On August 7, Washington issued his first proclamation for calling up state militias. After appealing for peace, he reminded the protestors that, unlike rule under the British crown, federal law was enacted by their own elected representatives. Threats and violence against tax collectors, however, escalated into defiance against federal authority in 1794 and gave rise to the Whiskey Rebellion. Washington issued a final proclamation on September 25, threatening the use of military force to no avail. The federal army was not up to the task, so Washington invoked the Militia Act of 1792 to summon state militias. Governors sent troops, initially commanded by Washington, who gave the command to Light-Horse Harry Lee to lead them into the rebellious districts.
They took 150 prisoners, and the remaining rebels dispersed without further fighting. Two of the prisoners were condemned to death, but Washington exercised his Constitutional authority for the first time and pardoned them. Washington's forceful action demonstrated that the new government could protect itself and its tax collectors. This represented the first use of federal military force against the states and citizens, and remains the only time an incumbent president has commanded troops in the field. Washington justified his action against "certain self-created societies", which he regarded as "subversive organizations" that threatened the national union. He did not dispute their right to protest, but he insisted that their dissent must not violate federal law. Congress agreed and extended their congratulations to him; only Madison and Jefferson expressed indifference. Foreign affairs The French Revolutionary Wars began in April 1792, and after Great Britain entered the war against France in 1793, Washington declared America's neutrality. The revolutionary government of France sent diplomat Citizen Genêt to America, and he was welcomed with great enthusiasm. He created a network of new Democratic-Republican Societies promoting France's interests, but Washington denounced them and demanded that the French recall Genêt. The National Assembly of France granted Washington honorary French citizenship on August 26, 1792, during the early stages of the French Revolution. Hamilton formulated the Jay Treaty to normalize trade relations with Great Britain while securing British withdrawal from their western forts, and also to resolve financial debts remaining from the Revolution. Chief Justice John Jay acted as Washington's negotiator and signed the treaty on November 19, 1794; critical Jeffersonians, however, supported France. Washington deliberated, then supported the treaty because it avoided war with Britain, but was disappointed that its provisions favored Britain.
He mobilized public opinion and secured ratification in the Senate but faced frequent public criticism. The British agreed to abandon their forts around the Great Lakes, and the United States modified the boundary with Canada. The government liquidated numerous pre-Revolutionary debts, and the British opened the British West Indies to American trade. The treaty secured peace with Britain and a decade of prosperous trade. Jefferson claimed that it angered France and "invited rather than avoided" war. Relations with France deteriorated afterward, leaving succeeding president John Adams with prospective war. James Monroe was the American Minister to France, but Washington recalled him for his opposition to the Treaty. The French refused to accept his replacement Charles Cotesworth Pinckney, and the French Directory declared the authority to seize American ships two days before Washington's term ended. Native American affairs Ron Chernow describes Washington as always trying to be even-handed in dealing with the Natives. He states that Washington hoped they would abandon their itinerant hunting life and adapt to fixed agricultural communities in the manner of white settlers. He also maintains that Washington never advocated outright confiscation of tribal land or the forcible removal of tribes and that he berated American settlers who abused natives, admitting that he held out no hope for pacific relations with the natives as long as "frontier settlers entertain the opinion that there is not the same crime (or indeed no crime at all) in killing a native as in killing a white man." By contrast, Colin G. Calloway writes that "Washington had a lifelong obsession with getting Indian land, either for himself or for his nation, and initiated policies and campaigns that had devastating effects in Indian country." "The growth of the nation," Calloway has stated, "demanded the dispossession of Indian people.
Washington hoped the process could be bloodless and that Indian people would give up their lands for a "fair" price and move away. But if Indians refused and resisted, as they often did, he felt he had no choice but to "extirpate" them and that the expeditions he sent to destroy Indian towns were therefore entirely justified." During the fall of 1789, Washington had to contend with the British refusing to evacuate their forts in the Northwest frontier and their concerted efforts to incite hostile Indian tribes to attack American settlers. The Northwest tribes under Miami chief Little Turtle allied with the British Army to resist American expansion, and killed 1,500 settlers between 1783 and 1790. As documented by Harless (2018), Washington declared that "The Government of the United States are determined that their Administration of Indian Affairs shall be directed entirely by the great principles of Justice and humanity", and provided that their land interests should be settled by treaty. The administration regarded powerful tribes as foreign nations, and Washington even smoked a peace pipe and drank wine with them at the Philadelphia presidential house. He made numerous attempts to conciliate them; he equated killing indigenous peoples with killing whites and sought to integrate them into European-American culture. Secretary of War Henry Knox also attempted to encourage agriculture among the tribes. In the Southwest, negotiations failed between federal commissioners and raiding Indian tribes seeking retribution. Washington invited Creek Chief Alexander McGillivray and 24 leading chiefs to New York to negotiate a treaty and treated them like foreign dignitaries. Knox and McGillivray concluded the Treaty of New York on August 7, 1790, in Federal Hall, which provided the tribes with agricultural supplies and McGillivray with the rank of brigadier general and a salary of $1,500.
In 1790, Washington sent Brigadier General Josiah Harmar to pacify the Northwest tribes, but Little Turtle routed him twice and forced him to withdraw. The Western Confederacy of tribes used guerrilla tactics and were an effective force against the sparsely manned American Army. Washington sent Major General Arthur St. Clair from Fort Washington on an expedition to restore peace in the territory in 1791. On November 4, St. Clair's forces were ambushed and soundly defeated by tribal forces with few survivors, despite Washington's warning of surprise attacks. Washington was outraged over what he viewed to be excessive Native American brutality and execution of captives, including women and children. St. Clair resigned his commission, and Washington replaced him with the Revolutionary War hero General Anthony Wayne. From 1792 to 1793, Wayne instructed his troops on Native American warfare tactics and instilled discipline which was lacking under St. Clair. In August 1794, Washington sent Wayne into tribal territory with authority to drive them out by burning their villages and crops in the Maumee Valley. On August 24, the American army under Wayne's leadership defeated the western confederacy at the Battle of Fallen Timbers, and the Treaty of Greenville in August 1795 opened up two-thirds of the Ohio Country for American settlement. Second term Originally, Washington had planned to retire after his first term, but many Americans could not imagine anyone else taking his place. After nearly four years as president, and dealing with the infighting in his own cabinet and with partisan critics, Washington showed little enthusiasm in running for a second term, and Martha also wanted him not to run. James Madison urged him not to retire, arguing that his absence would only allow the dangerous political rift in his cabinet and the House to worsen. Jefferson also pleaded with him not to retire and agreed to drop his attacks on Hamilton, or he would also retire if Washington did.
Hamilton maintained that Washington's absence would be "deplored as the greatest evil" to the country at this time. Washington's close nephew George Augustine Washington, his manager at Mount Vernon, was critically ill and had to be replaced, further increasing Washington's desire to retire and return to Mount Vernon. When the election of 1792 neared, Washington did not publicly announce his presidential candidacy. Still, he silently consented to run to prevent a further political-personal rift in his cabinet. The Electoral College unanimously elected him president on February 13, 1793, and John Adams as vice president by a vote of 77 to 50. Washington, with nominal fanfare, arrived alone at his inauguration in his carriage. Sworn into office by Associate Justice William Cushing on March 4, 1793, in the Senate Chamber of Congress Hall in Philadelphia, Washington gave a brief address and then immediately retired to his Philadelphia presidential house, weary of office and in poor health. On April 22, 1793, during the French Revolution, Washington issued his famous Neutrality Proclamation and was resolved to pursue "a conduct friendly and impartial toward the belligerent Powers" while he warned Americans not to intervene in the international conflict. Although Washington recognized France's revolutionary government, he would eventually ask that French minister to America Citizen Genêt be recalled over the Citizen Genêt Affair. Genêt was a diplomatic troublemaker who was openly hostile toward Washington's neutrality policy. He procured four American ships as privateers to strike at Spanish forces (British allies) in Florida while organizing militias to strike at other British possessions. However, his efforts failed to draw America into the foreign campaigns during Washington's presidency. On July 31, 1793, Jefferson submitted his resignation from Washington's cabinet.
Washington signed the Naval Act of 1794 and commissioned the first six federal frigates to combat Barbary pirates. In January 1795, Hamilton, who desired more income for his family, resigned office and was replaced by Washington's appointee Oliver Wolcott, Jr. Washington and Hamilton remained friends. However, Washington's relationship with his Secretary of War Henry Knox deteriorated. Knox resigned office amid rumors that he had profited from construction contracts for the U.S. frigates. In the final months of his presidency, Washington was assailed by his political foes and a partisan press who accused him of being ambitious and greedy, while he argued that he had taken no salary during the war and had risked his life in battle. He regarded the press as a disuniting, "diabolical" force of falsehoods, sentiments that he expressed in his Farewell Address. At the end of his second term, Washington retired for personal and political reasons, dismayed with personal attacks, and to ensure that a truly contested presidential election could be held. He did not feel bound to a two-term limit, but his retirement set a significant precedent. Washington is often credited with setting the principle of a two-term presidency, but it was Thomas Jefferson who first refused to run for a third term on political grounds. Farewell Address In 1796, Washington declined to run for a third term of office, believing his death in office would create an image of a lifetime appointment. The precedent of a two-term limit was created by his retirement from office. In May 1792, in anticipation of his retirement, Washington instructed James Madison to prepare a "valedictory address", an initial draft of which was entitled the "Farewell Address". In May 1796, Washington sent the manuscript to his Secretary of Treasury Alexander Hamilton who did an extensive rewrite, while Washington provided final edits.
On September 19, 1796, David Claypoole's American Daily Advertiser published the final version of the address. Washington stressed that national identity was paramount, while a united America would safeguard freedom and prosperity. He warned the nation of three imminent dangers: regionalism, partisanship, and foreign entanglements, and said the "name of AMERICAN, which belongs to you, in your national capacity, must always exalt the just pride of patriotism, more than any appellation derived from local discriminations." Washington called for men to move beyond partisanship for the common good, stressing that the United States must concentrate on its own interests. He warned against foreign alliances and their influence in domestic affairs, and bitter partisanship and the dangers of political parties. He counseled friendship and commerce with all nations, but advised against involvement in European wars. He stressed the importance of religion, asserting that "religion and morality are indispensable supports" in a republic. Washington's address favored Hamilton's Federalist ideology and economic policies. Washington closed the address by reflecting on his legacy. After initial publication, many Republicans, including Madison, criticized the Address and believed it was an anti-French campaign document. Madison believed Washington was strongly pro-British. Madison also was suspicious of who authored the Address. In 1839, Washington biographer Jared Sparks maintained that Washington's "...Farewell Address was printed and published with the laws, by order of the legislatures, as an evidence of the value they attached to its political precepts, and of their affection for its author." In 1972, Washington scholar James Flexner referred to the Farewell Address as receiving as much acclaim as Thomas Jefferson's Declaration of Independence and Abraham Lincoln's Gettysburg Address.
In 2010, historian Ron Chernow reported the Farewell Address proved to be one of the most influential statements on Republicanism. Post-presidency (1797–1799) Retirement Washington retired to Mount Vernon in March 1797 and devoted time to his plantations and other business interests, including his distillery. His plantation operations were only minimally profitable, and his lands in the west (Piedmont) were under Indian attacks and yielded little income, with the squatters there refusing to pay rent. He attempted to sell these but without success. He became an even more committed Federalist. He vocally supported the Alien and Sedition Acts and convinced Federalist John Marshall to run for Congress to weaken the Jeffersonian hold on Virginia. Washington grew restless in retirement, prompted by tensions with France, and he wrote to Secretary of War James McHenry offering to organize President Adams' army. In a continuation of the French Revolutionary Wars, French privateers began seizing American ships in 1798, and relations deteriorated with France and led to the "Quasi-War". Without consulting Washington, Adams nominated him for a lieutenant general commission on July 4, 1798, and the position of commander-in-chief of the armies. Washington chose to accept, replacing James Wilkinson, and he served as the commanding general from July 13, 1798, until his death 17 months later. He participated in planning for a provisional army, but he avoided involvement in details. In advising McHenry of potential officers for the army, he appeared to make a complete break with Jefferson's Democratic-Republicans: "you could as soon scrub the blackamoor white, as to change the principles of a profest Democrat; and that he will leave nothing unattempted to overturn the government of this country." Washington delegated the active leadership of the army to Hamilton, a major general. No army invaded the United States during this period, and Washington did not assume a field command. 
Washington was known to be rich because of the well-known "glorified façade of wealth and grandeur" at Mount Vernon, but nearly all his wealth was in the form of land and slaves rather than ready cash. To supplement his income, he erected a distillery for substantial whiskey production. Historians estimate that the estate was worth about $1 million in 1799 dollars. He bought land parcels to spur development around the new Federal City named in his honor, and he sold individual lots to middle-income investors rather than multiple lots to large investors, believing they would more likely commit to making improvements. Final days and death On December 12, 1799, Washington inspected his farms on horseback. He returned home late and had guests over for dinner. He had a sore throat the next day but was well enough to mark trees for cutting. That evening, he complained of chest congestion but was still cheerful. On Saturday, he awoke to an inflamed throat and difficulty breathing, so he ordered estate overseer George Rawlins to remove nearly a pint of his blood; bloodletting was a common practice of the time. His family summoned Doctors James Craik, Gustavus Richard Brown, and Elisha C. Dick. (Dr. William Thornton arrived some hours after Washington died.) Dr. Brown thought Washington had quinsy; Dr. Dick thought the condition was a more serious "violent inflammation of the throat". They continued the process of bloodletting to approximately five pints, and Washington's condition deteriorated further. Dr. Dick proposed a tracheotomy, but the others were not familiar with that procedure and therefore disapproved. Washington instructed Brown and Dick to leave the room, while he assured Craik, "Doctor, I die hard, but I am not afraid to go." Washington's death came more swiftly than expected. On his deathbed, he instructed his private secretary Tobias Lear to wait three days before his burial, out of fear of being entombed alive.
According to Lear, he died peacefully between 10 and 11 p.m. on December 14, 1799, with Martha seated at the foot of his bed. His last words were "'Tis well", from his conversation with Lear about his burial. He was 67. Congress immediately adjourned for the day upon news of Washington's death, and the Speaker's chair was shrouded in black the next morning. The funeral was held on December 18, 1799, four days after his death, at Mount Vernon, where his body was interred. Cavalry and foot soldiers led the procession, and six colonels served as the pallbearers. The Mount Vernon funeral service was restricted mostly to family and friends. Reverend Thomas Davis read the funeral service by the vault with a brief address, followed by a ceremony performed by various members of Washington's Masonic lodge in Alexandria, Virginia. Congress chose Light-Horse Harry Lee to deliver the eulogy. Word of his death traveled slowly; church bells rang in the cities, and many places of business closed. People worldwide admired Washington and were saddened by his death, and memorial processions were held in major cities of the United States. Martha wore a black mourning cape for one year, and she burned their correspondence to protect their privacy. Only five letters between the couple are known to have survived: two from Martha to George and three from him to her. The diagnosis of Washington's illness and the immediate cause of his death have been subjects of debate since the day he died. The published account of Drs. Craik and Brown stated that his symptoms had been consistent with cynanche trachealis (tracheal inflammation), a term of that period used to describe severe inflammation of the upper windpipe, including quinsy. Accusations have persisted since Washington's death concerning medical malpractice, with some believing he had been bled to death.
Various modern medical authors have speculated that he died from a severe case of epiglottitis complicated by the given treatments, most notably the massive blood loss, which almost certainly caused hypovolemic shock. Burial, net worth, and aftermath Washington was buried in the old Washington family vault at Mount Vernon, situated on a grassy slope overspread with willow, juniper, cypress, and chestnut trees. It contained the remains of his brother Lawrence and other family members, but the decrepit brick vault needed repair, prompting Washington to leave instructions in his will for the construction of a new vault. Washington's estate at the time of his death was worth an estimated $780,000 in 1799, approximately equivalent to $17.82 million in 2021. Washington's peak net worth was $587.0 million, including his 300 slaves. Washington held title to more than 65,000 acres of land in 37 different locations. In 1830, a disgruntled ex-employee of the estate attempted to steal what he thought was Washington's skull, prompting the construction of a more secure vault. The next year, the new vault was constructed at Mount Vernon to receive the remains of George and Martha and other relatives. In 1832, a joint Congressional committee debated moving his body from Mount Vernon to a crypt in the Capitol. The crypt had been built by architect Charles Bulfinch in the 1820s during the reconstruction of the burned-out Capitol, after the Burning of Washington by the British during the War of 1812. Southern opposition was intense, antagonized by an ever-growing rift between North and South; many were concerned that Washington's remains could end up on "a shore foreign to his native soil" if the country became divided, and Washington's remains stayed at Mount Vernon. On October 7, 1837, Washington's remains were placed, still in the original lead coffin, within a marble sarcophagus designed by William Strickland and constructed by John Struthers earlier that year. 
The sarcophagus was sealed and encased with planks, and an outer vault was constructed around it. The outer vault has the sarcophagi of both George and Martha Washington; the inner vault has the remains of other Washington family members and relatives. Personal life Washington was somewhat reserved in personality, but he generally had a strong presence among others. He made speeches and announcements when required, but he was not a noted orator or debater. He was taller than most of his contemporaries; accounts of his height vary from 6 feet to 6 feet 3.5 inches, he weighed between 210 and 220 pounds as an adult, and he was known for his great strength. He had grey-blue eyes and reddish-brown hair which he wore powdered in the fashion of the day. He had a rugged and dominating presence, which garnered respect from his peers. Washington bought William Lee on May 27, 1768, and Lee served as his valet for 20 years; Lee was the only slave freed immediately in Washington's will. Washington frequently suffered from severe tooth decay and ultimately lost all his teeth but one. He had several sets of false teeth, which he wore during his presidency, made using a variety of materials including both animal and human teeth, but wood was not used despite common lore. These dental problems left him in constant pain, for which he took laudanum. As a public figure, he relied upon the strict confidence of his dentist. Washington was a talented equestrian early in life. He collected thoroughbreds at Mount Vernon, and his two favorite horses were Blueskin and Nelson. Fellow Virginian Thomas Jefferson said Washington was "the best horseman of his age and the most graceful figure that could be seen on horseback"; he also hunted foxes, deer, ducks, and other game. He was an excellent dancer and attended the theater frequently. He drank in moderation but was morally opposed to excessive drinking, smoking tobacco, gambling, and profanity. 
Religion and Freemasonry Washington was descended from Anglican minister Lawrence Washington (his great-great-grandfather), whose troubles with the Church of England may have prompted his heirs to emigrate to America. Washington was baptized as an infant in April 1732 and became a devoted member of the Church of England (the Anglican Church). He served more than 20 years as a vestryman and churchwarden for Fairfax Parish and Truro Parish, Virginia. He privately prayed and read the Bible daily, and he publicly encouraged people and the nation to pray. He may have taken communion on a regular basis prior to the Revolutionary War, but he did not do so following the war, for which he was admonished by Pastor James Abercrombie. Washington believed in a "wise, inscrutable, and irresistible" Creator God who was active in the Universe, contrary to deistic thought. He referred to God by the Enlightenment terms Providence, the Creator, or the Almighty, and also as the Divine Author or the Supreme Being. He believed in a divine power who watched over battlefields, was involved in the outcome of war, was protecting his life, and was involved in American politics—and specifically in the creation of the United States. Modern historian Ron Chernow has posited that Washington avoided evangelistic Christianity or hellfire-and-brimstone speech along with communion and anything inclined to "flaunt his religiosity". Chernow has also said Washington "never used his religion as a device for partisan purposes or in official undertakings". No mention of Jesus Christ appears in his private correspondence, and such references are rare in his public writings. He frequently quoted from the Bible or paraphrased it, and often referred to the Anglican Book of Common Prayer. There is debate on whether he is best classed as a Christian or a theistic rationalist—or both. Washington emphasized religious toleration in a nation with numerous denominations and religions. 
He publicly attended services of different Christian denominations and prohibited anti-Catholic celebrations in the Army. He engaged workers at Mount Vernon without regard for religious belief or affiliation. While president, he acknowledged major religious sects and gave speeches on religious toleration. He was distinctly rooted in the ideas, values, and modes of thinking of the Enlightenment, but he harbored no contempt of organized Christianity and its clergy, "being no bigot myself to any mode of worship". In 1793, speaking to members of the New Church in Baltimore, Washington proclaimed, "We have abundant reason to rejoice that in this Land the light of truth and reason has triumphed over the power of bigotry and superstition." Freemasonry was a widely accepted institution in the late 18th century, known for advocating moral teachings. Washington was attracted to the Masons' dedication to the Enlightenment principles of rationality, reason, and brotherhood. The American Masonic lodges did not share the anti-clerical perspective of the controversial European lodges. A Masonic lodge was established in Fredericksburg in September 1752, and Washington was initiated two months later at the age of 20 as one of its first Entered Apprentices. Within a year, he progressed through its ranks to become a Master Mason. Washington had high regard for the Masonic Order, but his personal lodge attendance was sporadic. In 1777, a convention of Virginia lodges asked him to be the Grand Master of the newly established Grand Lodge of Virginia, but he declined due to his commitments leading the Continental Army. After 1782, he frequently corresponded with Masonic lodges and members, and he was listed as Master in the Virginia charter of Alexandria Lodge No. 22 in 1788. Slavery In Washington's lifetime, slavery was deeply ingrained in the economic and social fabric of Virginia. Slavery was legal in all of the Thirteen Colonies prior to the American Revolutionary War. 
of this Family". He canceled all business activity and remained with Martha every night for three months. Opposition to British Parliament and Crown Washington played a central role before and during the American Revolution. His disdain for the British military had begun when he was passed over for promotion into the Regular Army. Opposed to taxes imposed by the British Parliament on the Colonies without proper representation, he and other colonists were also angered by the Royal Proclamation of 1763 which banned American settlement west of the Allegheny Mountains and protected the British fur trade. Washington believed the Stamp Act of 1765 was an "Act of Oppression", and he celebrated its repeal the following year. In March 1766, Parliament passed the Declaratory Act asserting that Parliamentary law superseded colonial law. In the late 1760s, the British Crown's interference in lucrative American western land speculation helped spur on the American Revolution. Washington himself was a prosperous land speculator, and in 1767, he encouraged "adventurers" to acquire backcountry western lands. Washington helped lead widespread protests against the Townshend Acts passed by Parliament in 1767, and he introduced a proposal in May 1769 drafted by George Mason which called on Virginians to boycott British goods; the Acts were mostly repealed in 1770. Parliament sought to punish Massachusetts colonists for their role in the Boston Tea Party in 1774 by passing the Coercive Acts, which Washington referred to as "an invasion of our rights and privileges". He said Americans must not submit to acts of tyranny since "custom and use shall make us as tame and abject slaves, as the blacks we rule over with such arbitrary sway". That July, he and George Mason drafted a list of resolutions for the Fairfax County committee which Washington chaired, and the committee adopted the Fairfax Resolves calling for a Continental Congress, and an end to the slave trade. 
On August 1, Washington attended the First Virginia Convention, where he was selected as a delegate to the First Continental Congress, September 5 to October 26, 1774, which he also attended. As tensions rose in 1774, he helped train county militias in Virginia and organized enforcement of the Continental Association boycott of British goods instituted by the Congress. The American Revolutionary War began on April 19, 1775, with the Battles of Lexington and Concord and the Siege of Boston. The colonists were divided over breaking away from British rule and split into two factions: Patriots who rejected British rule, and Loyalists who desired to remain subject to the King. General Thomas Gage was commander of British forces in America at the beginning of the war. Upon hearing the shocking news of the onset of war, Washington was "sobered and dismayed", and he hastily departed Mount Vernon on May 4, 1775, to join the Second Continental Congress in Philadelphia. Commander in chief (1775–1783) Congress created the Continental Army on June 14, 1775, and Samuel and John Adams nominated Washington to become its commander-in-chief. Washington was chosen over John Hancock because of his military experience and the belief that a Virginian would better unite the colonies. He was considered an incisive leader who kept his "ambition in check". He was unanimously elected commander in chief by Congress the next day. Washington appeared before Congress in uniform and gave an acceptance speech on June 16, declining a salary—though he was later reimbursed expenses. He was commissioned on June 19 and was roundly praised by Congressional delegates, including John Adams, who proclaimed that he was the man best suited to lead and unite the colonies. Congress appointed Washington "General & Commander in chief of the army of the United Colonies and of all the forces raised or to be raised by them", and instructed him to take charge of the siege of Boston on June 22, 1775. 
Congress chose his primary staff officers, including Major General Artemas Ward, Adjutant General Horatio Gates, Major General Charles Lee, Major General Philip Schuyler, Major General Nathanael Greene, Colonel Henry Knox, and Colonel Alexander Hamilton. Washington was impressed by Colonel Benedict Arnold and gave him responsibility for launching an invasion of Canada. He also engaged French and Indian War compatriot Brigadier General Daniel Morgan. Henry Knox impressed Adams with ordnance knowledge, and Washington promoted him to colonel and chief of artillery. At the start of the war, Washington opposed the recruiting of blacks, both free and enslaved, into the Continental Army. After his appointment, Washington banned their enlistment. The British saw an opportunity to divide the colonies, and the colonial governor of Virginia issued a proclamation, which promised freedom to slaves if they joined the British. Desperate for manpower by late 1777, Washington relented and overturned his ban. By the end of the war, around one-tenth of Washington's army were blacks. Following the British surrender, Washington sought to enforce terms of the preliminary Treaty of Paris (1783) by reclaiming slaves freed by the British and returning them to servitude. He arranged to make this request to Sir Guy Carleton on May 6, 1783. Instead, Carleton issued 3,000 freedom certificates and all former slaves in New York City were able to leave before the city was evacuated by the British in late November 1783. After the war Washington became the target of accusations made by General Lee involving his alleged questionable conduct as Commander in Chief during the war that were published by patriot-printer William Goddard. Goddard in a letter of May 30, 1785, had informed Washington of Lee's request to publish his account and assured him that he "...took the liberty to suppress such expressions as appeared to be the ebullitions of a disappointed & irritated mind ...". 
Washington replied, telling Goddard to print what he saw fit, and to let "... the impartial & dispassionate world," draw their own conclusions. Siege of Boston Early in 1775, in response to the growing rebellious movement, London sent British troops, commanded by General Thomas Gage, to occupy Boston. They set up fortifications about the city, making it impervious to attack. Various local militias surrounded the city and effectively trapped the British, resulting in a standoff. As Washington headed for Boston, word of his march preceded him, and he was greeted everywhere; gradually, he became a symbol of the Patriot cause. Upon arrival on July 2, 1775, two weeks after the Patriot defeat at nearby Bunker Hill, he set up his Cambridge, Massachusetts headquarters and inspected the new army there, only to find an undisciplined and badly outfitted militia. After consultation, he initiated Benjamin Franklin's suggested reforms—drilling the soldiers and imposing strict discipline, floggings, and incarceration. Washington ordered his officers to identify the skills of recruits to ensure military effectiveness, while removing incompetent officers. He petitioned Gage, his former superior, to release captured Patriot officers from prison and treat them humanely. In October 1775, King George III declared that the colonies were in open rebellion and relieved General Gage of command for incompetence, replacing him with General William Howe. The Continental Army, further diminished by expiring short-term enlistments, and by January 1776 reduced by half to 9,600 men, had to be supplemented with the militia, and was joined by Knox with heavy artillery captured from Fort Ticonderoga. When the Charles River froze over, Washington was eager to cross and storm Boston, but General Gates and others were opposed to untrained militia striking well-garrisoned fortifications. 
Washington reluctantly agreed to secure the Dorchester Heights, 100 feet above Boston, in an attempt to force the British out of the city. On March 9, under cover of darkness, Washington's troops brought up Knox's big guns and bombarded British ships in Boston harbor. On March 17, 9,000 British troops and Loyalists began a chaotic ten-day evacuation of Boston aboard 120 ships. Soon after, Washington entered the city with 500 men, with explicit orders not to plunder the city. He ordered vaccinations against smallpox to great effect, as he did later in Morristown, New Jersey. He refrained from exerting military authority in Boston, leaving civilian matters in the hands of local authorities. Invasion of Quebec (1775) The Invasion of Quebec (June 1775 – October 1776, French: Invasion du Québec) was the first major military initiative by the newly formed Continental Army during the American Revolutionary War. On June 27, 1775, Congress authorized General Philip Schuyler to investigate, and, if it seemed appropriate, begin an invasion. Benedict Arnold, passed over for its command, went to Boston and convinced General George Washington to send a supporting force to Quebec City under his command. The objective of the campaign was to seize the Province of Quebec (part of modern-day Canada) from Great Britain, and persuade French-speaking Canadiens to join the revolution on the side of the Thirteen Colonies. One expedition left Fort Ticonderoga under Richard Montgomery, besieged and captured Fort St. Johns, and very nearly captured British General Guy Carleton when taking Montreal. The other expedition, under Benedict Arnold, left Cambridge, Massachusetts and traveled with great difficulty through the wilderness of Maine to Quebec City. The two forces joined there, but they were defeated at the Battle of Quebec in December 1775. 
Battle of Long Island Washington then proceeded to New York City, arriving on April 13, 1776, and began constructing fortifications to thwart the expected British attack. He ordered his occupying forces to treat civilians and their property with respect, to avoid the abuses which Bostonian citizens suffered at the hands of British troops during their occupation. A plot to assassinate or capture him was discovered and thwarted, resulting in the arrest of 98 people involved or complicit, 56 of whom were from Long Island's Kings (Brooklyn) and Queens counties, including the Loyalist mayor of New York, David Mathews. Washington's bodyguard, Thomas Hickey, was hanged for mutiny and sedition. General Howe transported his resupplied army, with the British fleet, from Halifax to New York, knowing the city was key to securing the continent. George Germain, who ran the British war effort in England, believed it could be won with one "decisive blow". The British forces, including more than a hundred ships and thousands of troops, began arriving on Staten Island on July 2 to lay siege to the city. After the Declaration of Independence was adopted on July 4, Washington informed his troops in his general orders of July 9 that Congress had declared the united colonies to be "free and independent states". Howe's troop strength totaled 32,000 regulars and Hessian auxiliaries, and Washington's consisted of 23,000, mostly raw recruits and militia. In August, Howe landed 20,000 troops at Gravesend, Brooklyn, and approached Washington's fortifications, as George III proclaimed the rebellious American colonists to be traitors. Washington, opposing his generals, chose to fight, based upon inaccurate information that Howe's army had only 8,000-plus troops. In the Battle of Long Island, Howe assaulted Washington's flank and inflicted 1,500 Patriot casualties, the British suffering 400. Washington retreated, instructing General William Heath to requisition river craft in the area. 
On August 30, General William Alexander held off the British and gave cover while the army crossed the East River under darkness to Manhattan Island without loss of life or materiel, although Alexander was captured. Howe, emboldened by his Long Island victory, sent a letter addressed to "George Washington, Esq." in a futile attempt to negotiate peace. Washington declined, demanding to be addressed with diplomatic protocol, as general and fellow belligerent, not as a "rebel", lest his men be hanged as such if captured. The Royal Navy bombarded the unstable earthworks on lower Manhattan Island. Washington, with misgivings, heeded the advice of Generals Greene and Putnam to defend Fort Washington. They were unable to hold it, and Washington abandoned it despite General Lee's objections, as his army retired north to White Plains. Howe's pursuit forced Washington to retreat across the Hudson River to Fort Lee to avoid encirclement. Howe landed his troops on Manhattan in November and captured Fort Washington, inflicting high casualties on the Americans. Washington was responsible for delaying the retreat, though he blamed Congress and General Greene. Loyalists in New York considered Howe a liberator and spread a rumor that Washington had set fire to the city. Patriot morale reached its lowest when Lee was captured. Now reduced to 5,400 troops, Washington's army retreated through New Jersey, and Howe broke off pursuit, delaying his advance on Philadelphia, and set up winter quarters in New York. Crossing the Delaware, Trenton, and Princeton Washington crossed the Delaware River into Pennsylvania, where Lee's replacement John Sullivan joined him with 2,000 more troops. The future of the Continental Army was in doubt for lack of supplies, a harsh winter, expiring enlistments, and desertions. Washington was disappointed that many New Jersey residents were Loyalists or skeptical about the prospect of independence. 
Howe split up his British Army and posted a Hessian garrison at Trenton to hold western New Jersey and the east shore of the Delaware, but the army appeared complacent, and Washington and his generals devised a surprise attack on the Hessians at Trenton, which he codenamed "Victory or Death". The army was to cross the Delaware River to Trenton in three divisions: one led by Washington (2,400 troops), another by General James Ewing (700), and the third by Colonel John Cadwalader (1,500). The force was to then split, with Washington taking the Pennington Road and General Sullivan traveling south on the river's edge. Washington first ordered a 60-mile search for Durham boats to transport his army, and he ordered the destruction of vessels that could be used by the British. Washington crossed the Delaware River on Christmas night, December 25, 1776, while he personally risked capture staking out the Jersey shoreline. His men followed across the ice-obstructed river in sleet and snow from McConkey's Ferry, with 40 men per vessel. The wind churned up the waters, and they were pelted with hail, but by 3:00 a.m. on December 26, they made it across with no losses. Henry Knox was delayed, managing frightened horses and about 18 field guns on flat-bottomed ferries. Cadwalader and Ewing failed to cross due to the ice and heavy currents, and Washington, still waiting, began to doubt his planned attack on Trenton. Once Knox arrived, Washington proceeded to Trenton to take only his troops against the Hessians, rather than risk being spotted returning his army to Pennsylvania. The troops spotted Hessian positions a mile from Trenton, so Washington split his force into two columns, rallying his men: "Soldiers keep by your officers. For God's sake, keep by your officers." The two columns were separated at the Birmingham crossroads. General Nathanael Greene's column took the upper Ferry Road, led by Washington, and General John Sullivan's column advanced on River Road. 
The Americans marched in sleet and snowfall. Many were shoeless with bloodied feet, and two died of exposure. At sunrise, Washington led them in a surprise attack on the Hessians, aided by Major General Knox and artillery. The Hessians had 22 killed (including Colonel Johann Rall), 83 wounded, and 850 captured with supplies. Washington retreated across the Delaware River to Pennsylvania and returned to New Jersey on January 3, 1777, launching an attack on British regulars at Princeton, with 40 Americans killed or wounded and 273 British killed or captured. American Generals Hugh Mercer and John Cadwalader were being driven back by the British when Mercer was mortally wounded, then Washington arrived and led the men in a counterattack which advanced nearly to the British line. Some British troops retreated after a brief stand, while others took refuge in Nassau Hall, which became the target of Colonel Alexander Hamilton's cannons. Washington's troops charged, the British surrendered in less than an hour, and 194 soldiers laid down their arms. Howe retreated to New York City where his army remained inactive until early the next year. Washington's depleted Continental Army took up winter headquarters in Morristown, New Jersey while disrupting British supply lines and expelling them from parts of New Jersey. Washington later said the British could have successfully counterattacked his encampment before his troops were dug in. The victories at Trenton and Princeton by Washington revived Patriot morale and changed the course of the war. The British still controlled New York, and many Patriot soldiers did not re-enlist or deserted after the harsh winter campaign. Congress instituted greater rewards for re-enlisting and punishments for desertion to effect greater troop numbers. Strategically, Washington's victories were pivotal for the Revolution and quashed the British strategy of showing overwhelming force followed by offering generous terms. 
In February 1777, word reached London of the American victories at Trenton and Princeton, and the British realized the Patriots were in a position to demand unconditional independence. Brandywine, Germantown, and Saratoga In July 1777, British General John Burgoyne led the Saratoga campaign south from Quebec through Lake Champlain and recaptured Fort Ticonderoga, intending to isolate New England by gaining control of the Hudson River. However, General Howe in British-occupied New York blundered, taking his army south to Philadelphia rather than up the Hudson River to join Burgoyne near Albany. Meanwhile, Washington and Gilbert du Motier, Marquis de Lafayette rushed to Philadelphia to engage Howe and were shocked to learn of Burgoyne's progress in upstate New York, where the Patriots were led by General Philip Schuyler and successor Horatio Gates. Washington's army of less experienced men was defeated in the pitched battles at Philadelphia. Howe outmaneuvered Washington at the Battle of Brandywine on September 11, 1777, and marched unopposed into the nation's capital at Philadelphia. A Patriot attack failed against the British at Germantown in October. Major General Thomas Conway prompted some members of Congress (referred to as the Conway Cabal) to consider removing Washington from command because of the losses incurred at Philadelphia. Washington's supporters resisted, and the matter was finally dropped after much deliberation. Once the plot was exposed, Conway wrote an apology to Washington, resigned, and returned to France. Washington was concerned with Howe's movements during the Saratoga campaign to the north, and he was also aware that Burgoyne was moving south toward Saratoga from Quebec. Washington took some risks to support Gates' army, sending reinforcements north with Generals Benedict Arnold, his most aggressive field commander, and Benjamin Lincoln. On October 7, 1777, Burgoyne tried to take Bemis Heights but was isolated from support by Howe. 
He was forced to retreat to Saratoga and ultimately surrendered after the Battles of Saratoga. As Washington suspected, Gates' victory emboldened his critics. Biographer John Alden maintains, "It was inevitable that the defeats of Washington's forces and the concurrent victory of the forces in upper New York should be compared." The admiration for Washington was waning, including little credit from John Adams. British commander Howe resigned in May 1778, left America forever, and was replaced by Sir Henry Clinton. Valley Forge and Monmouth Washington's army of 11,000 went into winter quarters at Valley Forge north of Philadelphia in December 1777. They suffered between 2,000 and 3,000 deaths in the extreme cold over six months, mostly from disease and lack of food, clothing, and shelter. Meanwhile, the British were comfortably quartered in Philadelphia, paying for supplies in pounds sterling, while Washington struggled with a devalued American paper currency. The woodlands were soon exhausted of game, and by February, lowered morale and increased desertions ensued. Washington made repeated petitions to the Continental Congress for provisions. He received a congressional delegation to check the Army's conditions and expressed the urgency of the situation, proclaiming: "Something must be done. Important alterations must be made." He recommended that Congress expedite supplies, and Congress agreed to strengthen and fund the army's supply lines by reorganizing the commissary department. By late February, supplies began arriving. Baron Friedrich Wilhelm von Steuben's incessant drilling soon transformed Washington's recruits into a disciplined fighting force, and the revitalized army emerged from Valley Forge early the following year. Washington promoted von Steuben to Major General and made him chief of staff. In early 1778, the French responded to Burgoyne's defeat and entered into a Treaty of Alliance with the Americans. 
The Continental Congress ratified the treaty in May, which amounted to a French declaration of war against Britain. The British evacuated Philadelphia for New York that June and Washington summoned a war council of American and French Generals. He chose a partial attack on the retreating British at the Battle of Monmouth; the British were commanded by Howe's successor General Henry Clinton. Generals Charles Lee and Lafayette moved with 4,000 men, without Washington's knowledge, and bungled their first attack on June 28. Washington relieved Lee and achieved a draw after an expansive battle. At nightfall, the British continued their retreat to New York, and Washington moved his army outside the city. Monmouth was Washington's last battle in the North; he valued the safety of his army more than towns with little value to the British. West Point espionage Washington became "America's first spymaster" by designing an espionage system against the British. In 1778, Major Benjamin Tallmadge formed the Culper Ring at Washington's direction to covertly collect information about the British in New York. Washington had disregarded incidents of disloyalty by Benedict Arnold, who had distinguished himself in many battles. During mid-1780, Arnold began supplying British spymaster John André with sensitive information intended to compromise Washington and capture West Point, a key American defensive position on the Hudson River. Historians have noted as possible reasons for Arnold's treachery his anger at losing promotions to junior officers, or repeated slights from Congress. He was also deeply in debt, profiteering from the war, and disappointed by Washington's lack of support during his eventual court-martial. Arnold repeatedly asked for command of West Point, and Washington finally agreed in August. Arnold met André on September 21, giving him plans to take over the garrison. Militia forces captured André and discovered the plans, but Arnold escaped to New York. 
Washington recalled the commanders positioned under Arnold at key points around the fort to prevent any complicity, but he did not suspect Arnold's wife Peggy. Washington assumed personal command at West Point and reorganized its defenses. André's trial for espionage ended in a death sentence, and Washington offered to return him to the British in exchange for Arnold, but Clinton refused. André was hanged on October 2, 1780; his last request, to face a firing squad instead, was denied in order to deter other spies. Southern theater and Yorktown In late 1778, General Clinton shipped 3,000 troops from New York to Georgia and launched a Southern invasion against Savannah, reinforced by 2,000 British and Loyalist troops. They repelled an attack by Patriots and French naval forces, which bolstered the British war effort. In mid-1779, Washington attacked Iroquois warriors of the Six Nations to force Britain's Indian allies out of New York, from which they had assaulted New England towns. In response, Indian warriors joined with Loyalist rangers led by Walter Butler and killed more than 200 frontiersmen in June, laying waste to the Wyoming Valley in Pennsylvania. Washington retaliated by ordering General John Sullivan to lead an expedition to effect "the total destruction and devastation" of Iroquois villages and take their women and children hostage. Those who managed to escape fled to Canada. Washington's troops went into quarters at Morristown, New Jersey during the winter of 1779–1780 and suffered their worst winter of the war, with temperatures well below freezing. New York Harbor was frozen over, snow and ice covered the ground for weeks, and the troops again lacked provisions. Clinton assembled 12,500 troops and attacked Charleston, South Carolina in January 1780, defeating General Benjamin Lincoln who had only 5,100 Continental troops. The British went on to occupy the South Carolina Piedmont in June, with no Patriot resistance. 
Clinton returned to New York and left 8,000 troops commanded by General Charles Cornwallis. Congress replaced Lincoln with Horatio Gates; he failed in South Carolina and was replaced by Washington's choice of Nathanael Greene, but the British already had the South in their grasp. Washington was reinvigorated, however, when Lafayette returned from France with more ships, men, and supplies, and 5,000 veteran French troops led by Marshal Rochambeau arrived at Newport, Rhode Island in July 1780. French naval forces then landed, led by Admiral de Grasse, and Washington encouraged Rochambeau to move his fleet south to launch a joint land and naval attack on Arnold's troops. Washington's army went into winter quarters at New Windsor, New York in December 1780, and Washington urged Congress and state officials to expedite provisions in hopes that the army would not "continue to struggle under the same difficulties they have hitherto endured". On March 1, 1781, Congress ratified the Articles of Confederation, but the government that took effect on March 2 did not have the power to levy taxes, and it loosely held the states together. General Clinton sent Benedict Arnold, now a British brigadier general with 1,700 troops, to Virginia to capture Portsmouth and conduct raids on Patriot forces from there; Washington responded by sending Lafayette south to counter Arnold's efforts. Washington initially hoped to bring the fight to New York, drawing off British forces from Virginia and ending the war there, but Rochambeau advised Grasse that Cornwallis in Virginia was the better target. Grasse's fleet arrived off the Virginia coast, and Washington saw the advantage. He made a feint towards Clinton in New York, then headed south to Virginia.
The Siege of Yorktown was a decisive Allied victory by the combined forces of the Continental Army commanded by General Washington, the French Army commanded by General Comte de Rochambeau, and the French Navy commanded by Admiral de Grasse, in the defeat of Cornwallis' British forces. On August 19, the march to Yorktown led by Washington and Rochambeau began, which is known now as the "celebrated march". Washington was in command of an army of 7,800 Frenchmen, 3,100 militia, and 8,000 Continentals. Not well experienced in siege warfare, Washington often deferred to the judgment of General Rochambeau and used his advice about how to proceed; however, Rochambeau never challenged Washington's authority as the battle's commanding officer. By late September, Patriot-French forces surrounded Yorktown, trapped the British army, and prevented British reinforcements from Clinton in the North, while the French navy emerged victorious at the Battle of the Chesapeake. The final American offensive was begun with a shot fired by Washington. The siege ended with a British surrender on October 19, 1781; over 7,000 British soldiers were made prisoners of war, in the last major land battle of the American Revolutionary War. Washington negotiated the terms of surrender for two days, and the official signing ceremony took place on October 19; Cornwallis claimed illness and was absent, sending General Charles O'Hara as his proxy. As a gesture of goodwill, Washington held a dinner for the American, French, and British generals, all of whom fraternized on friendly terms and identified with one another as members of the same professional military caste. After the surrender at Yorktown, a situation developed that threatened relations between the newly independent America and Britain.
Following a series of retributive executions between Patriots and Loyalists, Washington, on May 18, 1782, wrote in a letter to General Moses Hazen that a British captain would be executed in retaliation for the execution of Joshua Huddy, a popular Patriot leader, who was hanged at the direction of the Loyalist Richard Lippincott. Washington wanted Lippincott himself to be executed but was rebuffed. Subsequently, Charles Asgill was chosen instead, by a drawing of lots from a hat. This was a violation of the 14th article of the Yorktown Articles of Capitulation, which protected prisoners of war from acts of retaliation. Later, Washington's feelings on the matter changed, and in a letter of November 13, 1782, to Asgill, he acknowledged Asgill's letter and situation, expressing his desire not to see any harm come to him. After much consideration between the Continental Congress, Alexander Hamilton, Washington, and appeals from the French Crown, Asgill was finally released, and Washington issued Asgill a pass that allowed his passage to New York.

Demobilization and resignation

When peace negotiations began in April 1782, both the British and French began gradually evacuating their forces. The American treasury was empty, and unpaid, mutinous soldiers forced the adjournment of Congress; Washington dispelled unrest by suppressing the Newburgh Conspiracy in March 1783, and Congress promised officers a five-year bonus. Washington submitted an account of $450,000 in expenses which he had advanced to the army. The account was settled, though it was allegedly vague about large sums and included expenses his wife had incurred through visits to his headquarters. The following month, a Congressional committee led by Alexander Hamilton began adapting the army for peacetime. In August 1783, Washington gave the Army's perspective to the committee in his Sentiments on a Peace Establishment.
He advised Congress to keep a standing army, create a "national militia" of separate state units, and establish a navy and a national military academy. The Treaty of Paris was signed on September 3, 1783, and Great Britain officially recognized the independence of the United States. Washington then disbanded his army, giving a farewell address to his soldiers on November 2. During this time, Washington oversaw the evacuation of British forces in New York and was greeted by parades and celebrations. There he announced that Colonel Henry Knox had been promoted to commander-in-chief. Washington and Governor George Clinton took formal possession of the city on November 25.

In early December 1783, Washington bade farewell to his officers at Fraunces Tavern and resigned as commander-in-chief soon thereafter, refuting Loyalist predictions that he would not relinquish his military command. In a final appearance in uniform, he gave a statement to the Congress: "I consider it an indispensable duty to close this last solemn act of my official life, by commending the interests of our dearest country to the protection of Almighty God, and those who have the superintendence of them, to his holy keeping." Washington's resignation was acclaimed at home and abroad and showed a skeptical world that the new republic would not degenerate into chaos. The same month, Washington was appointed president-general of the Society of the Cincinnati, a newly established hereditary fraternity of Revolutionary War officers. He served in this capacity for the remainder of his life.

Early republic (1783–1789)

Return to Mount Vernon

Washington longed to return home after spending just ten days at Mount Vernon out of years of war. He arrived on Christmas Eve, delighted to be "free of the bustle of a camp and the busy scenes of public life".
He was a celebrity and was fêted during a visit to his mother at Fredericksburg in February 1784, and he received a constant stream of visitors wishing to pay their respects to him at Mount Vernon. Washington reactivated his interests in the Great Dismal Swamp and Potomac canal projects begun before the war, though neither paid him any dividends, and he undertook a 34-day, 680-mile (1090 km) trip to check on his land holdings in the Ohio Country. He oversaw the completion of
of the region are (from west to east) Brownsville, Corpus Christi, Houston, Galveston, Beaumont, Lake Charles, Lafayette, Baton Rouge, New Orleans, Gulfport, Biloxi, Mobile, Pensacola, Navarre, St. Petersburg, and Tampa. All are the centers or major cities of their respective metropolitan areas, and many contain large ports.

Geography

The Gulf Coast is made of many inlets, bays, and lagoons. The coast is intersected by numerous rivers, the largest of which is the Mississippi River. Much of the land along the Gulf Coast is, or was, marshland. Ringing the Gulf Coast is the Gulf Coastal Plain, which reaches from Southern Texas to the western Florida Panhandle, while the western portions of the Gulf Coast are made up of many barrier islands and peninsulas, including Padre Island along the Texas coast. These landforms protect numerous bays and inlets, serving as a barrier to oncoming waves. The central part of the Gulf Coast, from eastern Texas through Louisiana, consists primarily of marshland. The eastern part of the Gulf Coast, predominantly Florida, is dotted with many bays and inlets.

Climate

The Gulf Coast climate is humid subtropical, although the southwestern tip of Florida, such as Everglades City, features a tropical climate. Much of the year is warm to hot along the Gulf Coast, while the three winter months bring periods of cool (or rarely, cold) weather mixed with mild temperatures. The area is highly vulnerable to hurricanes as well as floods and severe thunderstorms. Much of the Gulf Coast has a summer precipitation maximum, with July or August commonly the wettest month due to the combination of frequent summer thunderstorms produced by relentless heat and humidity, and tropical weather systems (tropical depressions, tropical storms and hurricanes), while winter and early spring rainfall also can be heavy. This pattern is evident in southern cities such as Houston, Texas; New Orleans, Louisiana; Mobile, Alabama; and Pensacola, Florida.
However, the central and southern Florida peninsula and South Texas have a pronounced winter dry season, as at Tampa and Fort Myers, Florida. On the central and southern Texas coast, winter, early spring and mid-summer are markedly drier, and September is the wettest month on average (as at Corpus Christi and Brownsville, Texas). Tornadoes are infrequent at the coast but do occur; they occur more frequently in inland portions of Gulf Coast states. Over most of the Gulf Coast from Houston, Texas eastward, extreme rainfall events are a significant threat, commonly from tropical weather systems, which can bring 4 to 10 or more inches of rain in a single day. In August 2017, Hurricane Harvey made landfall along the central Texas coast, then migrated to and stalled over the greater Houston area for several days, producing extreme, unprecedented rainfall totals of over 40 inches (1,000 mm) in many areas, unleashing widespread flooding.
Earthquakes are extremely rare in the area, but a surprising magnitude 6.0 earthquake in the Gulf of Mexico on September 10, 2006, could be felt from New Orleans to Tampa.

Economic activities

The Gulf Coast is a major center of economic activity. The marshlands along the Louisiana and Texas coasts provide breeding grounds and nurseries for ocean life that drive the fishing and shrimping industries. The Port of South Louisiana (Metropolitan New Orleans in Laplace) and the Port of Houston are two of the ten busiest ports in the world by cargo volume. As of 2004, seven of the top ten busiest ports in the U.S. were on the Gulf Coast. The discovery of oil and gas deposits along the coast and offshore, combined with easy access to shipping, has made the Gulf Coast the heart of the U.S. petrochemical industry. The coast contains nearly 4,000 oil platforms. The region also features other important industries, including aerospace and biomedical research, as well as older industries such as agriculture and — especially since the development of the Gulf Coast beginning in the 1920s and the increase in wealth throughout the United States — tourism.

History

Before European settlers arrived in the region, the Gulf Coast was home to several pre-Columbian kingdoms which had extensive trade networks with empires such as the Aztecs and the Mississippi Mound Builders. Shark and alligator teeth and shells from the Gulf have been found as far north as Ohio, in the mounds of the Hopewell culture. The first Europeans to settle the Gulf Coast were primarily the French and the Spanish. The Louisiana Purchase, Adams–Onís Treaty and the Texas Revolution made the Gulf Coast a part of the United States during the first half of the 19th century. As the U.S. population continued to expand its frontiers westward, the Gulf Coast was a natural magnet in the South, providing access to shipping lanes and both national and international commerce.
The development of sugar and cotton production (enabled by slavery) allowed the South to prosper. By the mid-19th century the city of New Orleans, situated as a key to commerce on the Mississippi River and in the Gulf, had become the largest U.S. city not on the Atlantic seaboard and the fourth largest in the U.S. overall. Two major events were turning points in the earlier history of the Gulf Coast region. The first was the American Civil War, which caused severe damage to some economic sectors in the South, including the Gulf Coast. The second event was the Galveston Hurricane of 1900. At the end of the 19th century Galveston was, with New Orleans, one of the most developed cities in the region. The city had the third-busiest port in the U.S. and its financial district was known as the "Wall Street of the South". The storm mostly destroyed the city, which has never regained its former glory, and set back development in the region. Since then the Gulf Coast has been hit by numerous other hurricanes.
ones, which matches observations. Astronomers do not currently know what process stops the contraction. In fact, theories of disk galaxy formation are not successful at producing the rotation speed and size of disk galaxies. It has been suggested that the radiation from bright newly formed stars, or from an active galactic nucleus, can slow the contraction of a forming disk. It has also been suggested that the dark matter halo can pull the galaxy, thus stopping disk contraction.

The Lambda-CDM model is a cosmological model that explains the formation of the universe after the Big Bang. It is a relatively simple model that predicts many properties observed in the universe, including the relative frequency of different galaxy types; however, it underestimates the number of thin disk galaxies in the universe. The reason is that these galaxy formation models predict a large number of mergers. If a disk galaxy merges with another galaxy of comparable mass (at least 15 percent of its mass), the merger will likely destroy, or at a minimum greatly disrupt, the disk, and the resulting galaxy is not expected to be a disk galaxy (see next section). While this remains an unsolved problem for astronomers, it does not necessarily mean that the Lambda-CDM model is completely wrong, but rather that it requires further refinement to accurately reproduce the population of galaxies in the universe.

Galaxy mergers and the formation of elliptical galaxies

Elliptical galaxies (such as IC 1101) are among the largest known thus far. Their stars are on orbits that are randomly oriented within the galaxy (i.e. they are not rotating like disk galaxies). A distinguishing feature of elliptical galaxies is that the velocity of the stars does not necessarily contribute to flattening of the galaxy, as it does in spiral galaxies. Elliptical galaxies have central supermassive black holes, and the masses of these black holes correlate with the galaxy's mass.
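The black hole–galaxy correlations mentioned here are usually expressed as simple power laws. As a rough sketch only — the normalization and slope vary between published fits, so the numbers below are illustrative rather than definitive:

```latex
% Approximate black hole--host galaxy scaling relations (illustrative values).
% M_BH: black hole mass; M_bulge: stellar mass of the bulge;
% sigma: stellar velocity dispersion.
M_{\mathrm{BH}} \sim 10^{-3}\, M_{\mathrm{bulge}},
\qquad
M_{\mathrm{BH}} \propto \sigma^{\alpha}, \quad \alpha \approx 4\text{--}5
```

The second proportionality is the M–sigma relation discussed in the next paragraph; measuring sigma for a galaxy therefore gives an estimate of its central black hole mass.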
Elliptical galaxies have two main stages of evolution. The first is due to the supermassive black hole growing by accreting cooling gas. The second stage is marked by the black hole stabilizing by suppressing gas cooling, thus leaving the elliptical galaxy in a stable state. The mass of the black hole is also correlated to a property called sigma which is the dispersion of the velocities of stars in their orbits. This relationship, known as the M-sigma relation, was discovered in 2000. Elliptical galaxies mostly lack disks, although some bulges of disk galaxies resemble elliptical galaxies. Elliptical galaxies are more likely found in crowded regions of the universe (such as galaxy clusters). Astronomers now see elliptical galaxies as some of the most evolved systems in the universe. It is widely accepted that the main driving force for the evolution of elliptical galaxies is mergers of smaller galaxies. Many galaxies in the universe are gravitationally bound to other galaxies, which means that they will never escape their mutual pull. If the galaxies are of similar size, the resultant galaxy will appear similar to neither of the progenitors, but will instead be elliptical. There are many types of galaxy mergers, which do not necessarily result in elliptical galaxies, but result in a structural change. For example, a minor merger event is thought to be occurring between the Milky Way and the Magellanic Clouds. Mergers between such large galaxies are regarded as violent, and the frictional interaction of the gas between the two galaxies can cause gravitational shock waves, which are capable of forming new stars in the new elliptical galaxy. By sequencing several images of different galactic collisions, one can observe the timeline of two spiral galaxies merging into a single elliptical galaxy. In the Local Group, the Milky Way and the Andromeda Galaxy are gravitationally bound, and currently approaching each other at high speed. 
Simulations show that the Milky Way and Andromeda are on a collision course, and are expected to collide in less than five billion years. During this collision, it is expected that the Sun and the rest of the Solar System will be ejected from its current path around the Milky Way. The remnant could be a giant elliptical galaxy.

Galaxy quenching

One observation (see above) that must be explained by a successful theory of galaxy evolution is the existence of two different populations of galaxies on the galaxy color-magnitude diagram. Most galaxies tend to fall into two separate locations on this diagram: a "red sequence" and a "blue cloud". Red sequence galaxies are generally non-star-forming elliptical galaxies with little gas and dust, while blue cloud galaxies tend to be dusty star-forming spiral galaxies. As described in previous sections, galaxies tend to evolve from spiral to elliptical structure via mergers. However, the current rate of galaxy mergers does not explain how all galaxies move from the "blue cloud" to the "red sequence". It also does not explain how star formation ceases in galaxies. Theories of galaxy evolution must therefore be able to explain how star formation turns off in galaxies. This phenomenon is called galaxy "quenching". Stars form out of cold gas (see also the Kennicutt–Schmidt law), so a galaxy is quenched when it has no more cold gas. However, it is thought that quenching occurs relatively quickly (within 1 billion years), which is much shorter than the time it would take for a galaxy to simply use up its reservoir of cold gas. Galaxy evolution models explain this by hypothesizing other physical mechanisms that remove or shut off the supply of cold gas in a galaxy. These mechanisms can be broadly classified into two categories: (1) preventive feedback mechanisms that stop cold gas from entering a galaxy or stop it from producing stars, and (2) ejective feedback mechanisms that remove gas so that it cannot form
stars. One theorized preventive mechanism called “strangulation” keeps cold gas from entering the galaxy. Strangulation is likely the main mechanism for quenching star formation in nearby low-mass galaxies. The exact physical explanation for strangulation is still unknown, but it may have to do with a galaxy's
they're not into preening or pampering, and they just might not give much of a hoot what others think of them. Or whether others think of them at all." Furthermore, guides regarding managing multiple generations in the workforce describe Gen Xers as: independent, resilient, resourceful, self-managing, adaptable, cynical, pragmatic, skeptical of authority, and as seeking a work-life balance.

Entrepreneurship as an individual trait

Individualism is one of the defining traits of Generation X, and is reflected in their entrepreneurial spirit. In the 2008 book X Saves the World: How Generation X Got the Shaft but Can Still Keep Everything from Sucking, author Jeff Gordinier describes Generation X as a "dark horse demographic" which "doesn't seek the limelight". Gordinier cites examples of Gen Xers' contributions to society such as Google, Wikipedia, Amazon.com, and YouTube, arguing that if boomers had created them, "we'd never hear the end of it". In the book, Gordinier contrasts Gen Xers to baby boomers, saying boomers tend to trumpet their accomplishments more than Gen Xers do, creating what he describes as "elaborate mythologies" around their achievements. Gordinier cites Steve Jobs as an example, while Gen Xers, he argues, are more likely to "just quietly do their thing". In a 2007 article published in the Harvard Business Review, authors Strauss and Howe wrote of Generation X: "They are already the greatest entrepreneurial generation in U.S. history; their high-tech savvy and marketplace resilience have helped America prosper in the era of globalization." According to authors Michael Hais and Morley Winograd:

Small businesses and the entrepreneurial spirit that Gen Xers embody have become one of the most popular institutions in America. There's been a recent shift in consumer behavior and Gen Xers will join the "idealist generation" in encouraging the celebration of individual effort and business risk-taking.
As a result, Xers will spark a renaissance of entrepreneurship in economic life, even as overall confidence in economic institutions declines. Customers, and their needs and wants (including Millennials), will become the North Star for an entire new generation of entrepreneurs.

A 2015 study by Sage Group reports Gen Xers "dominate the playing field" with respect to founding startups in the United States and Canada, with Xers launching the majority (55%) of all new businesses in 2015.

Income benefits of a college education

Unlike millennials, Generation X was the last generation in the U.S. for whom higher education was broadly financially remunerative. In 2019, the Federal Reserve Bank of St. Louis published research (using data from the 2016 Survey of Consumer Finances) demonstrating that, after controlling for race and age, cohort families with heads of household with post-secondary education born before 1980 have seen wealth and income premiums, while for those born after 1980, the wealth premium has weakened to a point of statistical insignificance (in part because of the rising cost of college). The income premium, while remaining positive, has declined to historic lows, with more pronounced downward trajectories among heads of household with postgraduate degrees.

Parenting and volunteering

In terms of advocating for their children in the educational setting, author Neil Howe describes Gen X parents as distinct from baby boomer parents. Howe argues that Gen Xers are not helicopter parents, which Howe describes as a parenting style of boomer parents of millennials. Howe described Gen Xers instead as "stealth fighter parents", due to the tendency of Gen X parents to let minor issues go and to not hover over their children in the educational setting, but to intervene forcefully and swiftly in the event of more serious issues. In 2012, the Corporation for National and Community Service ranked Gen X volunteer rates in the U.S.
at "29.4% per year", the highest compared with other generations. The rankings were based on a three-year moving average between 2009 and 2011.

Income differential with previous generations

A report titled Economic Mobility: Is the American Dream Alive and Well? focused on the income of males aged 30–39 in 2004 (those born April 1964 to March 1974). The study was released on 25 May 2007 and emphasized that this generation's men made less (by 12%) than their fathers had at the same age in 1974, thus reversing a historical trend. It concluded that per-year increases in household income generated by fathers/sons slowed from an average of 0.9% to 0.3%, barely keeping pace with inflation. "Family incomes have risen though (over the period 1947 to 2005) because more women have gone to work", "supporting the incomes of men, by adding a second earner to the family. And as with male income, the trend is downward."

Elsewhere

Although, globally, children and adolescents of Generation X will have been heavily influenced by U.S. cultural industries with shared global currents (e.g. rising divorce rates, the AIDS epidemic, advancements in ICT), there is not one U.S.-born and raised concept but multiple perspectives and geographical outgrowths. Even within the period of analysis, inside national communities, commonalities will have differed on the basis of one's birth date. The generation, Christine Henseler also remarks, was shaped as much by real-world events, within national borders, determined by specific political, cultural, and historical incidents. She adds, "In other words, it is in between both real, clearly bordered spaces and more fluid global currents that we can spot the spirit of Generation X."
In 2016, a global consumer insights project from Viacom International Media Networks and Viacom, based on over 12,000 respondents across 21 countries, reported on Gen X's unconventional approach to sex, friendship, and family, their desire for flexibility and fulfillment at work, and the absence of midlife crisis for Gen Xers. The project also included a 20-minute documentary titled Gen X Today. Russia In Russia, Generation Xers are referred to as "the last Soviet children", as the last children to come of age prior to the downfall of communism in their nation and prior to the dissolution of the Soviet Union. Those who reached adulthood in the 1980s and grew up educated in the doctrines of Marxism and Leninism found themselves against a background of economic and social change, with the advent of Mikhail Gorbachev to power and Perestroika. However, even before the collapse of the Soviet Union and the disbanding of the Communist Party of the Soviet Union, surveys demonstrated that Russian young people repudiated the key features of the Communist worldview that their party leaders, schoolteachers, and even parents had tried to instill in them. This generation, caught in the transition between Marxism–Leninism and an unknown future, and wooed by the new domestic political classes, remained largely apathetic. France In France, "Generation X" is not as widely known or used to define its members. Demographically, the term denotes those born from the beginning of the 1960s to the early 1980s. There is general agreement that, domestically, the events accepted in France as the separating point between the baby boomer generation and Generation X are the strikes and violent riots of May 1968, with members of the new generation having been too young to participate. Those at the start of the cohort are sometimes referred to as 'Génération Bof' because of their tendency to use the word 'bof', which, translated into English, means "whatever".
The generation is closely associated with the socialist François Mitterrand, who served as President of France for two consecutive terms between 1981 and 1995, as most transitioned into adulthood during that period. Economically, Xers entered working life just as the new labour market was emerging and were the first to fully experience the advent of the post-industrial society. For those at the tail-end of the generation, educational and defense reforms, a new-style baccalauréat général with three distinct streams in 1995 (the preceding programme had been introduced in 1968) and the cessation of military conscription in 1997 (for those born after January 1979), are considered the transition points to the next generation. Republic of Ireland The term "Generation X" is used to describe Irish people born between 1965 and 1985; they grew up during The Troubles and the 1980s economic recession, coming of age during the Celtic Tiger period of prosperity from the 1990s onward. The appropriateness of the term to Ireland has been questioned, with Darach Ó Séaghdha noting that "Generation X is usually contrasted with the one before by growing up in smaller and different family units on account of their parents having greater access to contraception and divorce – again, things that were not widely available in Ireland. [Contraception was only available under prescription in 1978 and without prescription in 1985; divorce was illegal until 1996.] However, this generation was in prime position to benefit from the Celtic Tiger, the Peace Process and liberalisations introduced on foot of EU membership and was less likely to emigrate than those that came before and after. You could say that in many ways, these are Ireland's real boomers." Culturally, Britpop, Celtic rock, the trad revival, Father Ted, the 1990 FIFA World Cup and rave culture were significant. The Divine Comedy song "Generation Sex" (1998) painted a picture of hedonism in the late 20th century, as well as its effect on the media.
David McWilliams' 2005 book The Pope's Children: Ireland's New Elite profiled Irish people born in the 1970s (just prior to the papal visit to Ireland), which was a baby boom that saw Ireland's population increase for the first time since the 1840s Great Famine. The Pope's Children were in position to benefit from the Celtic Tiger and the newly liberal culture, where the Catholic Church had significantly less social power. United Kingdom As children, adolescents and young adults Political environment The United Kingdom's Economic and Social Research Council described Generation X as "Thatcher's children" because the cohort grew up while Margaret Thatcher was Prime Minister from 1979 to 1990, "a time of social flux and transformation". Those born in the late 1960s and early 1970s grew up in a period of social unrest. While unemployment was low in the early 1970s, industrial and social unrest escalated. Strike action culminated in the "Winter of Discontent" in 1978–79, and the Troubles began to unfold in Northern Ireland. The turn to neoliberal policies introduced and maintained by consecutive conservative governments from 1979 to 1997 marked the end of the post-war consensus. Education The almost universal dismantling of the grammar school system in Great Britain during the 1960s and the 1970s meant that the vast majority of the cohort attended secondary modern schools, relabelled comprehensive schools. Compulsory education ended at the age of 16. As older members of the cohort reached the end of their mandatory schooling, levels of educational enrollment among older adolescents remained below much of the Western world. By the early 1980s, some 80% to 90% of school leavers in France and West Germany received vocational training, compared with 40% in the United Kingdom. By the mid-1980s, over 80% of pupils in the United States and West Germany and over 90% in Japan stayed in education until the age of eighteen, compared with 33% of British pupils. 
There was, however, broadly a rise in education levels among this age range as Generation X passed through it. In 1990, 25% of young people in England stayed in some kind of full-time education after the age of 18, an increase from 15% a decade earlier. Later, the Further and Higher Education Act 1992 and the liberalisation of higher education in the UK saw greater numbers of those born towards the tail-end of the generation gaining university places. Employment The 1980s, when much of Generation X reached working age, was an era defined by high unemployment rates. This was particularly true of the youngest members of the working-age population. In 1984, 26% of 16 to 24 year olds were neither in full-time education nor participating in the workforce. However, this figure decreased as the economic situation improved, reaching 17% by 1993. In midlife Generation X were far more likely to have children out of wedlock than their parents. The proportion of babies born to unmarried parents in England and Wales rose from 11% in 1979 to a quarter in 1998, 40% by 2002, and almost half in 2012. They were also significantly more likely to have children later in life than their predecessors. The average age of a mother giving birth rose from 27 in 1982 to 30 in 2012. That year saw 29,994 children born to mothers over the age of 40, an increase of 360% from 2002. A 2016 study of over 2,500 British office workers conducted by Workfront found that survey respondents of all ages selected those from Generation X as the hardest-working employees and members of the workforce (chosen by 60%). Gen X was also ranked highest among fellow workers for having the strongest work ethic (chosen by 59.5%), being the most helpful (55.4%), the most skilled (54.5%), and the best troubleshooters/problem-solvers (41.6%).
Political evolution Ipsos MORI reports that at the 1987 and 1992 general elections, the first United Kingdom general elections in which significant numbers of Generation X members could vote, a plurality of 18 to 24 year olds opted for the Labour Party by a small margin. The polling organisation's figures suggest that in 1987, 39% of that age group voted Labour, 37% Conservative and 22% for the SDP–Liberal Alliance. Five years later, these numbers were fairly similar at 38% Labour, 35% Conservative and 19% Liberal Democrat, a party by then formed from the previously mentioned alliance. Both these elections saw a fairly significant lead for the Conservatives in the popular vote among the general population. At the 1997 general election, where Labour won a large majority of seats and a comfortable lead in the popular vote, research suggests that voters under the age of 35 who turned out were more likely to vote Labour than the wider electorate, but significantly less likely to vote than in 1992. Analysts suggested this may have been due to fewer differences in policies between the major parties and young people having less of a sense of affiliation with particular political parties than older generations. A similar trend continued at the 2001 and 2005 general elections as turnout dropped further among both the relatively young and the wider public. Voter turnout across the electorate began to recover from a 2001 low through to the 2017 general election. Generation X also became more likely to vote as they entered the midlife age demographics. Polling suggests a plurality of their age group backed the Conservatives in 2010 and 2015, though less overwhelmingly than much of the older generation. At the 2016 EU membership referendum and 2017 general election, Generation X was split, with younger members appearing to back Remain and Labour and older members tending towards Leave and the Conservatives, in a British electorate more polarised by age than ever before.
At the 2019 general election, voting trends continued to be heavily divided by age, but a plurality of younger as well as older Generation X members (then 39 to 55 year olds) voted Conservative. Germany In Germany, "Generation X" is not widely used or applied. Instead, reference is made to "Generation Golf" in the former West Germany, after the book of that name by Florian Illies. In the east, the cohort is known as the children of the "Mauerfall", the coming down of the Berlin Wall. For former East Germans, there was adaptation, but also a sense of loss of accustomed values and structures, experiences which later fed romanticised narratives of their childhood. For those in the West, there was a period of discovery and exploration of what had been a forbidden land. South Africa In South Africa, Gen Xers spent their formative years of the 1980s during the "hyper-politicized environment of the final years of apartheid". Arts and culture Music Gen Xers were the first cohort to come of age with MTV. They were the first generation to experience the emergence of music videos as teenagers and are sometimes called the MTV Generation. Gen Xers were responsible for the alternative rock movement of the 1990s and 2000s, including the grunge subgenre. Hip hop has also been described as defining music of the generation, particularly artists such as Tupac Shakur, N.W.A., and The Notorious B.I.G. Punk rock From 1974 to 1976, a new generation of rock bands arose, such as the Ramones, Johnny Thunders and the Heartbreakers, The Dictators in New York City, the Sex Pistols, the Clash, the Damned, and Buzzcocks in the United Kingdom, and the Saints in Brisbane. By late 1976, these acts were generally recognized as forming the vanguard of "punk rock", and as 1977 approached, punk rock became a major and highly controversial cultural phenomenon in the UK.
It spawned a punk subculture which expressed a youthful rebellion, characterized by distinctive styles of clothing and adornment (ranging from deliberately offensive T-shirts, leather jackets, and studded or spiked bands and jewelry to bondage and S&M clothes) and a variety of anti-authoritarian ideologies that have since been associated with the form. By 1977 the influence of punk rock music and its subculture became more pervasive, spreading throughout various countries worldwide. It generally took root in local scenes that tended to reject affiliation with the mainstream. In the late 1970s, punk experienced its second wave, as acts that were not active during its formative years adopted the style. While at first punk musicians were not Gen Xers themselves (many of them were late boomers, or Generation Jones), the fanbase for punk became increasingly Gen X-oriented as the earliest Xers entered their adolescence, and it therefore made a significant imprint on the cohort. By the 1980s, faster and more aggressive subgenres such as hardcore punk (e.g. Minor Threat), street punk (e.g. the Exploited, NOFX) and anarcho-punk (e.g. Subhumans) became the predominant modes of punk rock. Musicians identifying with or inspired by punk often later pursued other musical directions, resulting in a broad range of spinoffs. This development gave rise to genres such as post-punk, new wave and later indie pop, alternative rock, and noise rock. Gen Xers were no longer simply the consumers of punk; they became the creators as well. By the 1990s, punk rock had re-emerged into the mainstream, as punk rock and pop punk bands with Gen X members such as Green Day, Rancid, The Offspring, and Blink-182 brought widespread popularity to the genre. Hard rock Arguably, in a manner similar to punk, a sense of disillusionment, angst and anger catalysed the growth of hard rock and heavy metal out of the earlier influence of rock.
Post-punk The energy generated by the punk movement launched a subsequent proliferation of weird and eclectic post-punk subcultures, spanning new wave, goth and others, and influencing the New Romantics. Grunge A notable example of alternative rock is grunge music and the associated subculture that developed in the Pacific Northwest of the U.S. Grunge song lyrics have been called the "...product of Generation X malaise". Vulture commented: "the best bands arose from the boredom of latchkey kids". "People made records entirely to please themselves because there was nobody else to please," commented producer Jack Endino. Grunge lyrics are typically dark, nihilistic, angst-filled, and anguished, often addressing themes such as social alienation, despair and apathy. The Guardian wrote that grunge "didn't recycle banal cliches but tackled weighty subjects". Topics of grunge lyrics included homelessness, suicide, rape, broken homes, drug addiction, self-loathing, misogyny, domestic abuse and finding "meaning in an indifferent universe". Grunge lyrics tended to be introspective and aimed to enable the listener to see into hidden personal issues and examine depravity in the world. Notable grunge bands include Nirvana, Pearl Jam, Alice in Chains, Stone Temple Pilots and Soundgarden. Hip hop The golden age of hip hop refers to hip hop music made from the mid-1980s to mid-1990s, typically by artists originating from the New York metropolitan area. The music style was characterized by its diversity, quality, innovation and influence after the genre's emergence and establishment in the previous decade. There were various types of subject matter, while the music was experimental and the sampling eclectic. The artists most often associated with the period are LL Cool J, Run–D.M.C., Public Enemy, the Beastie Boys, KRS-One, Eric B. & Rakim, De La Soul, Big Daddy Kane, EPMD, A Tribe Called Quest, Wu-Tang Clan, Slick Rick, Ultramagnetic MC's, and the Jungle Brothers.
Releases by these acts co-existed in this period with, and were as commercially viable as, those of early gangsta rap artists such as Ice-T, Geto Boys and N.W.A, the sex raps of 2 Live Crew and Too Short, and party-oriented music by acts such as Kid 'n Play, The Fat Boys, DJ Jazzy Jeff & The Fresh Prince and MC Hammer. In addition to lyrical self-glorification, hip hop was also used as a form of social protest. Lyrical content from the era often drew attention to a variety of social issues, including afrocentric living, drug use, crime and violence, religion, culture, the state of the American economy, and the modern man's struggle. Conscious and political hip hop tracks of the time were a response to the effects of American capitalism and President Reagan's conservative political economy. According to Tricia Rose, "In rap, relationships between black cultural practice, social and economic conditions, technology, sexual and racial politics, and the institutional policing of the popular terrain are complex and in constant motion". Even though hip hop was used as a mechanism to address social issues, the movement itself remained complex, with its own internal tensions. There was also often an emphasis on black nationalism. Hip hop artists often talked about urban poverty and the problems of alcohol, drugs, and gangs in their communities. Public Enemy's most influential song, "Fight the Power", came out at this time; the song speaks out to the government, proclaiming that people in the ghetto have freedom of speech and rights like every other American. Film Indie films Gen Xers were largely responsible for the "indie film" movement of the 1990s, both as young directors and in large part as the film audiences which were fueling demand for such films. In cinema, directors Kevin Smith, Quentin Tarantino, Sofia Coppola, John Singleton, Spike Jonze, David Fincher, Steven Soderbergh, and Richard Linklater have been called Generation X filmmakers.
Smith is most known for his View Askewniverse films, the flagship film being Clerks, which is set in New Jersey circa 1994 and focuses on two convenience-store clerks in their twenties. Linklater's Slacker similarly explores young adult characters who were interested in philosophizing. While not a member of Gen X himself, director John Hughes has been recognized as having created classic 1980s teen films with early Gen X characters which "an entire generation took ownership of", including The Breakfast Club, Sixteen Candles, Weird Science, and Ferris Bueller's Day Off. In France, a new movement emerged, the Cinéma du look, spearheaded by filmmakers Luc Besson, Jean-Jacques Beineix and Leos Carax. Although the directors were not Gen Xers themselves, films such as Subway (1985), 37°2 le matin (English: Betty Blue; 1986), and Mauvais Sang (1986) sought to capture on screen the generation's malaise, sense of entrapment, and desire to escape. Franchise mega sequels The birth of franchise mega-sequels in the science fiction, fantasy, and horror fiction genres, such as the epic space opera Star Wars and the Halloween franchise, had a profound cultural influence. Literature The literature of early Gen Xers is often dark and introspective. In the U.S., authors such as Elizabeth Wurtzel, David Foster Wallace,
totaled 14.3 million. In addition, unlike Boomers and previous generations, women outpaced men in college completion rates. Adjusting to a new societal environment For early Gen Xer graduates entering the job market at the end of the 1980s, economic conditions were challenging and did not show signs of major improvement until the mid-1990s. In the U.S., restrictive monetary policy to curb rising inflation and the collapse of a large number of savings and loan associations (private banks that specialized in home mortgages) impacted the welfare of many American households. This precipitated a large government bailout, which placed further strain on the budget. Furthermore, three decades of growth came to an end. The social contract between employers and employees, which had endured during the 1960s and 1970s and which workers had expected to last until retirement, was no longer applicable. By the late 1980s, there were large-scale layoffs of boomers, corporate downsizing, and accelerated offshoring of production. On the political front, in the U.S. the generation became ambivalent if not outright disaffected with politics. They had been reared in the shadow of the Vietnam War and the Watergate scandal. They came to maturity under the Reagan and George H. W. Bush presidencies, with first-hand experience of the impact of neoliberal policies. Few had experienced a Democratic administration, and even then only at an atmospheric level. For those on the left of the political spectrum, the disappointments with the previous boomer student mobilizations of the 1960s and the collapse of those movements towards a consumerist "greed is good" and "yuppie" culture during the 1980s felt, to a great extent, like hypocrisy if not outright betrayal. Hence the preoccupation with "authenticity" and with not "selling out".
The Revolutions of 1989 and the collapse of the socialist utopia with the fall of the Berlin Wall, moreover, added to the disillusionment, suggesting that no alternative to the capitalist model was possible. Birth of the slacker In 1990, Time magazine published an article titled "Living: Proceeding with Caution", which described those then in their 20s as aimless and unfocused. Media pundits and advertisers further struggled to define the cohort, typically portraying them as "unfocused twentysomethings". A MetLife report noted: "media would portray them as the Friends generation: rather self-involved and perhaps aimless...but fun". Gen Xers were often portrayed as apathetic or as "slackers" lacking bearings, a stereotype which was initially tied to Richard Linklater's comedic and essentially plotless 1991 film Slacker. After the film was released, "journalists and critics thought they put a finger on what was different about these young adults in that 'they were reluctant to grow up' and 'disdainful of earnest action'". Ben Stiller's 1994 film Reality Bites also sought to capture the zeitgeist of the generation with a portrayal of the attitudes and lifestyle choices of the time. Negative stereotypes of Gen X young adults continued, including that they were "bleak, cynical, and disaffected". In 1998, such stereotypes prompted sociological research at Stanford University to study the accuracy of the characterization of Gen X young adults as cynical and disaffected. Using the national General Social Survey, the researchers compared answers to identical survey questions asked of 18–29-year-olds in three different time periods. Additionally, they compared how older adults answered the same survey questions over time. The surveys showed 18–29-year-old Gen Xers did exhibit higher levels of cynicism and disaffection than previous cohorts of 18–29-year-olds surveyed.
However, they also found that cynicism and disaffection had increased among all age groups surveyed over time, not just young adults, making this a period effect, not a cohort effect. In other words, adults of all ages were more cynical and disaffected in the 1990s, not just Generation X. Rise of the Internet and the dot-com bubble By the mid-to-late 1990s, under Bill Clinton's presidency, economic optimism had returned to the U.S., with unemployment reduced from 7.5% in 1992 to 4% in 2000. Younger members of Gen X, straddling administrations, politically experienced a "liberal renewal". In 1997, Time magazine published an article titled "Generation X Reconsidered", which retracted the previously reported negative stereotypes and reported positive accomplishments. The article cited Gen Xers' tendency to found technology startup companies and small businesses, as well as their ambition, which research showed was higher among Gen X young adults than older generations. Yet, the slacker moniker stuck. As the decade progressed, Gen X gained a reputation for entrepreneurship. In 1999, The New York Times dubbed them "Generation 1099", describing them as the "once pitied but now envied group of self-employed workers whose income is reported to the Internal Revenue Service not on a W-2 form, but on Form 1099". Consumer access to the Internet and its commercial development throughout the 1990s witnessed a frenzy of IT initiatives. Newly created companies with dubious prospects of revenue generation or cash flow were launched on stock exchanges globally. When the dot-com bubble eventually burst in 2000, early Gen Xers who had embarked as entrepreneurs in the IT industry while riding the Internet wave, as well as newly qualified programmers at the tail-end of the generation (who had grown up with AOL and the first Web browsers), were both caught in the crash.
This had major repercussions, with cross-generational consequences; five years after the bubble burst, new enrollment of millennial undergraduates in IT fell by 40%, and by as much as 70% in some information systems programs. However, following the crisis, sociologist Mike Males reported continued confidence and optimism among the cohort. He reported "surveys consistently find 80% to 90% of Gen Xers self-confident and optimistic". Males wrote "these young Americans should finally get the recognition they deserve", praising the cohort and stating that "the permissively raised, universally deplored Generation X is the true 'great generation', for it has braved a hostile social climate to reverse abysmal trends". He described them as the hardest-working group since the World War II generation. He reported Gen Xers' entrepreneurial tendencies helped create the high-tech industry that fueled the 1990s economic recovery. In 2002, Time magazine published an article titled "Gen Xers Aren't Slackers After All", reporting that four out of five new businesses were the work of Gen Xers. Response to 9/11 In the U.S., Gen Xers were described as the major heroes of the September 11 terrorist attacks by author William Strauss. The firefighters and police responding to the attacks were predominantly from Generation X. Additionally, the leaders of the passenger revolt on United Airlines Flight 93 were also, by majority, Gen Xers. Author Neil Howe reported survey data which showed that Gen Xers were cohabiting and getting married in increasing numbers following the terrorist attacks. Gen X survey respondents reported that they no longer wanted to live alone. In October 2001, the Seattle Post-Intelligencer wrote of Gen Xers: "Now they could be facing the most formative events of their lives and their generation."
The Greensboro News & Record reported members of the cohort "felt a surge of patriotism since terrorists struck" by giving blood, working for charities, donating to charities, and by joining the military to fight the War on Terror. The Jury Expert, a publication of The American Society of Trial Consultants, reported: "Gen X members responded to the terrorist attacks with bursts of patriotism and national fervor that surprised even themselves." In midlife Achieving a work-life balance In 2011, survey analysis from the Longitudinal Study of American Youth found Gen Xers (defined as those who were then between the ages of 30 and 50) to be "balanced, active, and happy" in midlife and as achieving a work-life balance. The Longitudinal Study of American Youth is an NIH/NIA-funded study by the University of Michigan which has been studying Generation X since 1987. The study asked questions such as "Thinking about all aspects of your life, how happy are you? If zero means that you are very unhappy and 10 means that you are very happy, please rate your happiness." The study reported that "mean level of happiness was 7.5 and the median (middle score) was 8. Only four percent of Generation X adults indicated a great deal of unhappiness (a score of three or lower). Twenty-nine percent of Generation X adults were very happy with a score of 9 or 10 on the scale." In 2014, Pew Research provided further insight, describing the cohort as "savvy, skeptical and self-reliant; they're not into preening or pampering, and they just might not give much of a hoot what others think of them. Or whether others think of them at all." Furthermore, guides regarding managing multiple generations in the workforce describe Gen Xers as: independent, resilient, resourceful, self-managing, adaptable, cynical, pragmatic, skeptical of authority, and as seeking a work-life balance.
Entrepreneurship as an individual trait Individualism is one of the defining traits of Generation X, and is reflected in their entrepreneurial spirit. In the 2008 book X Saves the World: How Generation X Got the Shaft but Can Still Keep Everything from Sucking, author Jeff Gordinier describes Generation X as a "dark horse demographic" which "doesn't seek the limelight". Gordinier cites examples of Gen Xers' contributions to society such as: Google, Wikipedia, Amazon.com, and YouTube, arguing that if boomers had created them, "we'd never hear the end of it". In the book, Gordinier contrasts Gen Xers to baby boomers, saying boomers tend to trumpet their accomplishments more than Gen Xers do, creating what he describes as "elaborate mythologies" around their achievements. Gordinier cites Steve Jobs as an example, while Gen Xers, he argues, are more likely to "just quietly do their thing". In a 2007 article published in the Harvard Business Review, authors Strauss and Howe wrote of Generation X: "They are already the greatest entrepreneurial generation in U.S. history; their high-tech savvy and marketplace resilience have helped America prosper in the era of globalization." According to authors Michael Hais and Morley Winograd: Small businesses and the entrepreneurial spirit that Gen Xers embody have become one of the most popular institutions in America. There's been a recent shift in consumer behavior and Gen Xers will join the "idealist generation" in encouraging the celebration of individual effort and business risk-taking. As a result, Xers will spark a renaissance of entrepreneurship in economic life, even as overall confidence in economic institutions declines. Customers, and their needs and wants (including Millennials) will become the North Star for an entire new generation of entrepreneurs.
A 2015 study by Sage Group reports Gen Xers "dominate the playing field" with respect to founding startups in the United States and Canada, with Xers launching the majority (55%) of all new businesses in 2015. Income benefits of a college education Unlike millennials, Generation X was the last generation in the U.S. for whom higher education was broadly financially remunerative. In 2019, the Federal Reserve Bank of St. Louis published research (using data from the 2016 Survey of Consumer Finances) demonstrating that after controlling for race and age, cohort families with heads of household with post-secondary education and born before 1980 have seen wealth and income premiums, while, for those after 1980, the wealth premium has weakened to a point of statistical insignificance (in part because of the rising cost of college). The income premium, while remaining positive, has declined to historic lows, with more pronounced downward trajectories among heads of household with postgraduate degrees. Parenting and volunteering In terms of advocating for their children in the educational setting, author Neil Howe describes Gen X parents as distinct from baby boomer parents. Howe argues that Gen Xers are not helicopter parents, which Howe describes as a parenting style of boomer parents of millennials. Howe described Gen Xers instead as "stealth fighter parents", due to the tendency of Gen X parents to let minor issues go and to not hover over their children in the educational setting, but to intervene forcefully and swiftly in the event of more serious issues. In 2012, the Corporation for National and Community Service ranked Gen X volunteer rates in the U.S. at "29.4% per year", the highest compared with other generations. The rankings were based on a three-year moving average between 2009 and 2011. Income differential with previous generations A report titled Economic Mobility: Is the American Dream Alive and Well? 
focused on the income of males 30–39 in 2004 (those born April 1964March 1974). The study was released on 25 May 2007 and emphasized that this generation's men made less (by 12%) than their fathers had at the same age in 1974, thus reversing a historical trend. It concluded that, per year increases in household income generated by fathers/sons slowed from an average of 0.9% to 0.3%, barely keeping pace with inflation. "Family incomes have risen though (over the period 1947 to 2005) because more women have gone to work", "supporting the incomes of men, by adding a second earner to the family. And as with male income, the trend is downward." Elsewhere Although, globally, children and adolescents of Generation X will have been heavily influenced by U.S. cultural industries with shared global currents (e.g. rising divorce rates, the AIDS epidemic, advancements in ICT), there is not one U.S.-born raised concept but multiple perspectives and geographical outgrowths. Even within the period of analysis, inside national communities, commonalities will have differed on the basis of one's birth date. The generation, Christine Henseler also remarks, was shaped as much by real-world events, within national borders, determined by specific political, cultural, and historical incidents. She adds "In other words, it is in between both real, clearly bordered spaces and more fluid global currents that we can spot the spirit of Generation X." In 2016, a global consumer insights project from Viacom International Media Networks and Viacom, based on over 12,000 respondents across 21 countries, reported on Gen X's unconventional approach to sex, friendship, and family, their desire for flexibility and fulfillment at work and the absence of midlife crisis for Gen Xers. The project also included a 20 min documentary titled Gen X Today. 
Russia In Russia, Generation Xers are referred to as "the last Soviet children", the last to come of age before the downfall of communism in their nation and the dissolution of the Soviet Union. Those who reached adulthood in the 1980s, educated in the doctrines of Marxism and Leninism, found themselves against a background of economic and social change with the rise of Mikhail Gorbachev to power and perestroika. However, even before the collapse of the Soviet Union and the disbanding of the Communist Party of the Soviet Union, surveys demonstrated that Russian young people repudiated the key features of the Communist worldview that their party leaders, schoolteachers, and even parents had tried to instill in them. This generation, caught in the transition between Marxism–Leninism and an unknown future, and wooed by the new domestic political classes, remained largely apathetic. France In France, "Generation X" is not as widely known or used to define its members. Demographically, the label denotes those born from the beginning of the 1960s to the early 1980s. There is general agreement that, domestically, the event accepted as the separating point between the baby boomer generation and Generation X is the French strikes and violent riots of May 1968 with
Guam lies in the path of typhoons and it is common for the island to be threatened by tropical storms and possible typhoons during the wet season. The highest risk of typhoons is from August through November, when typhoons and tropical storms are most probable in the western Pacific. They can, however, occur year-round. Typhoons that have caused major damage on Guam in the American period include the Typhoon of 1900, Karen (1962), Pamela (1976), Paka (1997), and Pongsona (2002). Since Typhoon Pamela in 1976, wooden structures have been largely replaced by concrete structures. During the 1980s, wooden utility poles began to be replaced by typhoon-resistant concrete and steel poles. After the local government enforced stricter construction codes, many home and business owners built their structures out of reinforced concrete with installed typhoon shutters. Ecology Guam has experienced severe impacts from invasive species upon the island's natural biodiversity. These include the local extinction of endemic bird species after the introduction of the brown tree snake, an infestation of the Asiatic rhinoceros beetle destroying coconut palms, and the effects of introduced feral mammals and amphibians. Wildfires plague the forested areas of Guam every dry season despite the island's humid climate. Most fires are caused by humans, with 80% resulting from arson. Poachers often start fires to attract deer to the new growth. Invasive grass species that rely on fire as part of their natural life cycle grow in many regularly burned areas. Grasslands and "barrens" have replaced previously forested areas, leading to greater soil erosion. During the rainy season, sediment is carried by the heavy rains into the Fena Lake Reservoir and Ugum River, leading to water quality problems for southern Guam. Eroded silt also destroys the marine life in reefs around the island.
Soil stabilization efforts by volunteers and forestry workers (planting trees) have had little success in preserving natural habitats. Efforts have been made to protect Guam's coral reef habitats from pollution, eroded silt and overfishing, problems that have led to decreased fish populations. This has both ecological and economic value, as Guam is a significant vacation spot for scuba divers, and one study found that Guam's reefs are worth $127 million per year. In recent years, the Department of Agriculture, Division of Aquatic and Wildlife Resources has established several new marine preserves where fish populations are monitored by biologists. These are located at Pati Point, Piti Bomb Holes, Sasa Bay, Achang Reef Flat, and Tumon Bay. Before adopting U.S. Environmental Protection Agency standards, portions of Tumon Bay were dredged by the hotel chains to provide a better experience for hotel guests. Tumon Bay has since been made into a preserve. A federal Guam National Wildlife Refuge in northern Guam protects the decimated sea turtle population in addition to a small colony of Mariana fruit bats. Harvest of sea turtle eggs was a common occurrence on Guam before World War II. The green sea turtle (Chelonia mydas) was harvested legally on Guam before August 1978, when it was listed as threatened under the Endangered Species Act. The hawksbill sea turtle (Eretmochelys imbricata) has been on the endangered list since 1970. In an effort to ensure the protection of sea turtles on Guam, routine sightings are counted during aerial surveys and nest sites are recorded and monitored for hatchlings. Demographics According to the 2010 United States Census, the largest ethnic group is the native Chamorus, accounting for 37.3% of the total population. Asians (including Filipinos, Koreans, Chinese, and Japanese) account for 33% of the total population.
Other ethnic groups of Micronesia (including Chuukese, Palauans, and Pohnpeians) account for 10% of the total population. 9.4% of the population are multiracial (two or more races). White Americans account for 7.1% of the total population. The estimated interracial marriage rate is over 40%. The official languages of the island are English and Chamoru. Filipino is also a common language across the island. Other Pacific island languages and many Asian languages are spoken in Guam as well. Spanish, the language of administration for 300 years, is no longer commonly spoken on the island, although vestiges of the language remain in proper names, loanwords, and place names, and it is studied at universities and high schools. The most common religion is Catholicism. According to the Pew Research Center, the religious affiliations in 2010 were as follows:
Roman Catholicism: 75%
Protestantism: 17.7%
Other religions: 1.6%
Folk religions: 1.5%
Other Christianity: 1.4%
Buddhism: 1.1%
Eastern Orthodoxy: <1%
Hinduism: <1%
Islam: <1%
Judaism: <1%
Culture The culture of Guam is a reflection of traditional Chamoru customs in combination with American, Spanish and Mexican traditions. Post-European-contact Chamoru Guamanian culture is a combination of American, Spanish, Filipino, other Micronesian Islander and Mexican traditions. Few indigenous pre-Hispanic customs remained following Spanish contact. Hispanic influences are manifested in the local language, music, dance, sea navigation, cuisine, fishing, games (such as , , , and ), songs, and fashion. The island's original community is of Chamorro natives who have inhabited Guam for almost 4,000 years. They had their own language related to the languages of Indonesia and southeast Asia. The Spanish later called them Chamorros, a derivative of the word Chamorri, meaning "noble race". They began to grow rice on the island.
Historically, the native people of Guam venerated the bones of their ancestors, keeping the skulls in their houses in small baskets and practicing incantations before them when it was desired to attain certain objects. During Spanish rule (1668–1898) the majority of the population was converted to Catholicism, and religious festivities such as Easter and Christmas became widespread. Many Chamorus have Spanish surnames, although few of the inhabitants are themselves descended from the Spaniards. Instead, Spanish names and surnames became commonplace after the conversion to Catholicism and the imposition of the Catálogo alfabético de apellidos in Guam. Historically, the diet of the native inhabitants of Guam consisted of fish, fowl, rice, breadfruit, taro, yams, bananas, and coconuts used in a variety of dishes. Post-contact Chamoru cuisine is largely based on corn, and includes tortillas, tamales, atole, and chilaquiles, a clear influence from Mesoamerica, principally Mexico, through Spanish trade with Asia. Due to foreign cultural influence from Spain, most aspects of the early indigenous culture have been lost, though there has been a resurgence in preserving any remaining pre-Hispanic culture in the last few decades. Some scholars have traveled throughout the Pacific Islands conducting research to study what original Chamoru cultural practices such as dance, language, and canoe building may have been like. Sports Guam's most popular sport is American football, followed by basketball and baseball. Soccer and other sports are also somewhat popular. Guam hosted the Pacific Games in 1975 and 1999. At the 2007 Games, Guam finished 7th of 22 countries in the medal count, and 14th at the 2011 Games. The Guam men's national basketball team and the women's team are traditional powerhouses in the Oceania region, behind the Australia men's national basketball team and the New Zealand national basketball team.
, the men's team is the reigning champion of the Pacific Games Basketball Tournament. Guam is home to various basketball organizations, including the Guam Basketball Association. The Guam national football team was founded in 1975 and joined FIFA in 1996. Once considered one of FIFA's weakest teams, it earned its first victory over a FIFA-registered side in 2009. Guam hosted qualifying games on the island for the first time in 2015 and, in 2018, clinched its first FIFA World Cup qualifying win. The Guam national rugby union team played its first match in 2005 and has never qualified for a Rugby World Cup. Economy Guam's economy depends primarily on tourism, Department of Defense installations, and locally owned businesses. Under the provisions of a special law by Congress, it is Guam's treasury rather than the U.S. Treasury that receives the federal income taxes paid by local taxpayers (including military and civilian federal employees assigned to Guam). Tourism Lying in the western Pacific, Guam is a popular destination for Japanese tourists. Its tourist hub, Tumon, features over 20 large hotels, a Duty Free Shoppers Galleria, a Pleasure Island district, an indoor aquarium, Sandcastle Las Vegas–styled shows, and other shopping and entertainment venues. It is a relatively short flight from Asia or Australia compared to Hawaii, with hotels and seven public golf courses accommodating over a million tourists per year. Although 75% of the tourists are Japanese, Guam also receives a sizable number of tourists from South Korea, the U.S., the Philippines, and Taiwan. Significant sources of revenue include duty-free designer shopping outlets and the American-style malls: Micronesia Mall, Guam Premier Outlets, the Agana Shopping Center, and the world's largest Kmart. The economy had been stable since 2000 due to increased tourism. It was expected to stabilize with the transfer of U.S.
Marine Corps' 3rd Marine Expeditionary Force, currently in Okinawa, Japan (approximately 8,000 Marines, along with their 10,000 dependents), to Guam between 2010 and 2015. However, the move was delayed until late 2020, the number of Marines was reduced to 5,000, and the relocation was expected to be complete in 2025. In 2003, Guam had a 14% unemployment rate, and the government suffered a $314 million shortfall. As of 2019, the unemployment rate had dropped to 6.1%. By September 2020, however, the unemployment rate had risen again to 17.9%. The Compacts of Free Association between the United States, the Federated States of Micronesia, the Republic of the Marshall Islands, and the Republic of Palau accorded the former entities of the Trust Territory of the Pacific Islands a political status of "free association" with the United States. The Compacts generally allow citizens of these island nations to reside in the United States and its territories without restriction, and many were attracted to Guam because of its proximity and its environmental and cultural familiarity. Over the years, some in Guam have claimed that the territory has had to bear the brunt of this agreement, in the form of public assistance programs and public education for migrants from the regions involved, and that the federal government should compensate the states and territories affected by this migration. Congress has appropriated "Compact Impact" aid to Guam, the Northern Mariana Islands, and Hawaii, and eventually this appropriation was written into each renewed Compact. Some, however, continue to claim the compensation is not enough or that the distribution of actual compensation received is significantly disproportionate. Guam's largest single private sector employer, with about 1,400 jobs, was Continental Micronesia, a subsidiary of Continental Airlines; it is now a part of United Airlines, a subsidiary of Chicago-based United Airlines Holdings, Inc.
The Continental Micronesia annual payroll in Guam was $90 million. Military bases Currently, Joint Region Marianas maintains jurisdiction over installations which cover approximately , or 29% of the island's total land area. These include:
U.S. Naval Base Guam, U.S. Navy (Santa Rita), comprising the Orote Peninsula, additional lands, and with jurisdiction of the majority of Apra Harbor
Andersen Air Force Base, U.S. Air Force (Yigo), including Northwest Field
Marine Corps Base Camp Blaz, U.S. Marine Corps (Dededo)
Ordnance Annex, U.S. Navy – South Central Highlands (formerly known as Naval Magazine)
Naval Computer and Telecommunications Station Guam, U.S. Navy (Dededo), sometimes referred to as "NCTS Finegayan"
Naval Radio Station Barrigada (Barrigada), often referred to as "Radio Barrigada"
Joint Region Marianas Headquarters (Asan), at Nimitz Hill Annex
Naval Hospital Guam (Agana Heights)
South Finegayan (Dededo), a military housing complex
Andersen South (Yigo), formerly Marine Barracks Guam until its closure in 1992
Fort Juan Muña, Guam National Guard (Tamuning)
The U.S. military has proposed building a new aircraft carrier berth on Guam and moving 8,600 Marines, and 9,000 of their dependents, to Guam from Okinawa, Japan. Including the required construction workers, this buildup would increase Guam's population by a total of 79,000, a 49% increase over its 2010 population of 160,000. In a February 2010 letter, the United States Environmental Protection Agency sharply criticized these plans because of a water shortfall, sewage problems and the impact on coral reefs. By 2012, these plans had been cut to a maximum of 4,800 Marines stationed on the island, two-thirds of whom would be there on a rotational basis without their dependents. Government and politics Guam is governed by a popularly elected governor and a unicameral 15-member legislature, whose members are known as senators. Its judiciary is overseen by the Supreme Court of Guam.
The District Court of Guam is the court of United States federal jurisdiction in the territory. Guam elects one delegate to the United States House of Representatives, currently Democrat Michael San Nicolas. The delegate does not have a vote on the final passage of legislation, but is accorded a vote in committee and the privilege to speak to the House. U.S. citizens in Guam vote in a presidential straw poll for their choice in the U.S. presidential general election, but since Guam has no votes in the Electoral College, the poll has no real effect. However, in sending delegates to the Republican and Democratic national conventions, Guam does have influence in the national presidential race. These delegates are elected by local party conventions. Political status In the 1980s and early 1990s, there was a significant movement in favor of this U.S. territory becoming a commonwealth, which would give it a level of self-government similar to Puerto Rico and the Northern Mariana Islands. In a 1982 plebiscite, voters indicated interest in seeking commonwealth status. However, the federal government rejected the version of a commonwealth that the government of Guam proposed, because its clauses were incompatible with the Territorial Clause (Art. IV, Sec. 3, cl. 2) of the U.S. Constitution. Other movements advocate U.S. statehood for Guam, union with the state of Hawaii, union with the Northern Mariana Islands as a single territory, or independence. A Commission on Decolonization was established in 1997 to educate the people of Guam about the various political status options in its relationship with the U.S.: statehood, free association and independence. The island has been considering another non-binding plebiscite on decolonization since 1998; however, the commission was dormant for some years. In 2013, the commission began seeking funding to start a public education campaign. There were few subsequent developments until late 2016.
In early December 2016, the Commission scheduled a series of education sessions in various villages about the current status of Guam's relationship with the U.S. and the self-determination options that might
are American citizens but have no vote in United States presidential elections while residing on Guam, and Guam's delegate to the United States House of Representatives has no vote on the floor. Indigenous Guamanians are the Chamoru, historically known as the Chamorro, who are related to the Austronesian peoples of Indonesia, the Philippines, Taiwan, Micronesia, and Polynesia. As of 2021, Guam's population is 168,801. Chamorus are the largest ethnic group, but a minority on the multi-ethnic island. The territory spans and has a population density of . The Chamoru people settled the island approximately 3,500 years ago. Portuguese explorer Ferdinand Magellan, while in the service of Spain, was the first European to visit the island, on March 6, 1521. Guam was colonized by Spain in 1668. Between the 16th and 18th centuries, Guam was an important stopover for the Spanish Manila galleons. During the Spanish–American War, the United States captured Guam on June 21, 1898. Under the Treaty of Paris, signed December 10, 1898, Spain ceded Guam to the U.S. effective April 11, 1899. Before World War II, Guam was one of five American jurisdictions in the Pacific Ocean, along with Wake Island in Micronesia, American Samoa and Hawaii in Polynesia, and the Philippines. On December 8, 1941, hours after the attack on Pearl Harbor, Guam was captured by the Japanese, who occupied the island for two and a half years. During the occupation, Guamanians were subjected to forced labor, incarceration, torture and execution. American forces recaptured the island on July 21, 1944, which is commemorated as Liberation Day. Since the 1960s, Guam's economy has been supported primarily by tourism and the U.S. military, for which Guam is a major strategic asset. An unofficial but frequently used territorial motto is "Where America's Day Begins", which refers to the island's proximity to the International Date Line.
Guam is among the 17 non-self-governing territories listed by the United Nations, and has been a member of the Pacific Community since 1983. History Pre-Contact era Guam and the rest of the Mariana Islands were the first islands settled by humans in Remote Oceania. Theirs was also the first and the longest of the ocean-crossing voyages of the Austronesian peoples, separate from the later Polynesian settlement of the rest of Remote Oceania. The islands were first settled around 1500 to 1400 BC by migrants departing from the Philippines. This was followed by a second migration from the Caroline Islands by the first millennium AD, and a third migration from Island Southeast Asia (likely the Philippines or eastern Indonesia) by 900 AD. These original settlers of Guam and the Northern Mariana Islands evolved into the Chamoru people, historically known as Chamorros after first contact with the Spaniards. The ancient Chamoru society had four classes: (chiefs), (upper class), (middle class), and (lower class). The were located in the coastal villages, which meant they had the best access to fishing grounds, whereas the were located in the island's interior. and rarely communicated with each other, and often used as intermediaries. There were also "" or "", shamans with magical powers, and "'" or "", healers who used different kinds of plants and natural materials to make medicine. Belief in spirits of ancient Chamorus called "" still persists as a remnant of pre-European culture. It is believed that "" or "" are the only ones who can safely harvest plants and other natural materials from their homes or "" without incurring the wrath of the "." Their society was organized along matrilineal clans. The Chamoru people raised colonnades of megalithic capped pillars called latte, upon which they built their homes. Latte stones are stone pillars that are found only in the Mariana Islands; they are a recent development in Pre-Contact Chamoru society.
The latte stone was used as a foundation on which thatched huts were built. Latte stones consist of a base shaped from limestone, with a capstone made either from a large brain coral or limestone placed on top. A possible source for these stones, the Rota Latte Stone Quarry, was discovered in 1925 on Rota. Spanish era The first European to travel to Guam was Portuguese navigator Ferdinand Magellan, sailing for the King of Spain, who sighted the island on March 6, 1521, during his fleet's circumnavigation of the globe. Despite Magellan's visit, Guam was not officially claimed by Spain until January 26, 1565, by Miguel López de Legazpi. From 1565 to 1815, Guam and the Northern Mariana Islands, the only Spanish outposts in the Pacific Ocean east of the Philippines, were reprovisioning stops for the Manila galleons, a fleet that covered the Pacific trade route between Acapulco and Manila. Spanish colonization commenced on June 15, 1668, with the arrival of a mission led by Diego Luis de San Vitores, who established the first Catholic church. The islands were part of the Spanish East Indies, and in turn part of the Viceroyalty of New Spain, based in Mexico City. The Spanish–Chamorro Wars on Guam began in 1670 over growing tensions with the Jesuit mission, with the last large-scale uprising in 1683. Intermittent warfare, plus the typhoons of 1671 and 1693, and in particular the smallpox epidemic of 1688, reduced the Chamoru population from 50,000 to 10,000, and finally to fewer than 5,000. The island became a rest stop for whalers starting in 1823. A devastating typhoon struck the island on August 10, 1848, followed by a severe earthquake on January 25, 1849, which brought many refugees from the Caroline Islands, victims of the resulting tsunami. After a smallpox epidemic killed 3,644 Guamanians in 1856, Carolinians and Japanese were permitted to settle in the Marianas.
American era After almost four centuries as part of the Kingdom of Spain, the United States occupied the island following Spain's defeat in the 1898 Spanish–American War, as part of the Treaty of Paris of 1898. Guam was transferred to United States Navy control on December 23, 1898, by Executive Order 108-A from 25th President William McKinley. Guam was a station for American merchants and warships traveling to and from the Philippines (another American acquisition from Spain), while the Northern Mariana Islands were sold by Spain to Germany as part of its rapidly expanding German Empire. A U.S. Navy yard was established at Piti in 1899, and a United States Marine Corps barracks at Sumay in 1901. A Marine seaplane unit was stationed in Sumay from 1921 to 1930, the first in the Pacific. The Commercial Pacific Cable Company built a telegraph/telephone station in 1903 for the first trans-Pacific communications cable, and Pan American World Airways later established a seaplane base at Sumay for its trans-Pacific China Clipper route. World War II During World War II, Guam was attacked and invaded by Japan on Monday, December 8, 1941, at the same time as the attack on Pearl Harbor, across the International Date Line. The Japanese renamed Guam (Great Shrine Island). The Japanese occupation of Guam lasted for approximately 31 months. During this period, the indigenous people of Guam were subjected to forced labor, family separation, incarceration, execution, concentration camps, and forced prostitution. Approximately 1,000 people died during the occupation, according to later Congressional committee testimony in 2004. Some historians estimate that war violence killed 10% of Guam's then 20,000 population. The United States returned and fought the Battle of Guam from July 21 to August 10, 1944, to recapture the island from Japanese military occupation. July 21 is now celebrated as Liberation Day, a territorial holiday.
Post-war After World War II, the Guam Organic Act of 1950 established Guam as an unincorporated organized territory of the United States, provided for the structure of the island's civilian government, and granted the people U.S. citizenship. The Governor of Guam was federally appointed until 1968, when the Guam Elective Governor Act provided for the office's popular election. Since Guam is not a U.S. state, U.S. citizens residing on Guam are not allowed to vote for president, and their congressional representative is a non-voting member. They do, however, get to vote for party delegates in presidential primaries. In 1969, a referendum on unification with the Northern Mariana Islands was held and rejected. During the 1970s, Dr. Maryly Van Leer Peck started an engineering program, expanded the University of Guam, and founded Guam Community College. The removal of Guam's security clearance requirement by President John F. Kennedy in 1963 allowed for the development of a tourism industry. When the United States closed U.S. Naval Base Subic Bay and Clark Air Base in the Philippines after the expiration of their leases in the early 1990s, many of the forces stationed there were relocated to Guam. The 1997 Asian financial crisis, which hit Japan particularly hard, severely affected Guam's tourism industry. Military cutbacks in the 1990s also disrupted the island's economy. Economic recovery was further hampered by devastation from Supertyphoons Paka in 1997 and Pongsona in 2002, as well as the effects of the September 11 terrorist attacks on tourism. Geography and environment Guam is long and wide, giving it an area of (three-fourths the size of Singapore) and making it the 32nd largest island of the United States. It is the southernmost and largest island in the Mariana Island archipelago, as well as the largest in Micronesia. Guam's Point Udall is the westernmost point of the U.S., as measured from the geographic center of the United States.
The Mariana chain of which Guam is a part was created by the collision of the Pacific and Philippine Sea tectonic plates, with Guam located on the micro Mariana Plate between the two. Guam is the closest land mass to the Mariana Trench, the deep subduction zone that runs east of the Marianas. Volcanic eruptions established the base of the island in the Eocene, roughly 56 to 33.9 million years ago. The north of Guam is a result of this base being covered with layers of coral reef, turning into limestone, and then being thrust upward by tectonic activity to create a plateau. The rugged south of the island is a result of more recent volcanic activity. Cocos Island off the southern tip of Guam is the largest of the many small islets along the coastline. Guam's highest point is Mount Lamlam at above sea level. If its base is considered to be nearby Challenger Deep, the deepest surveyed point in the world's oceans, Mount Lamlam can be regarded as the world's highest mountain at . Politically, Guam is divided into 19 villages. The majority of the population lives on the coralline limestone plateaus of the north, with political and economic activity centered in the central and northern regions. The rugged geography of the south largely limits settlement to rural coastal areas. The western coast is leeward of the trade winds and is the location of Apra Harbor, the capital Hagåtña, and the tourist center of Tumon. The U.S. Defense Department owns about 29% of the island, under the management of Joint Region Marianas. Climate Guam has a tropical rainforest climate (Köppen Af), though its driest month, March, is on average almost dry enough to qualify as a tropical monsoon climate (Köppen Am). The weather is generally hot and humid throughout the year with little seasonal temperature variation. Hence, Guam is known to have equable temperatures year-round. Trade winds are fairly constant throughout the year, but there is often a weak westerly monsoon influence in summer.
Guam has two distinct seasons: wet and dry. The dry season runs from January through May, with June being the transitional period. The wet season runs from July through November, with an average annual rainfall between 1981 and 2010 of around . The wettest month on record at Guam Airport has been August 1997 with and the driest was February 2015 with . The wettest calendar year has been 1976 with and the driest was in 1998 with . The most rainfall in a single day occurred on October 15, 1953, when fell. The mean high temperature is and mean low is . Temperatures rarely exceed or fall below . The relative humidity commonly exceeds 84 percent at night throughout the year, but the average monthly humidity hovers near 66 percent. The highest temperature ever recorded in Guam was on April 18, 1971, and April 1, 1990. A record low of was set on February 1, 2021, while the lowest recorded temperature was 65 °F (18.3 °C), set on February 8, 1973.
Game Boy Pocket The Game Boy Pocket, released in 1996, is a redesigned version of the original Game Boy with the same features. Notably, this variation is smaller and lighter. It comes in several colors: red, yellow, green, black, clear, silver, blue, and pink. Another notable improvement over the original Game Boy is a black-and-white display screen, rather than the green-tinted display of the original, with improved response time for less blurring during motion. The Game Boy Pocket takes two AAA batteries, as opposed to four AA batteries, for roughly ten hours of gameplay. The first model of the Game Boy Pocket did not have an LED to show battery levels, but the feature was added due to public demand. Game Boy Light In April 1998, a variant of the Game Boy Pocket named the Game Boy Light was released exclusively in Japan. The Game Boy Light differs from the Game Boy Pocket in that it takes two AA batteries (rather than two AAA) for approximately 20 hours of gameplay when playing without the light, and it has an electroluminescent screen that can be turned on or off. This electroluminescent screen gave games a blue-green tint and allowed the unit to be used in darkened areas. Playing with the light on allowed about 12 hours of play. The Game Boy Light comes in six colors: silver, gold, yellow for the Pokémon edition, translucent yellow, clear, and translucent red for the Astro Boy edition. The Game Boy Light was superseded by the Game Boy Color six months later and was the only Game Boy to have a backlit screen until the release of the Game Boy Advance SP AGS-101 model in 2005. Game Boy Color family Game Boy Color First released in Japan on October 21, 1998, the Game Boy Color (abbreviated as GBC) added a (slightly smaller) color screen to a form factor similar in size to the Game Boy Pocket. 
It also has double the processor speed, three times as much memory, and an infrared communications port. Technologically, it was likened to the 8-bit NES video game console of the 1980s, although the Game Boy Color has a much larger color palette (56 simultaneous colors out of 32,768 possible), and it received some classic NES ports as well as newer titles. It comes in six colors: atomic purple, indigo, berry (red), kiwi (green), dandelion (yellow) and teal. The Game Boy Color also has several special edition variants, such as the yellow and silver Pokémon special editions or the Tommy Hilfiger yellow special edition. Like the Game Boy Light, the Game Boy Color takes only two AA batteries. It was the final handheld to have 8-bit graphics and a vertical shape. A major component of the Game Boy Color is its near-universal backward compatibility; that is, a Game Boy Color is able to read older Game Boy cartridges and even play them in a selectable color palette (similar to the Super Game Boy). The only black-and-white Game Boy games known to be incompatible are Road Rash and Joshua & the Battle of Jericho. Backwards compatibility became a major feature of the Game Boy line, since it allowed each new launch to begin with a significantly larger library than any of its competitors. Some games written specifically for the Game Boy Color can be played on older model Game Boys, whereas others cannot (see the Game Paks section for more information). Game Boy Advance family Game Boy Advance In Japan, on March 21, 2001, Nintendo released a significant upgrade to the Game Boy line. The Game Boy Advance (also referred to as GBA) featured a 32-bit, 16.8 MHz ARM processor. It included a Z80 processor and a switch, activated by inserting a Game Boy or Game Boy Color game into the slot, for backward compatibility, and had a larger, higher-resolution screen. Controls were slightly modified with the addition of "L" and "R" shoulder buttons. 
Like the Game Boy Light and Game Boy Color, the Game Boy Advance takes two AA batteries. The system was technically likened to the SNES and showed its power with successful ports of SNES titles such as Super Mario World, Super Mario World 2: Yoshi's Island, The Legend of Zelda: A Link to the Past and Donkey Kong Country. There were also new titles that could be found only on the GBA, such as Mario Kart: Super Circuit, F-Zero: Maximum Velocity, Wario Land 4, Mario & Luigi: Superstar Saga and more. A widely criticized drawback of the Game Boy Advance is that its screen is not backlit, making viewing difficult in some conditions. The Game Paks for the GBA are roughly half the length of original Game Boy and Game Boy Color cartridges, so older Game Paks stick out of the top of the unit. When playing older games, the GBA provides the option to play the game at the original screen's square resolution or to stretch it over the wider GBA screen. The selectable color palettes for original Game Boy games are identical to those on the Game Boy Color. The only Game Boy Color games known to be incompatible are Pocket Music and Chee-Chai Alien. It was the final handheld to require regular batteries and to lack a backlit screen. Game Boy Advance SP First released in Japan on February 14, 2003, the Game Boy Advance SP—Nintendo model AGS-001—resolved several problems with the original Game Boy Advance model. It featured a new, smaller clamshell design with a flip-up screen, a switchable internal frontlight, and, for the first time, a rechargeable battery; the only notable omission is a headphone jack, which requires a special adapter, purchased separately. In September 2005, Nintendo released the Game Boy Advance SP model AGS-101, which featured a high-quality backlit screen instead of a frontlit one, similar to the Game Boy Micro screen but larger. 
It was the final Game Boy and last handheld to have backwards compatibility with Game Boy and Game Boy Color games. Game Boy Micro The third form of the Game Boy Advance system, the Game Boy Micro, is four and a half inches wide (10 cm), two inches tall (5 cm), and weighs 2.8 ounces (80 g). By far the smallest Game Boy created, it has approximately the same dimensions as an original NES controller pad. Its screen is approximately 2/3 the size of the SP and GBA screens while maintaining the same resolution (240×160 pixels), but boasts a higher-quality backlit display with adjustable brightness. Included with the system are two additional faceplates which can be swapped to give the system a new look; Nintendo of America sold additional faceplates on its online store. In Europe, the Game Boy Micro comes with a single faceplate. In Japan, a special Mother 3 limited edition Game Boy Micro was released with the game in the Mother 3 Deluxe Box. Unlike the Game Boy Advance and Game Boy Advance SP, the Game Boy Micro is unable to play original Game Boy or Game Boy Color games, only Game Boy Advance titles (with the exception of the Nintendo e-Reader, discontinued in America, but still available in Japan). Comparison Game Paks Each video game is stored on a plastic cartridge, officially called a "Game Pak" by Nintendo. All cartridges, excluding those for the Game Boy Advance, measure 5.8 by 6.5 cm. The cartridge provides the code and game data to the console's CPU. Some cartridges include a small battery with SRAM, a flash memory chip, or EEPROM, which allows game data to be saved when the console is turned off. If the battery in a cartridge runs out, the save data will be lost; however, the battery can be replaced. To do this, the cartridge must be unscrewed and opened, and the old battery removed and replaced, which may require desoldering the dead battery and soldering the replacement in place. 
Before 2003, Nintendo used round, flat watch batteries for saving information on the cartridges. These batteries were phased out in newer cartridges because they had a limited lifespan. The cartridge is inserted into the console's cartridge slot. If the cartridge is removed while the power is on and the Game Boy does not automatically reset, the game freezes; the Game Boy may exhibit unexpected behavior, such as rows of zeros appearing on the screen or the sound sticking at the pitch emitted the instant the game was pulled out, saved data may be corrupted, and hardware may be damaged. This applies to most video game consoles that use cartridges. The original Game Boy's power switch was designed to prevent the player from removing the cartridge while the power is on. Cartridges intended only for the Game Boy Color (and not for the original Game Boy) lack
superior alternatives which would have color graphics instead. This is also apparent in the name (conceived by Shigesato Itoi), which connotes a smaller "sidekick" companion to Nintendo's consoles. The Game Boy's success continues to this day, and many at Nintendo have dedicated the handheld to Yokoi's memory. The Game Boy celebrated its 15th anniversary in 2004, which nearly coincided with the 20-year anniversary of the original Nintendo Entertainment System (NES). To celebrate, Nintendo released the Classic NES Series and an NES controller-themed color scheme for the Game Boy Advance SP. In 2006, Nintendo president Satoru Iwata said of the rumored demise of the Game Boy brand: "No, it's not true after all. What we are repeatedly saying is that for whichever platform, we are always conducting research and development for the new system, be it the Game Boy, or new console or whatever. And what we just told the reporter was that in thinking about the current situation where we are enjoying great sales with the DS and that we are now trying to launch the Wii, it's unthinkable for us to launch any new platform for the handheld system, including the new version of the GBA... Perhaps they misunderstood a part of this story, but as far as the handheld market is concerned [right now] we really want to focus on more sales of the DS; that's all." Nintendo eventually ceased production of Game Boy Advance games and hardware in North America on May 15, 2010. Classic Game Boy family Game Boy The original gray Game Boy was first released in Japan on April 21, 1989. Based on a Z80 processor, it has a black-and-green reflective LCD screen, an eight-way directional pad, two action buttons (A and B), and Start and Select buttons, with controls identical to the NES controller's. It plays games from ROM-based media contained in cartridges (sometimes called carts or Game Paks). Its graphics are 8-bit (similar to the NES). 
The game that pushed the Game Boy into the upper reaches of success was Tetris. Tetris was widely popular, and on the handheld format could be played anywhere. It came packaged with the Game Boy, and broadened its reach; adults and children alike were buying Game Boys in order to play Tetris. Releasing Tetris on the Game Boy was selected as #4 on GameSpy's "25 Smartest Moments in Gaming". The original Game Boy was one of the first cartridge-based systems that supported networking: two devices with a Game Link Cable, or up to four with the Four Player Adapter. In 1995, the "Play it Loud" version of the original Game Boy was released in several colors (black, red, yellow, green, blue, white and clear) as well as additional sports-themed editions. 
min Docking Docked: July 19, 1966 - 04:15:00 UTC Undocked: July 20, 1966 - 19:00:00 UTC Space walk Collins - EVA 1 (stand up) Start: July 19, 1966, 21:44:00 UTC End: July 19, 1966, 22:33:00 UTC Duration: 0 hours, 49 minutes Collins - EVA 2 Start: July 20, 1966, 23:01:00 UTC End: July 20, 1966, 23:40:00 UTC Duration: 0 hours, 39 minutes Objectives Gemini 10 was designed to achieve rendezvous and docking with an Agena Target Vehicle (ATV), and to perform EVA. It was also planned to dock with the ATV from the Gemini 8 mission. That Agena's battery power had failed months earlier, and an approach and docking would demonstrate the ability to rendezvous with a passive object. It would also be the first mission to fire the Agena's own rocket, allowing the crew to reach higher orbits. Gemini 10 established that radiation at high altitude was not a problem. After docking with their Agena booster in low orbit, Young and Collins used it to climb temporarily to . After leaving the first Agena, they then rendezvoused with the derelict Agena left over from the aborted Gemini 8 flight—thus executing the program's first double rendezvous. With no electricity on board the second Agena, the rendezvous was accomplished with eyes only—no radar. After the rendezvous, Collins spacewalked over to the dormant Agena at the end of a tether, making him the first person to meet another spacecraft in orbit. Collins then retrieved a cosmic dust-collecting panel from the side of the Agena. As he was concentrating on keeping his tether clear of the Gemini and Agena, Collins' Hasselblad camera worked itself free and drifted away, so he was unable to take photographs during the spacewalk. Flight The Agena launched perfectly for the second time, after problems had occurred with the targets for Gemini 6 and 9. Gemini 10 followed 100 minutes later and entered a orbit. They were behind the Agena. Two anomalous events occurred during the launch. 
At liftoff, a propellant fill umbilical became snared with its release lanyard. It ripped out of the LC-19 service tower and remained attached to the second stage during ascent. Tracking camera footage also showed that the first-stage oxidizer tank dome ruptured after staging and released a cloud of nitrogen tetroxide. The telemetry package on the first stage had been disabled at staging, so visual evidence was the only data available. Film review of Titan II ICBM launches found at least seven other instances of post-staging tank ruptures, most likely caused by flying debris, second-stage engine exhaust, or structural bending. NASA finally decided that this phenomenon did not pose any safety risk to the astronauts and took no corrective action. First rendezvous Collins was unable to use the sextant for navigation, as it did not seem to work as expected. At first he mistook airglow for the real horizon when trying to take fixes on stars. When the image did not seem right he tried another instrument, but this was not practical to use as it had a very small field of view. They had a backup in the form of the computers on the ground. They made their first burn to put them into a orbit. However, Young did not realize that the spacecraft was turned slightly during the next burn, which introduced an out-of-plane error. This meant two extra burns were necessary, and by the time they had docked with the Agena, 60% of their fuel had been consumed. It was decided to keep the Gemini docked to the Agena as long as possible, as this meant they could use the fuel on board the Agena for attitude control. The first burn of the Agena engine lasted 80 seconds and put them in a orbit. This was the highest a person had ever been, although the record was soon surpassed by Gemini 11, which went to over . This burn was quite a ride for the crew. 
Because the Gemini and Agena docked nose-to-nose, the forces experienced were "eyeballs out" as opposed to "eyeballs in" for a launch from Earth. The crew took a couple of pictures when they reached apogee but were more interested in what was going on in the spacecraft: checking the systems and watching the radiation dosage meter. After this they had their eight-hour sleep period, and then they were ready for another busy day. The
crew's first order of business was to make a second burn with the Agena engine to put them into the same orbit as the Gemini 8 Agena. This burn came at 20:58 UTC on July 19, lasted 78 seconds, and took off their speed, putting them into a orbit. They made one more burn of the Agena to circularize their orbit to . EVA 1 The first of two EVAs on Gemini 10 was a standup EVA, in which Collins stood in the open hatch and took photographs of stars as part of experiment S-13. They used a 70 mm general-purpose camera to image the southern Milky Way in ultraviolet. After orbital sunrise Collins photographed a color plate on the side of the spacecraft (MSC-8) to see whether film reproduced colors accurately in space. 
He reentered the spacecraft six minutes early when both astronauts found that their eyes were irritated, caused by a minor leak of lithium hydroxide into the astronauts' oxygen supply. After repressurizing the cabin, they ran the oxygen at high rates and flushed the environment system. After the exertion of the EVA, Young and Collins slept through their second 'night' in space. The next 'morning' they started preparing for the second rendezvous and another EVA. Second rendezvous After undocking from their Agena, the crew thought they had sighted the Gemini 8 Agena. It turned out, however, to be their own Agena away, while their target was away. It wasn't until just over away that they saw it as a faint star. After a few more correction burns, they were station-keeping away from the Gemini 8 Agena. They found the Agena to be very stable and in good condition. EVA 2 At 48 hours and 41 minutes into the mission, the second EVA began. Collins' first task was to retrieve a Micrometeorite Collector (S-12) from the side of the spacecraft. This he accomplished with some difficulty
of acacias and palms. A notable example of ancient ornamental gardens was the Hanging Gardens of Babylon—one of the Seven Wonders of the Ancient World—while ancient Rome had dozens of gardens. Wealthy ancient Egyptians used gardens for providing shade. Egyptians associated trees and gardens with gods, believing that their deities were pleased by gardens. Gardens in ancient Egypt were often surrounded by walls, with trees planted in rows. Among the most popular species planted were date palms, sycamores, fig trees, nut trees, and willows. These gardens were a sign of higher socioeconomic status. In addition, wealthy ancient Egyptians grew vineyards, as wine was a sign of the higher social classes. Roses, poppies, daisies and irises could all also be found in the gardens of the Egyptians. Assyria was also renowned for its beautiful gardens. These tended to be wide and large, some of them used for hunting game—rather like a game reserve today—and others as leisure gardens. Cypresses and palms were some of the most frequently planted types of trees. Gardens were also present in Kush. In Musawwarat es-Sufra, the Great Enclosure, dated to the 3rd century BC, included splendid gardens. Ancient Roman gardens were laid out with hedges and vines and contained a wide variety of flowers—acanthus, cornflowers, crocus, cyclamen, hyacinth, iris, ivy, lavender, lilies, myrtle, narcissus, poppy, rosemary and violets—as well as statues and sculptures. Flower beds were popular in the courtyards of rich Romans. The Middle Ages The Middle Ages represent a period of decline in gardens for aesthetic purposes. After the fall of Rome, gardening was done for the purpose of growing medicinal herbs and/or decorating church altars. Monasteries carried on a tradition of garden design and intense horticultural techniques during the medieval period in Europe. Generally, monastic garden types consisted of kitchen gardens, infirmary gardens, cemetery orchards, cloister garths and vineyards. 
Individual monasteries might also have had a "green court", a plot of grass and trees where horses could graze, as well as a cellarer's garden or private gardens for obedientiaries, monks who held specific posts within the monastery. Islamic gardens were built on the model of Persian gardens: they were usually enclosed by walls and divided into four by watercourses. Commonly, the centre of the garden would have a reflecting pool or pavilion. Specific to Islamic gardens are the mosaics and glazed tiles used to decorate the rills and fountains built in these gardens. By the late 13th century, rich Europeans began to grow gardens for leisure and for medicinal herbs and vegetables. They surrounded the gardens with walls to protect them from animals and to provide seclusion. During the next two centuries, Europeans started planting lawns and raising flowerbeds and trellises of roses. Fruit trees were common in these gardens, and some also had turf seats. At the same time, the gardens in the monasteries were a place to grow flowers and medicinal herbs, but they were also a space where the monks could enjoy nature and relax. Gardens in the 16th and 17th centuries were symmetric, proportioned and balanced, with a more classical appearance. Most of these gardens were built around a central axis and were divided into different parts by hedges. Commonly, gardens had flowerbeds laid out in squares and separated by gravel paths. Renaissance gardens were adorned with sculptures, topiary and fountains. In the 17th century, knot gardens became popular, along with hedge mazes. By this time, Europeans had started planting new flowers such as tulips, marigolds and sunflowers. Cottage gardens Cottage gardens, which emerged in Elizabethan times, appear to have originated as a local source for herbs and fruits. 
One theory is that they arose out of the Black Death of the 1340s, when the death of so many laborers made land available for small cottages with personal gardens. According to the late 19th-century legend of origin, these gardens were originally created by the workers who lived in the cottages of the villages, to provide them with food and herbs, with flowers planted among them for decoration. Farm workers were provided with cottages that had architectural quality, set in a small garden—about —where they could grow food and keep pigs and chickens. Authentic gardens of the yeoman cottager would have included a beehive and livestock, and frequently a pig and sty, along with a well. The peasant cottager of medieval times was more interested in meat than flowers, with herbs grown for medicinal use rather than for their beauty. By Elizabethan times there was more prosperity, and thus more room to grow flowers. Even the early cottage garden flowers typically had a practical use—violets were spread on the floor (for their pleasant scent and for keeping out vermin); calendulas and primroses were both attractive and used in cooking. Others, such as sweet William and hollyhocks, were grown entirely for their beauty. 18th century In the 18th century gardens were laid out more naturally, without any walls. This style, with its smooth undulating grass running straight to the house, its clumps, belts and scatterings of trees, and its serpentine lakes formed by invisibly damming small rivers, was a new style within the English landscape, a "gardenless" form of landscape gardening which swept away almost all the remnants of previous formally patterned styles. The English landscape garden usually included a lake and lawns set against groves of trees, and often contained shrubberies, grottoes, pavilions, bridges and follies such as mock temples, Gothic ruins, and other picturesque architecture, designed to recreate an idyllic pastoral landscape. 
This new style emerged in England in the early 18th century and spread across Europe, replacing the more formal, symmetrical garden à la française of the 17th century as the principal gardening style of Europe. The English garden presented an idealized view of nature. These gardens were often inspired by paintings of landscapes by Claude Lorrain and Nicolas Poussin, and some were influenced by the classic Chinese gardens of the East, which had recently been described by European travelers. The work of Lancelot 'Capability' Brown was particularly influential. In 1804 the Horticultural Society was formed. Gardens of the 19th century contained plants such as the monkey puzzle or Chile pine. This is also the time when the so-called "gardenesque" style of gardens evolved. These gardens displayed a wide variety of flowers in a rather small space. Rock gardens increased in popularity in the 19th century. India: In ancient India, patterns from sacred geometry and mandalas were used to design gardens. Distinct mandala patterns denoted specific deities, planets, or even constellations. Such a garden was also referred to as a 'Mandala Vaatika'. The word 'Vaatika' can mean garden, plantation or parterre. Types Residential gardening takes place near the home, in a space referred to as the garden. Although a garden typically is located on the land near a residence, it may also be located on a roof, in an atrium, on a balcony, in a windowbox, on a patio or in a vivarium. Gardening also takes place in non-residential green areas, such as parks, public or semi-public gardens (botanical gardens or zoological gardens), amusement parks, along transportation corridors, and around tourist attractions and garden hotels. In these situations, a staff of gardeners or groundskeepers maintains the gardens. Indoor gardening is concerned with the growing of houseplants within a residence or building, in a conservatory, or in a greenhouse. 
Indoor gardens are sometimes incorporated as part of air conditioning or heating systems. Indoor gardening extends the growing season in the fall and spring and can be used for winter gardening. Native plant gardening is concerned with the use of native plants with or without the intent of creating wildlife habitat. The goal is to create a garden in harmony with, and adapted to a given area. This type of gardening typically reduces water usage, maintenance, and fertilization costs, while increasing native faunal interest. Water gardening is concerned with growing plants adapted to pools and ponds. Bog gardens are also considered a type of water garden. These all require special conditions and considerations. A simple water garden may consist solely of a tub containing the water and plant(s). In aquascaping, a garden is created within an aquarium tank. Container gardening is concerned with growing plants in any type of container either indoors or
which emerged in Elizabethan times, appear to have originated as a local source for herbs and fruits. One theory is that they arose out of the Black Death of the 1340s, when the death of so many laborers made land available for small cottages with personal gardens. According to the late 19th-century legend of origin, these gardens were originally created by the workers who lived in the cottages of the villages, to provide them with food and herbs, with flowers planted among them for decoration. Farm workers were provided with cottages that had architectural quality set in a small garden—about —where they could grow food and keep pigs and chickens. Authentic gardens of the yeoman cottager would have included a beehive and livestock, and frequently a pig and sty, along with a well. The peasant cottager of medieval times was more interested in meat than flowers, with herbs grown for medicinal use rather than for their beauty. By Elizabethan times there was more prosperity, and thus more room to grow flowers. Even the early cottage garden flowers typically had their practical use—violets were spread on the floor (for their pleasant scent and to keep out vermin); calendulas and primroses were both attractive and used in cooking. Others, such as sweet William and hollyhocks, were grown entirely for their beauty. 18th century In the 18th century gardens were laid out more naturally, without any walls. This style of smooth undulating grass running straight to the house, with clumps, belts and scatterings of trees, and serpentine lakes formed by invisibly damming small rivers, was a new style within the English landscape: a "gardenless" form of landscape gardening that swept away almost all the remnants of previous formally patterned styles.
The English landscape garden usually included a lake, lawns set against groves of trees, and often contained shrubberies, grottoes, pavilions, bridges, and follies such as mock temples, Gothic ruins, and other picturesque architecture, designed to recreate an idyllic pastoral landscape. This new style emerged in England in the early 18th century, and spread across Europe, replacing the more formal, symmetrical garden à la française of the 17th century as the principal gardening style of Europe. The English garden presented an idealized view of nature. Such gardens were often inspired by paintings of landscapes by Claude Lorrain and Nicolas Poussin, and some were influenced by the classic Chinese gardens of the East, which had recently been described by European travelers. The work of Lancelot 'Capability' Brown was particularly influential. Also, in 1804 the Horticultural Society was formed. Gardens of the 19th century contained plants such as the monkey puzzle or Chile pine. This is also the time when the so-called "gardenesque" style of gardens evolved. These gardens displayed a wide variety of flowers in a rather small space. Rock gardens increased in popularity in the 19th century. India: In ancient India, patterns from sacred geometry and mandalas were used to design gardens. Distinct mandala patterns denoted specific deities, planets, or even constellations. Such a garden was also referred to as a 'Mandala Vaatika'. The word 'Vaatika' can mean garden, plantation or parterre. Types Residential gardening takes place near the home, in a space referred to as the garden. Although a garden typically is located on the land near a residence, it may also be located on a roof, in an atrium, on a balcony, in a windowbox, on a patio, or in a vivarium.
Gardening also takes place in non-residential green areas, such as parks, public or semi-public gardens (botanical gardens or zoological gardens), amusement parks, along transportation corridors, and around tourist attractions and garden hotels. In these situations, a staff of gardeners or groundskeepers maintains the gardens. Indoor gardening is concerned with the growing of houseplants within a residence or building, in a conservatory, or in a greenhouse. Indoor gardens are sometimes incorporated as part of air conditioning or heating systems. Indoor gardening extends the growing season in the fall and spring and can be used for winter gardening. Native plant gardening is concerned with the use of native plants with or without the intent of creating wildlife habitat. The goal is to create a garden in harmony with, and adapted to a given area. This type of gardening typically reduces water usage, maintenance, and fertilization costs, while increasing native faunal interest. Water gardening is concerned with growing plants adapted to pools and ponds. Bog gardens are also considered a type of water garden. These all require special conditions and considerations. A simple water garden may consist solely of a tub containing the water and plant(s). In aquascaping, a garden is created within an aquarium tank. Container gardening is concerned with growing plants in any type of container either indoors or outdoors. Common containers are pots, hanging baskets, and planters. Container gardening is usually used in atriums and on balconies, patios, and roof tops. Hügelkultur is concerned with growing plants on piles of rotting wood, as a form of raised bed gardening and composting in situ. An English loanword from German, it means "mound garden." Toby Hemenway, noted permaculture author and teacher, considers wood buried in trenches to also be a form of hugelkultur referred to as a dead wood swale. 
Hugelkultur is practiced by Sepp Holzer as a method of forest gardening and agroforestry, and by Geoff Lawton as a method of dryland farming and desert greening. When used as a method of disposing of large volumes of waste wood and woody debris, hugelkultur accomplishes carbon sequestration. It is also a form of xeriscaping. Community gardening is a social activity in which an area of land is gardened by a group of people, providing access to fresh produce, herbs, flowers and plants as well as access to satisfying labor, neighborhood improvement, sense of community and connection to the environment. Community gardens are typically owned in trust by local governments or nonprofits. Garden sharing partners landowners with gardeners in need of land. These shared gardens, typically front or back yards, are usually used to produce food that is divided between the two parties. Organic gardening uses natural, sustainable methods, fertilizers and pesticides to grow non-genetically modified crops. Biodynamic gardening or biodynamic agriculture is similar to organic gardening, but it includes various esoteric concepts drawn from the ideas of Rudolf Steiner, such as an astrological sowing and planting calendar and particular field and compost preparations. Commercial gardening is a more intensive type of gardening that involves the production of vegetables, nontropical fruits, and flowers by local farmers. Commercial gardening began because farmers would sell their produce locally, avoiding the spoilage that came with transporting goods over long distances. Mediterranean agriculture is also a common practice among commercial gardeners: the cultivation of animals such as sheep to help weed and provide manure for vine crops, grains, or citrus. Gardeners can easily train these animals not to eat the actual plant. Social aspects People can express their political or social views in gardens, intentionally or not. The lawn vs.
garden issue is played out in urban planning as the debate over the "land ethic" that is to determine urban land use and whether hyper-hygienist bylaws (e.g. weed control) should apply, or whether land should generally be allowed to exist in its natural wild state. In a famous Canadian Charter of Rights case, "Sandra Bell vs. City of Toronto", 1997, the right to cultivate all native species, even most varieties deemed noxious or allergenic, was upheld as part of the right of free expression. Community gardening comprises a wide variety of approaches to sharing land and gardens. People often surround their house and garden with a hedge. Common hedge plants are privet, hawthorn, beech, yew, leyland cypress, hemlock, arborvitae, barberry, box, holly, oleander, forsythia and lavender. The idea of open gardens without hedges may be distasteful to those who enjoy privacy. The Slow Food movement has sought in some countries to add an edible school yard and garden classrooms to schools, e.g. in Fergus, Ontario, where these were added to a public school to augment the kitchen classroom. Garden sharing, where urban landowners allow gardeners to grow on their property in exchange for a share of the harvest, is associated with the desire to control the quality of one's food,
In the early 1980s, the first art galleries to show graffitists to the public were Fashion Moda in the Bronx, Now Gallery and Fun Gallery, both in the East Village, Manhattan. A 2006 exhibition at the Brooklyn Museum displayed graffiti as an art form that began in New York's outer boroughs and reached great heights in the early 1980s with the work of Crash, Lee, Daze, Keith Haring, and Jean-Michel Basquiat. It displayed 22 works by New York graffitists, including Crash, Daze, and Lady Pink. In an article about the exhibition in the magazine Time Out, curator Charlotta Kotik said that she hoped the exhibition would cause viewers to rethink their assumptions about graffiti. From the 1970s onwards, Burhan Dogancay photographed urban walls all over the world; these he then archived for use as sources of inspiration for his painterly works. The project today known as "Walls of the World" grew beyond even his own expectations and comprises about 30,000 individual images. It spans a period of 40 years across five continents and 114 countries. In 1982, photographs from this project comprised a one-man exhibition titled "Les murs murmurent, ils crient, ils chantent..." (The walls whisper, shout and sing...) at the Centre Georges Pompidou in Paris. In Australia, art historians have judged some local graffiti of sufficient creative merit to rank them firmly within the arts. Oxford University Press's art history text Australian Painting 1788–2000 concludes with a long discussion of graffiti's key place within contemporary visual culture, including the work of several Australian practitioners. Between March and April 2009, 150 artists exhibited 300 pieces of graffiti at the Grand Palais in Paris. Environmental effects Spray paint has many negative environmental effects. The paint contains toxic chemicals, and the can uses volatile hydrocarbon gases to spray the paint onto a surface. Volatile organic compounds (VOCs) contribute to ground-level ozone formation, and most graffiti-related emissions are VOCs.
A 2010 paper estimates 4,862 tons of VOCs were released in the United States in activities related to graffiti. Government responses Asia In China, Mao Zedong in the 1920s used revolutionary slogans and paintings in public places to galvanise the country's communist revolution. Many people believe that China's attitude towards graffiti is fierce, but in fact, according to Lance Crayon in his film Spray Paint Beijing: Graffiti in the Capital of China, graffiti is generally accepted in Beijing, with artists not seeing much police interference. Political and religiously sensitive graffiti, however, is not allowed. In Hong Kong, Tsang Tsou Choi was known as the King of Kowloon for his calligraphy graffiti over many years, in which he claimed ownership of the area. Now some of his work is preserved officially. In Taiwan, the government has made some concessions to graffitists. Since 2005 they have been allowed to freely display their work along some sections of riverside retaining walls in designated "Graffiti Zones". From 2007, Taipei's department of cultural affairs also began permitting graffiti on fences around major public construction sites. Department head Yong-ping Lee (李永萍) stated, "We will promote graffiti starting with the public sector, and then later in the private sector too. It's our goal to beautify the city with graffiti". The government later helped organize a graffiti contest in Ximending, a popular shopping district. Graffitists caught working outside of these designated areas still face fines of up to NT$6,000 under a department of environmental protection regulation. However, Taiwanese authorities can be relatively lenient; as one veteran police officer stated anonymously, "Unless someone complains about vandalism, we won't get involved. We don't go after it proactively."
In 1993, after several expensive cars in Singapore were spray-painted, the police arrested a student from the Singapore American School, Michael P. Fay, questioned him, and subsequently charged him with vandalism. Fay pleaded guilty to vandalizing a car in addition to stealing road signs. Under the 1966 Vandalism Act of Singapore, originally passed to curb the spread of communist graffiti in Singapore, the court sentenced him to four months in jail, a fine of S$3,500 (US$2,233), and a caning. The New York Times ran several editorials and op-eds that condemned the punishment and called on the American public to flood the Singaporean embassy with protests. Although the Singapore government received many calls for clemency, Fay's caning took place in Singapore on 5 May 1994. Fay had originally received a sentence of six strokes of the cane, but the then president of Singapore, Ong Teng Cheong, agreed to reduce his caning sentence to four lashes. In South Korea, Park Jung-soo was fined two million South Korean won by the Seoul Central District Court for spray-painting a rat on posters of the G-20 Summit a few days before the event in November 2011. Park alleged that the initial in "G-20" sounds like the Korean word for "rat", but Korean government prosecutors alleged that Park was making a derogatory statement about the president of South Korea, Lee Myung-bak, the host of the summit. The case led to public outcry and debate about the government's lack of tolerance, and to support for freedom of expression. The court ruled that the painting, of "an ominous creature like a rat", amounted to "an organized criminal activity", and upheld the fine while denying the prosecution's request for imprisonment for Park.
Europe In Europe, community cleaning squads have responded to graffiti, in some cases with reckless abandon, as when in 1992 in France a local Scout group, attempting to remove modern graffiti, damaged two prehistoric paintings of bison in the Cave of Mayrière supérieure near the French village of Bruniquel in Tarn-et-Garonne, earning them the 1992 Ig Nobel Prize in archeology. In September 2006, the European Parliament directed the European Commission to create urban environment policies to prevent and eliminate dirt, litter, graffiti, animal excrement, and excessive noise from domestic and vehicular music systems in European cities, along with other concerns over urban life. In Budapest, Hungary, both a city-backed movement called I Love Budapest and a special police division tackle the problem, including the provision of approved areas. United Kingdom The Anti-Social Behaviour Act 2003 became Britain's latest anti-graffiti legislation. In August 2004, the Keep Britain Tidy campaign issued a press release calling for zero tolerance of graffiti and supporting proposals such as issuing "on the spot" fines to graffiti offenders and banning the sale of aerosol paint to anyone under the age of 16. The press release also condemned the use of graffiti images in advertising and in music videos, arguing that real-world experience of graffiti stood far removed from its often-portrayed "cool" or "edgy" image. To back the campaign, 123 Members of Parliament (MPs) (including then Prime Minister Tony Blair), signed a charter which stated: "Graffiti is not art, it's crime. On behalf of my constituents, I will do all I can to rid our community of this problem." In the UK, city councils have the power to take action against the owner of any property that has been defaced under the Anti-social Behaviour Act 2003 (as amended by the Clean Neighbourhoods and Environment Act 2005) or, in certain cases, the Highways Act.
This is often used against property owners who are complacent in allowing protective boards to be defaced so long as the property itself is not damaged. In July 2008, a conspiracy charge was used to convict graffitists for the first time. After a three-month police surveillance operation, nine members of the DPM crew were convicted of conspiracy to commit criminal damage costing at least £1 million. Five of them received prison sentences, ranging from eighteen months to two years. The unprecedented scale of the investigation and the severity of the sentences rekindled public debate over whether graffiti should be considered art or crime. Some councils, like those of Stroud and Lörrach, provide approved areas in the town where graffitists can showcase their talents, including underpasses, car parks, and walls that might otherwise prove a target for the "spray and run". Australia In an effort to reduce vandalism, many cities in Australia have designated walls or areas exclusively for use by graffitists. One early example is the "Graffiti Tunnel" located at the Camperdown Campus of the University of Sydney, which is available for use by any student at the university to tag, advertise, poster, and create "art". Advocates of this idea suggest that this discourages petty vandalism yet encourages artists to take their time and produce great art, without worry of being caught or arrested for vandalism or trespassing. Others disagree with this approach, arguing that the presence of legal graffiti walls does not demonstrably reduce illegal graffiti elsewhere. Some local government areas throughout Australia have introduced "anti-graffiti squads", who clean graffiti in the area, and such crews as BCW (Buffers Can't Win) have taken steps to keep one step ahead of local graffiti cleaners. Many state governments have banned the sale or possession of spray paint to those under the age of 18 (age of majority).
However, a number of local governments in Victoria have taken steps to recognize the cultural heritage value of some examples of graffiti, such as prominent political graffiti. Tough new graffiti laws have been introduced in Australia with fines of up to A$26,000 and two years in prison. Melbourne is Australia's most prominent graffiti city, with many of its lanes serving as tourist attractions; Hosier Lane in particular is a popular destination for photographers, wedding photography, and backdrops for corporate print advertising. The Lonely Planet travel guide cites Melbourne's street art as a major attraction. All forms of graffiti, including sticker art, poster, stencil art, and wheatpasting, can be found in many places throughout the city. Prominent street art precincts include Fitzroy, Collingwood, Northcote, Brunswick, St. Kilda, and the CBD, where stencil and sticker art is prominent. As one moves farther away from the city, mostly along suburban train lines, graffiti tags become more prominent. Many international artists such as Banksy have left their work in Melbourne, and in early 2008 a perspex screen was installed to prevent a Banksy stencil art piece from being destroyed. It has survived since 2003 through the respect of local street artists, who avoid posting over it, although it has recently had paint tipped over it. New Zealand In February 2008 Helen Clark, the New Zealand prime minister at that time, announced a government crackdown on tagging and other forms of graffiti vandalism, describing it as a destructive crime representing an invasion of public and private property. New legislation subsequently adopted included a ban on the sale of paint spray cans to persons under 18 and increases in maximum fines for the offence from NZ$200 to NZ$2,000 or extended community service.
The issue of tagging became a widely debated one following an incident in Auckland during January 2008 in which a middle-aged property owner stabbed one of two teenage taggers to death and was subsequently convicted of manslaughter. United States Tracker databases Graffiti databases have increased in the past decade because they allow vandalism incidents to be fully
the use of graffiti by avant-garde artists has a history dating back at least to Asger Jorn, who in a 1962 painting declared in a graffiti-like gesture "the avant-garde won't give up". Many contemporary analysts and even art critics have begun to see artistic value in some graffiti and to recognize it as a form of public art. According to many art researchers, particularly in the Netherlands and in Los Angeles, that type of public art is, in fact, an effective tool of social emancipation or of achieving a political goal. In times of conflict, such murals have offered a means of communication and self-expression for members of these socially, ethnically, or racially divided communities, and have proven themselves as effective tools in establishing dialog and thus of addressing cleavages in the long run. The Berlin Wall was also extensively covered by graffiti reflecting social pressures relating to the oppressive Soviet rule over the GDR. Many artists involved with graffiti are also concerned with the similar activity of stenciling. Essentially, this entails stenciling a print of one or more colors using spray-paint. Recognized while exhibiting and publishing several of her coloured stencils and paintings portraying the Sri Lankan Civil War and urban Britain in the early 2000s, graffitist Mathangi Arulpragasam, aka M.I.A., has also become known for integrating her imagery of political violence into her music videos for singles "Galang" and "Bucky Done Gun", and her cover art. Stickers of her artwork also often appear around places such as Brick Lane in London, stuck to lamp posts and street signs, and she has become a muse for other graffitists and painters worldwide in cities including Seville. Personal expression Many graffitists choose to protect their identities and remain anonymous to hinder prosecution.
With the commercialization of graffiti (and hip hop in general), in most cases, even with legally painted "graffiti" art, graffitists tend to choose anonymity. This may be attributed to various reasons or a combination of reasons. Graffiti still remains the only one of the four hip hop elements that is not considered "performance art", despite the image of the "singing and dancing star" that sells hip hop culture to the mainstream. Being a graphic form of art, it might also be said that many graffitists still fall into the category of the introverted archetypal artist. Banksy is one of the world's most notorious and popular street artists, who continues to remain faceless in today's society. He is known for his political, anti-war stencil art mainly in Bristol, England, but his work may be seen anywhere from Los Angeles to Palestine. In the UK, Banksy is the most recognizable icon for this cultural artistic movement and keeps his identity a secret to avoid arrest. Much of Banksy's artwork may be seen around the streets of London and surrounding suburbs, although he has painted pictures throughout the world, including the Middle East, where he has painted on Israel's controversial West Bank barrier with satirical images of life on the other side. One depicted a hole in the wall with an idyllic beach, while another shows a mountain landscape on the other side. A number of exhibitions also have taken place since 2000, and recent works of art have fetched vast sums of money. Banksy's art is a prime example of the classic controversy: vandalism vs. art. Art supporters endorse his work distributed in urban areas as pieces of art, and some councils, such as Bristol and Islington, have officially protected them, while officials of other areas have deemed his work to be vandalism and have removed it. Pixnit is another artist who chooses to keep her identity from the general public.
Her work focuses on beauty and design aspects of graffiti as opposed to Banksy's anti-government shock value. Her paintings are often of flower designs above shops and stores in her local urban area of Cambridge, Massachusetts. Some store owners endorse her work and encourage others to do similar work as well. "One of the pieces was left up above Steve's Kitchen, because it looks pretty awesome," said Erin Scott, the manager of New England Comics in Allston, Massachusetts. Graffiti artists may become offended if photographs of their art are published in a commercial context without their permission. In March 2020, the Finnish graffiti artist Psyke expressed his displeasure at the newspaper Ilta-Sanomat publishing a photograph of a Peugeot 208 in an article about new cars, with his graffiti prominently shown in the background. The artist claims he does not want his art being used in a commercial context, not even if he were to receive compensation. Radical and political Graffiti often has a reputation as part of a subculture that rebels against authority, although the considerations of the practitioners often diverge and can relate to a wide range of attitudes. It can express a political practice and can form just one tool in an array of resistance techniques. One early example includes the anarcho-punk band Crass, who conducted a campaign of stenciling anti-war, anarchist, feminist, and anti-consumerist messages throughout the London Underground system during the late 1970s and early 1980s. In Amsterdam graffiti was a major part of the punk scene. The city was covered with names such as "De Zoot", "Vendex", and "Dr Rat". To document the graffiti, a punk magazine called Gallery Anus was started. So when hip hop came to Europe in the early 1980s there was already a vibrant graffiti culture.
The student protests and general strike of May 1968 saw Paris bedecked in revolutionary, anarchistic, and situationist slogans such as L'ennui est contre-révolutionnaire ("Boredom is counterrevolutionary") and Lisez moins, vivez plus ("Read less, live more"). While not exhaustive, the graffiti gave a sense of the 'millenarian' and rebellious spirit, tempered with a good deal of verbal wit, of the strikers. The developments of graffiti art which took place in art galleries and colleges as well as "on the street" or "underground", contributed to the resurfacing in the 1990s of a far more overtly politicized art form in the subvertising, culture jamming, or tactical media movements. These movements or styles tend to classify the artists by their relationship to their social and economic contexts, since, in most countries, graffiti art remains illegal in many forms except when using non-permanent paint. Since the 1990s with the rise of Street Art, a growing number of artists are switching to non-permanent paints and non-traditional forms of painting. Contemporary practitioners, accordingly, have varied and often conflicting practices. Some individuals, such as Alexander Brener, have used the medium to politicize other art forms, and have used the prison sentences enforced on them as a means of further protest. The practices of anonymous groups and individuals also vary widely, and practitioners by no means always agree with each other's practices. For example, the anti-capitalist art group the Space Hijackers did a piece in 2004 about the contradiction between the capitalistic elements of Banksy and his use of political imagery. Territorial graffiti marks urban neighborhoods with tags and logos to differentiate certain groups from others. These images are meant to show outsiders a stern look at whose turf is whose. The subject matter of gang-related graffiti consists of cryptic symbols and initials strictly fashioned with unique calligraphies. 
Gang members use graffiti to designate membership throughout the gang, to differentiate rivals and associates and, most commonly, to mark borders which are both territorial and ideological. Berlin human rights activist Irmela Mensah-Schramm has received global media attention and numerous awards for her 35-year campaign of effacing neo-Nazi and other right-wing extremist graffiti throughout Germany, often by altering hate speech in humorous ways. As advertising Graffiti has been used as a means of advertising both legally and illegally. Bronx-based TATS CRU has made a name for themselves doing legal advertising campaigns for companies such as Coca-Cola, McDonald's, Toyota, and MTV. In the UK, Covent Garden's Boxfresh used stencil images of a Zapatista revolutionary in the hopes that cross referencing would promote their store. Smirnoff hired artists to use reverse graffiti (the use of high pressure hoses to clean dirty surfaces to leave a clean image in the surrounding dirt) to increase awareness of their product. Offensive graffiti Graffiti may also be used as an offensive expression. This form of graffiti may be difficult to identify, as it is mostly removed by the local authority (as councils which have adopted strategies of criminalization also strive to remove graffiti quickly). Therefore, existing racist graffiti is mostly more subtle and, at first sight, not easily recognized as "racist". It can then be understood only if one knows the relevant "local code" (social, historical, political, temporal, and spatial), which is seen as heteroglot and thus a 'unique set of conditions' in a cultural context. A spatial code, for example, could be that there is a certain youth group in an area that is engaging heavily in racist activities. So, for residents (knowing the local code), graffiti containing only the name or abbreviation of this gang is already a racist expression, reminding the offended people of their gang activities.
Also, graffiti is in most cases the herald of more serious criminal activity to come. A person who does not know these gang activities would not be able to recognize the meaning of this graffiti. Also, if a tag of this youth group or gang is placed on a building occupied by asylum seekers, for example, its racist character is even stronger. By making the graffiti less explicit (as adapted to social and legal constraints), these drawings are less likely to be removed, but do not lose their threatening and offensive character. Elsewhere, activists in Russia have used painted caricatures of local officials with their mouths as potholes to show their anger about the poor state of the roads. In Manchester, England a graffitist painted obscene images around potholes, which often resulted in their being repaired within 48 hours.
In 1982, photographs from this project comprised a one-man exhibition titled "Les murs murmurent, ils crient, ils chantent..." (The walls whisper, shout and sing...) at the Centre Georges Pompidou in Paris. In Australia, art historians have judged some local graffiti of sufficient creative merit to rank them firmly within the arts. Oxford University Press's art history text Australian Painting 1788–2000 concludes with a long discussion of graffiti's key place within contemporary visual culture, including the work of several Australian practitioners. Between March and April 2009, 150 artists exhibited 300 pieces of graffiti at the Grand Palais in Paris. Environmental effects Spray paint has many negative environmental effects. The paint contains toxic chemicals, and the can uses volatile hydrocarbon gases to spray the paint onto a surface. Volatile organic compounds (VOCs) contribute to ground-level ozone formation, and most graffiti-related emissions are VOCs. A 2010 paper estimates that 4,862 tons of VOCs were released in the United States in activities related to graffiti. Government responses Asia In China, Mao Zedong in the 1920s used revolutionary slogans and paintings in public places to galvanise the country's communist revolution. Many people believe that China's attitude towards graffiti is harsh, but in fact, according to Lance Crayon in his film Spray Paint Beijing: Graffiti in the Capital
would all later be reincorporated in the Godzilla designs from The Return of Godzilla (1984) onward. The most consistent Godzilla design was maintained from Godzilla vs. Biollante (1989) to Godzilla vs. Destoroyah (1995), when the suit was given a cat-like face and double rows of teeth. Several suit actors had difficulties in performing as Godzilla due to the suits' weight, lack of ventilation and diminished visibility. Haruo Nakajima, who portrayed Godzilla from 1954 to 1972, said the materials used to make the 1954 suit (rubber, plastic, cotton, and latex) were hard to find after World War II. The suit weighed 100 kilograms after its completion and required two men to help Nakajima put it on. When he first put it on, he sweated so heavily that his shirt was soaked within seconds. Kenpachiro Satsuma in particular, who portrayed Godzilla from 1984 to 1995, described how the Godzilla suits he wore were even heavier and hotter than their predecessors because of the incorporation of animatronics. Satsuma himself suffered numerous medical issues during his tenure, including oxygen deprivation, near-drowning, concussions, electric shocks and lacerations to the legs from the suits' steel wire reinforcements wearing through the rubber padding. The ventilation problem was partially solved in the suit used in 1994's Godzilla vs. SpaceGodzilla, which was the first to include an air duct that allowed suit actors to last longer during performances. In The Return of Godzilla (1984), some scenes made use of a 16-foot-high robotic Godzilla (dubbed the "Cybot Godzilla") for use in close-up shots of the creature's head. The Cybot Godzilla consisted of a hydraulically powered mechanical endoskeleton covered in urethane skin containing 3,000 computer-operated parts which permitted it to tilt its head and move its lips and arms. In Godzilla (1998), special effects artist Patrick Tatopoulos was instructed to redesign Godzilla as an incredibly fast runner.
At one point, it was planned to use motion capture from a human to create the movements of the computer-generated Godzilla, but it was said to have ended up looking too much like a man in a suit. Tatopoulos subsequently reimagined the creature as a lean, digitigrade bipedal, iguana-like creature that stood with its back and tail parallel to the ground, rendered via CGI. Several scenes had the monster portrayed by stuntmen in suits. The suits were similar to those used in the Toho films, with the actors' heads being located in the monster's neck region and the facial movements controlled via animatronics. However, because of the creature's horizontal posture, the stuntmen had to wear metal leg extenders, which allowed them to stand off the ground with their feet bent forward. The film's special effects crew also built a scale animatronic Godzilla for close-up scenes, whose size outmatched that of Stan Winston's T. rex in Jurassic Park. Kurt Carley performed the suitmation sequences for the adult Godzilla. In Godzilla (2014), the character was portrayed entirely via CGI. Godzilla's design in the reboot was intended to stay true to that of the original series, though the film's special effects team strove to make the monster "more dynamic than a guy in a big rubber suit." To create a CG version of Godzilla, the Moving Picture Company (MPC) studied various animals such as bears, Komodo dragons, lizards, lions and wolves, which helped the visual effects artists visualize Godzilla's body structure, like that of its underlying bone, fat and muscle structure, as well as the thickness and texture of its scales. Motion capture was also used for some of Godzilla's movements. T. J. Storm provided the performance capture for Godzilla by wearing sensors in front of a green screen. Storm reprised the role of Godzilla in Godzilla: King of the Monsters, portraying the character through performance capture. 
In Shin Godzilla, a majority of the character was portrayed via CGI, with Mansai Nomura portraying Godzilla through motion capture. Appearances Cultural impact Godzilla is one of the most recognizable symbols of Japanese popular culture worldwide and remains an important facet of Japanese films, embodying the kaiju subset of the tokusatsu genre. Godzilla's vaguely humanoid appearance and strained, lumbering movements endeared it to Japanese audiences, who could relate to Godzilla as a sympathetic character, despite its wrathful nature. Audiences respond positively to the character because it acts out of rage and self-preservation and shows where science and technology can go wrong. In 1967, the Keukdong Entertainment Company of South Korea, with production assistance from Toei Company, produced Yongary, Monster from the Deep, about a reptilian monster who invades South Korea to consume oil. The film and character have often been branded as an imitation of Godzilla. Godzilla has been considered a filmographic metaphor for the United States, as well as an allegory of nuclear weapons in general. The earlier Godzilla films, especially the original, portrayed Godzilla as a frightening nuclear-spawned monster. Godzilla represented the fears that many Japanese held about the atomic bombings of Hiroshima and Nagasaki and the possibility of recurrence. As the series progressed, so did Godzilla, changing into a less destructive and more heroic character. Ghidorah (1964) was the turning point in Godzilla's transformation from villain to hero, by pitting him against a greater threat to humanity, King Ghidorah. Godzilla has since been viewed as an anti-hero. Roger Ebert cites Godzilla as a notable example of a villain-turned-hero, along with King Kong, Jaws (James Bond), the Terminator and John Rambo.
Godzilla is considered "the original radioactive superhero" due to his accidental radioactive origin story predating Spider-Man (1962 debut), though Godzilla did not become a hero until Ghidorah in 1964. By the 1970s, Godzilla came to be viewed as a superhero, with the magazine King of the Monsters in 1977 describing Godzilla as "Superhero of the '70s." Godzilla had surpassed Superman and Batman to become "the most universally popular superhero of 1977" according to Donald F. Glut. Godzilla was also voted the most popular movie monster in The Monster Times poll in 1973, beating Count Dracula, King Kong, the Wolf Man, the Mummy, the Creature from the Black Lagoon and the Frankenstein Monster. In 1996, Godzilla received the MTV Lifetime Achievement Award, as well as being given a star on the Hollywood Walk of Fame in 2004 to celebrate the premiere of the character's 50th anniversary film, Godzilla: Final Wars. Godzilla's pop-cultural impact has led to the creation of numerous parodies and tributes, as seen in media such as Bambi Meets Godzilla, which was ranked as one of the "50 greatest cartoons", two episodes of Mystery Science Theater 3000 and the song "Godzilla" by Blue Öyster Cult. Godzilla has also been used in advertisements, such as in a commercial for Nike, where Godzilla lost an oversized one-on-one game of basketball to a giant version of NBA player Charles Barkley. The commercial was subsequently adapted into a comic book illustrated by Jeff Butler. Godzilla has also appeared in a commercial for Snickers candy bars, which served as an indirect promo for the 2014 film. Godzilla's success inspired the creation of numerous other monster characters, such as Gamera, Reptilicus of Denmark, Yonggary of South Korea, Pulgasari of North Korea, Gorgo of the United Kingdom and the Cloverfield monster of the United States. Dakosaurus is an extinct sea crocodile of the Jurassic Period, which researchers informally nicknamed "Godzilla". 
Paleontologists have written tongue-in-cheek speculative articles about Godzilla's biology, with Ken Carpenter tentatively classifying it as a ceratosaur based on its skull shape, four-fingered hands and dorsal scutes, and paleontologist Darren Naish expressing skepticism, while commenting on Godzilla's unusual morphology. Godzilla's ubiquity in pop culture has led to the mistaken assumption that the character is in the public domain, resulting in litigation by Toho to protect their corporate asset from becoming a generic trademark. In April 2008, Subway depicted a giant monster in a commercial for their Five Dollar Footlongs sandwich promotion. Toho filed a lawsuit against Subway for using the character without permission, demanding $150,000 in compensation. In February 2011, Toho sued Honda for depicting a fire-breathing monster in a commercial for the Honda Odyssey. The monster was never mentioned by name, being seen briefly on a video screen inside the minivan. The Sea Shepherd Conservation Society christened a vessel the MV Gojira. Its purpose is to target and harass Japanese whalers in defense of whales in the Southern Ocean Whale Sanctuary. The MV Gojira was renamed the in May 2011, due to legal pressure from Toho. Gojira is the name of a French death metal band, formerly known as Godzilla; legal problems forced the band to change their name. In May 2015, Toho launched a lawsuit against Voltage Pictures over a planned picture starring Anne Hathaway. Promotional material released at the Cannes Film Festival used images of Godzilla. Steven Spielberg cited Godzilla as an inspiration for Jurassic Park (1993), specifically Godzilla, King of the Monsters! (1956), which he grew up watching. Spielberg described Godzilla as "the most masterful of all the dinosaur movies because it made you believe it was really happening." Godzilla also influenced the Spielberg film Jaws (1975).
Godzilla has also been cited as an inspiration by filmmakers Martin Scorsese and Tim Burton. The main-belt asteroid 101781 Gojira, discovered by American astronomer Roy Tucker at the Goodricke-Pigott Observatory in 1999, was named in honor of the creature. The official naming citation was published by the Minor Planet Center on 11 July 2018 (). Cultural ambassador In April 2015, the Shinjuku ward of Tokyo named Godzilla a special resident and official tourism ambassador to encourage tourism. During an unveiling of a giant Godzilla bust at Toho headquarters, Shinjuku mayor Kenichi
had been composed by Henry Mancini, Hans J. Salter, and even a track from Heinz Roemheld). These films include Creature from the Black Lagoon, Bend of the River, Untamed Frontier, The Golden Horde, Frankenstein Meets the Wolf Man, Man Made Monster, Thunder on the Hill, While the City Sleeps, Against All Flags, The Monster That Challenged the World, The Deerslayer and music from the TV series Wichita Town. Cues from these scores were used to almost completely replace the original Japanese score by Akira Ifukube and give the film a more Western sound. They also obtained stock footage from the film The Mysterians from RKO (the film's U.S. copyright holder at the time), which was used not only to represent the ICS but also during the film's climax. Stock footage of a massive earthquake from The Mysterians was employed to make the earthquake caused by Kong and Godzilla's plummet into the ocean much more violent than the tame tremor seen in the Japanese version. This added footage features massive tidal waves, flooded valleys, and the ground splitting open swallowing up various huts. Beck spent roughly $15,500 making his English version and sold the film to Universal-International for roughly $200,000 on April 29, 1963. The film opened in New York on June 26 of that year. Starting in 1963, Toho's international sales booklets began advertising an English dub of King Kong vs. Godzilla alongside Toho-commissioned, unedited international dubs of movies such as Giant Monster Varan and The Last War. By association, it is thought that this King Kong vs. Godzilla dub is an unedited English-language international version not known to have been released on home video. Release Theatrical In Japan, the film was released on August 11, 1962, where it played alongside Myself and I for a two-week period; afterward, it was extended by one more week and screened alongside the anime film Touring the World.
The film was re-released twice as part of the Toho Champion Festival, a film festival that ran from 1969 through 1978 and featured numerous films packaged together and aimed at children, first in 1970, and then again in 1977, to coincide with the Japanese release of the 1976 version of King Kong. After its theatrical re-releases, the film was screened two more times at specialty festivals. In 1979, to celebrate Godzilla's 25th anniversary, the film was reissued as part of a triple bill festival known as The Godzilla Movie Collection (Gojira Eiga Zenshu). It played alongside Invasion of Astro-Monster and Godzilla vs. Mechagodzilla. This release is known among fans for its exciting and dynamic movie poster featuring all the main kaiju from these three films engaged in battle. Then in 1983, the film was screened as part of The Godzilla Resurrection Festival (Gojira no Fukkatsu). This large festival featured 10 Godzilla/kaiju films in all (Godzilla, King Kong vs. Godzilla, Mothra vs. Godzilla, Ghidorah, the Three-Headed Monster, Invasion of Astro-Monster, Godzilla vs. Mechagodzilla, Rodan, Mothra, Atragon, and King Kong Escapes). In North America, King Kong vs. Godzilla premiered in New York City on June 26, 1963. The film was also released in many international markets. In Germany, it was known as Die Rückkehr des King Kong ("The Return of King Kong") and in Italy as Il trionfo di King Kong ("The triumph of King Kong"). In France, it was released in 1976. Home media The Japanese version of this film was released numerous times through the years by Toho on different home video formats. The film was first released on VHS in 1985 and again in 1991. It was released on LaserDisc in 1986 and 1991, and then again in 1992 in its truncated 74-minute form as part of a laserdisc box set called the Godzilla Toho Champion Matsuri. Toho then released the film on DVD in 2001.
They released it again in 2005 as part of the Godzilla Final Box DVD set, and again in 2010 as part of the Toho Tokusatsu DVD Collection. This release was volume #8 of the series and came packaged with a collectible magazine that featured stills, behind-the-scenes photos, interviews, and more. In the summer of 2014, the film was released for the first time on Blu-ray as part of the company releasing the entire series on the Blu-ray format for Godzilla's 60th anniversary. The 4K Ultra High Definition remaster of the film was released on Blu-ray in both a two-disc deluxe box set and a standard one-disc edition in May 2021. The American version was released on VHS by GoodTimes Entertainment (which acquired the license to some of Universal's film catalogue) in 1987, and then on DVD to commemorate the 35th anniversary of the film's U.S. release in 1998. Both of these releases were full frame. Universal Pictures released the English-language version of the film on DVD in widescreen as part of a two-pack bundle with King Kong Escapes in 2005, and then on its own as an individual release on September 15, 2009. They then re-released the film on Blu-ray on April 1, 2014, along with King Kong Escapes. This release sold $749,747 worth of Blu-rays. FYE released an exclusive Limited Edition Steelbook version of this Blu-ray on September 10, 2019. In 2019, the Japanese and American versions were included in a Blu-ray box set released by The Criterion Collection, which included all 15 films from the franchise's Shōwa era. Reception Box office In Japan, this film has the highest box office attendance figures of all of the Godzilla films to date. It sold 11.2 million tickets during its initial theatrical run, accumulating in distribution rental earnings. The film was the fourth highest-grossing film in Japan that year, behind The Great Wall (Shin no shikōtei), Sanjuro, and 47 Samurai and was Toho's second biggest moneymaker.
At an average 1962 Japanese ticket price, ticket sales were equivalent to estimated gross receipts of approximately (). Including re-releases, the film accumulated a lifetime figure of 12.6 million tickets sold in Japan, with distribution rental earnings of . The 1970 re-release sold 870,000 tickets, equivalent to estimated gross receipts of approximately (). The 1977 re-release sold 480,000 tickets, equivalent to estimated gross receipts of approximately (). This adds up to total estimated Japanese gross receipts of approximately (). In the United States, the film grossed $2.7 million, accumulating a profit (via rentals) of $1.25 million. In France, where it was released in 1976, the film sold 554,695 tickets, equivalent to estimated gross receipts of approximately ($1,667,650). This adds up to total estimated gross receipts of approximately worldwide. Preservation The original Japanese version of King Kong vs. Godzilla is infamous for being one of the most poorly preserved tokusatsu films. In 1970, director Ishiro Honda prepared an edited version of the film for the Toho Champion Festival, a children's matinee program that showcased edited re-releases of older kaiju films along with cartoons and then-new kaiju films. Honda cut 24 minutes from the film's original negative and, as a result, the highest quality source for the cut footage was lost. For years, all that was thought to remain of the uncut 1962 version was a faded, heavily damaged 16mm element from which rental prints had been made. 1980s restorations for home video integrated the 16mm deleted scenes into the 35mm Champion cut, resulting in wildly inconsistent picture quality. In 1991, Toho issued a restored laserdisc incorporating the rediscovery of a reel of 35mm trims of the deleted footage from the original negative.
The resultant quality was far superior to previous reconstructions, but not perfect; an abrupt cut caused by missing frames at the beginning or end of a trim is evident whenever the master switches between the Champion cut and a 35mm trim within the same shot. This laserdisc master was utilized for Toho's 2001 DVD release with few changes. In 2014, Toho released a new restoration of the film on Blu-ray, which utilized the 35mm edits once again, but only those for reels 2-7 of the film could be located. The remainder of video for the deleted portions was sourced from the earlier Blu-ray of the U.S. version, in addition to the previous 480i 1991 laserdisc master. On July 14, 2016, a 4K restoration of a completely 35mm-sourced version of the film aired on The Godzilla First Impact, a series of 4K broadcasts of Godzilla films on the Nihon Eiga Senmon Channel. Legacy Due to the great box office success of this film, Toho wanted to produce a sequel immediately. Shinichi Sekizawa was brought back to write the screenplay, tentatively called Continuation King Kong vs Godzilla. Sekizawa revealed that Kong had killed Godzilla during their underwater battle in Sagami Bay with a line of dialogue stating "Godzilla, who sank and died in the waters off Atami". As the story progresses, Godzilla's body is salvaged from the ocean by a group of entrepreneurs who hope to display the remains at a planned resort. Meanwhile, King Kong is found in Africa, where he has been protecting a baby (the sole survivor of a plane crash). After the baby is rescued by investigators and taken back to Japan, Kong follows the group and rampages through the country looking for the infant. Godzilla is then revived with hopes of driving off Kong. The story ends with both monsters plummeting into a volcano. The project was ultimately cancelled.
A couple of years later, Toho conceived the idea to pit Godzilla against a giant Frankenstein Monster and assigned Takeshi Kimura in 1964 to write a screenplay titled Frankenstein vs. Godzilla. However, Toho cancelled this project as well and instead decided to match Mothra against Godzilla in Mothra vs. Godzilla. This began a formula where kaiju from past Toho films would be added into the Godzilla franchise. Toho was interested in producing a series around their version of King Kong, but was refused by RKO. However, Toho would handle the character once more in 1967 to help Rankin/Bass co-produce their film King Kong Escapes, which was loosely based on a cartoon series Rankin/Bass had produced. Henry G. Saperstein was impressed with the giant octopus scene and requested a giant octopus to appear in Frankenstein Conquers the World and The War of the Gargantuas. The giant octopus appeared in an alternate ending for Frankenstein Conquers the World that was intended for overseas markets, but went unused. As a result, the octopus instead appeared in the opening of The War of the Gargantuas. The film's Godzilla suit was reused for certain scenes in Mothra vs. Godzilla. The film's Godzilla design also formed the basis for some early merchandise in the U.S. in the 1960s, such as a model kit by Aurora Plastics Corporation, and a board game by Ideal Toys. This game was released alongside a King Kong game in 1963 to coincide with the U.S. theatrical release of the film. The film's King Kong suit was recycled and altered for the second episode of Ultra Q and the water scenes for King Kong Escapes. Scenes of the film's giant octopus attack were recycled for the 23rd episode of Ultra Q. In 1992, to coincide with the company's 60th anniversary, Toho expressed interest in remaking the film as Godzilla vs. King Kong. However, producer Tomoyuki Tanaka stated that obtaining the rights to King Kong proved difficult. Toho then considered producing Godzilla vs.
Mechani-Kong, but effects director Koichi Kawakita confirmed that obtaining the likeness of King Kong also proved difficult. Mechani-Kong was replaced by Mechagodzilla, and the project was developed into Godzilla vs. Mechagodzilla II in 1993. During the production of Pirates of the Caribbean: Dead Man's Chest, animation director Hal Hickel instructed his team to watch King Kong vs. Godzilla, specifically the giant octopus scene, to use as a reference when animating the Kraken's tentacles. The film has been referenced in pop culture through various media. It was referenced in Da Lench Mob's 1992 single "Guerillas in tha Mist". It was spoofed in advertising for a Bembos burger commercial from Peru, for Ridsect Lizard Repellant, and for the board game Connect 4. It was paid homage to in comic books by DC Comics, Bongo Comics, and Disney Comics. It was spoofed in The Simpsons episode "Wedding for Disaster". In 2015, Legendary Entertainment announced plans for a King Kong vs Godzilla film of their own (unrelated to Toho's version), which was released on March 26, 2021. Dual ending myth For many years, a popular myth has persisted that in the Japanese version of this film, Godzilla emerges as the winner. The myth originated in the pages of Spacemen magazine, a 1960s sister magazine to the influential publication Famous Monsters of Filmland. In an article about the film, it was incorrectly stated that there were two endings and "If you see King Kong vs Godzilla in Japan, Hong Kong or some Oriental sector of the world, Godzilla wins!" The article was reprinted in various issues of Famous
really had a clear insight into television". Filming Special effects director Eiji Tsuburaya had been planning to work on other projects at this time, such as a new version of a fairy tale film script called Kaguyahime (Princess Kaguya), but he postponed those to work on this project with Toho instead, since he was such a huge fan of King Kong. He stated in an early 1960s interview with the Mainichi Newspaper, "But my movie company has produced a very interesting script that combined King Kong and Godzilla, so I couldn't help working on this instead of my other fantasy films. The script is special to me; it makes me emotional because it was King Kong that got me interested in the world of special photographic techniques when I saw it in 1933." Early drafts of the script were sent back with notes from the studio asking that the monster antics be made as "funny as possible". This comical approach was embraced by Tsuburaya, who wanted to appeal to children's sensibilities and broaden the genre's audience. Much of the monster battle was filmed to contain a great deal of humour, but the approach was not favoured by most of the effects crew, who "couldn't believe" some of the things Tsuburaya asked them to do, such as Kong and Godzilla volleying a giant boulder back and forth. With the exception of the next film, Mothra vs. Godzilla, this film began the trend of portraying Godzilla and the other monsters with more and more anthropomorphism as the series progressed, to appeal to younger children. Ishirō Honda was not a fan of this dumbing down of the monsters. Years later, Honda stated in an interview: "I don't think a monster should ever be a comical character. The public is more entertained when the great King Kong strikes fear into the hearts of the little characters." The decision was also taken to shoot the film in a 2.35:1 scope ratio (Tohoscope) and in color (Eastman Color), marking both monsters' first widescreen and color portrayals. 
Toho had planned to shoot this film on location in Sri Lanka, but had to forgo that (and scale back on production costs) because it ended up paying RKO roughly ¥80 million ($220,000) for the rights to the King Kong character. The bulk of the film was shot on the Japanese island of Izu Ōshima instead. Suit actors Shoichi Hirose (King Kong) and Haruo Nakajima (Godzilla) were given mostly free rein by Eiji Tsuburaya to choreograph their own moves. The men would rehearse for hours and based their moves on those of professional wrestling (a sport that was growing in popularity in Japan), in particular the moves of Toyonobori. During pre-production, Eiji Tsuburaya had toyed with the idea of using Willis O'Brien's stop-motion technique instead of the suitmation process used in the first two Godzilla films, but budgetary concerns prevented him from doing so, and the more cost-efficient suitmation was used instead. However, some brief stop motion was used in a couple of quick sequences. One of these sequences was animated by Koichi Takano, a member of Eiji Tsuburaya's crew. A brand-new Godzilla suit was designed for this film, with some slight alterations to its overall appearance. These alterations included the removal of its tiny ears, three toes on each foot rather than four, enlarged central dorsal fins, and a bulkier body. These new features gave Godzilla a more reptilian/dinosaurian appearance. Outside of the suit, a meter-high model and a small puppet were also built. Another puppet (from the waist up) was also designed with a nozzle in the mouth to spray out liquid mist simulating Godzilla's atomic breath. However, the shots in the film where this prop was employed (faraway shots of Godzilla breathing its atomic breath during its attack on the Arctic military base) were ultimately cut from the film. These cut scenes can be seen in the Japanese theatrical trailer. 
Finally, a separate prop of Godzilla's tail was built for close-up practical shots involving the tail (such as the scene where Godzilla trips Kong with it). The tail prop would be swung offscreen by a stagehand. Sadamasa Arikawa (who worked with Eiji Tsuburaya) said that the sculptors had a hard time coming up with a King Kong suit that appeased Tsuburaya. The first suit was rejected for being too fat, with long legs that gave Kong what the crew considered an almost cute look. A few other designs were tried before Tsuburaya approved the final look ultimately used in the film. The suit's body design was a team effort by brothers Koei Yagi and Kanji Yagi and was covered with expensive yak hair, which Eizo Kaimai hand-dyed brown. Because RKO instructed that the face must differ from the original's design, sculptor Teizo Toshimitsu based Kong's face on the Japanese macaque rather than a gorilla, and designed two separate masks. Two separate pairs of arms were also created: one pair consisted of extended arms operated by poles inside the suit to better give Kong a gorilla-like illusion, while the other pair was at normal arm's length and featured gloves, used for scenes that required Kong to grab items and wrestle with Godzilla. Suit actor Hirose had to be sewn into the suit in order to hide the zipper, which trapped him inside for long stretches and caused him considerable physical discomfort. In the scene where Kong drinks the berry juice and falls asleep, he was trapped in the suit for three hours. Hirose stated in an interview, "Sweat came pouring out like a flood and it got into my eyes too. When I came out, I was pale all over". Besides the suit with the two separate arm attachments, a meter-high model and a puppet of Kong (used for closeups) were also built, as well as a huge prop of Kong's hand for the scene where he grabs Mie Hama (Fumiko) and carries her off. 
For the attack of the giant octopus, four live octopuses were used. They were forced to move among the miniature huts by having hot air blown onto them. After the filming of that scene was finished, three of the four octopuses were released. The fourth became special effects director Eiji Tsuburaya's dinner. These sequences were filmed on a miniature set outdoors on the Miura Coast. Along with the live animals, two rubber octopus props were built, the larger of which was covered with plastic wrap to simulate mucus. Some stop-motion tentacles were also created for the scene where the octopus grabs a native and tosses him. These sequences were shot indoors at Toho Studios. Since King Kong was seen as the bigger draw, and since Godzilla was still a villain at this point in the series, the decision was made not only to give King Kong top billing but also to present him as the winner of the climactic fight. While the ending of the film does look somewhat ambiguous, Toho confirmed that King Kong was indeed the winner in their 1962–63 English-language film program Toho Films Vol. 8, which states in the film's plot synopsis: "A spectacular duel is arranged on the summit of Mt. Fuji and King Kong is victorious. But after he has won..." American version When John Beck sold the King Kong vs. Prometheus script to Toho (which became King Kong vs. Godzilla), he was given exclusive rights to produce a version of the film for release in non-Asian territories. He was able to line up two potential distributors, Warner Bros. and Universal-International, even before the film began production. Beck, accompanied by two Warner Bros. representatives, attended at least two private screenings of the film on the Toho Studios lot before it was released in Japan. John Beck enlisted the help of two Hollywood writers, Paul Mason and Bruce Howard, to write a new screenplay. 
After discussions with Beck, the two wrote the American version and worked with editor Peter Zinner to remove scenes, recut others, and change the sequence of several events. To give the film more of an American feel, Mason and Howard decided to insert new footage that would convey the impression that the film was actually a newscast. The television actor Michael Keith played newscaster Eric Carter, a United Nations reporter who spends much of the time commenting on the action from the U.N. Headquarters via an International Communications Satellite (ICS) broadcast. Harry Holcombe was cast as Dr. Arnold Johnson, the head of the Museum of Natural History in New York City, who tries to explain Godzilla's origin and his and Kong's motivations. The new footage, directed by Thomas Montgomery, was shot in three days. Beck and his crew obtained library music from a host of older films, including tracks composed by Henry Mancini, Hans J. Salter, and Heinz Roemheld. These films include Creature from the Black Lagoon, Bend of the River, Untamed Frontier, The Golden Horde, Frankenstein Meets the Wolf Man, Man Made Monster, Thunder on the Hill, While the City Sleeps, Against All Flags, The Monster That Challenged the World, and The Deerslayer, along with music from the TV series Wichita Town. Cues from these scores were used to almost completely replace Akira Ifukube's original Japanese score and give the film a more Western sound. They also obtained stock footage from the film The Mysterians from RKO (the film's U.S. copyright holder at the time), which was used not only to represent the ICS but also during the film's climax. Stock footage of a massive earthquake from The Mysterians was employed to make the earthquake caused by Kong and Godzilla's plummet into the ocean look much more violent than the tame tremor seen in the Japanese version. 
This added footage features massive tidal waves, flooded valleys, and the ground splitting open, swallowing up various huts. Beck spent roughly $15,500 making his English version and sold the film to Universal-International for roughly $200,000 on April 29, 1963. The film opened in New York on June 26 of that year. Starting in 1963, Toho's international sales booklets began advertising an English dub of King Kong vs. Godzilla alongside Toho-commissioned, unedited international dubs of movies such as Giant Monster Varan and The Last War. By association, this King Kong vs. Godzilla dub is thought to be an unedited English-language international version, which is not known to have been released on home video. Release Theatrical In Japan, the film was released on August 11, 1962, where it played alongside Myself and I for a two-week period; afterward, it was extended by one more week and screened alongside the anime film Touring the World. The film was re-released twice as part of the Toho Champion Festival, a film festival that ran from 1969 through 1978 and featured numerous films packaged together and aimed at children: first in 1970, and then again in 1977 to coincide with the Japanese release of the 1976 version of King Kong. After its theatrical re-releases, the film was screened two more times at specialty festivals. In 1979, to celebrate Godzilla's 25th anniversary, the film was reissued as part of a triple-bill festival known as The Godzilla Movie Collection (Gojira Eiga Zenshu). It played alongside Invasion of Astro-Monster and Godzilla vs. Mechagodzilla. This release is known among fans for its exciting and dynamic movie poster featuring all the main kaiju from these three films engaged in battle. Then in 1983, the film was screened as part of The Godzilla Resurrection Festival (Gojira no Fukkatsu). This large festival featured 10 Godzilla/kaiju films in all (Godzilla, King Kong vs. Godzilla, Mothra vs. 
Godzilla, Ghidorah, the Three-Headed Monster, Invasion of Astro-Monster, Godzilla vs. Mechagodzilla, Rodan, Mothra, Atragon, and King Kong Escapes). In North America, King Kong vs. Godzilla premiered in New York City on June 26, 1963. The film was also released in many international markets. In Germany, it was known as Die Rückkehr
King Kong rather than Godzilla. The film's working title was Operation Robinson Crusoe: King Kong vs. Ebirah, and the project was rejected by Rankin/Bass Productions before being accepted by Toho, after which King Kong's role in the film was replaced by Godzilla. Even though Eiji Tsuburaya was given directorial credit for the special effects, Sadamasa Arikawa actually directed them under the supervision of Tsuburaya, who had his own company, Tsuburaya Productions, at the time. Toho had decided to set the film on an island to cut back on special effects costs. Arikawa has cited the film as a frustrating experience, stating, "There were major limitations on the budget from the studio. Toho couldn't have made too many demands about the budget if Mr. Tsuburaya had been in charge. The studio knew I was also doing TV work then, so they must have figured I could produce the movie cheaply." Special effects The underwater sequences were filmed on an indoor soundstage, where the Godzilla and Ebirah suits were shot through the glass of a water-filled aquarium, with some scenes of the Godzilla suit shot separately underwater as well. Haruo Nakajima (the suit performer for Godzilla) wore a wet suit under the Godzilla suit for every scene that required him to be in the water; the water scenes took a week to complete. Nakajima stated, "I worked overtime until about eight o'clock every day. Even though I wore a wet suit under the costume, I got cold. But I never got sick, because I was so tense during the filming." Filming This is the first of two Godzilla films in which a Pacific island is the primary setting, rather than a location inside Japan. The
second and final one is Son of Godzilla (1967). Release Ebirah, Horror of the Deep was released theatrically in Japan on December 17, 1966, where it was distributed by Toho. The American version of the film was released directly to television by Continental Distributing in 1968 under the title Godzilla versus the Sea Monster. The film may have received theatrical distribution in the United States as a Walter Reade, Jr. Presentation, but this has not been confirmed. Home media The film was released on DVD on February 8, 2005 by Sony Pictures Home Entertainment. The film was released on Blu-ray on May 6, 2014 by Kraken Releasing. In 2019, the Japanese version was included in a Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era.
Ebirah, Horror of the Deep. This included Jun Fukuda (director), Sadamasa Arikawa (special effects), and Masaru Sato (composer). This was the first film on which Arikawa was officially listed as the director of special effects, although he did receive some supervision from Tsuburaya when he was available. Toho wanted to create a baby Godzilla to appeal to the "date crowd" (a genre of films very popular among young couples during this period), with the idea that girls would like a "cute" baby monster. Of the idea behind Minilla, Fukuda stated, "We wanted to take a new approach, so we gave Godzilla a child. We thought it would be a little strange if we gave Godzilla a daughter, so instead we gave him a son". Fukuda also wanted to portray the monsters almost as people with regard to the father-son relationship between Godzilla and Minilla, stating, "We focused on the relationship between Godzilla and his son throughout the course of Son of Godzilla." Minilla was designed to incorporate features of not only a baby Godzilla but a human baby as well. "Marchan the Dwarf" was hired to play the character due to his ability to play-act and to give the character a childlike ambiance. He was also hired because of his ability to perform athletic rolls and flips inside the thick rubber suit. The Godzilla suit built for this film was the biggest yet in terms of size and girth. This was done in order to give Godzilla a "maternal" appearance and a parent-like stature next to Minilla. Because of the size of the suit, seasoned Godzilla suit actor Haruo Nakajima was hired to play Godzilla in only two scenes, as the suit was much too big for him to wear; the smaller suit he had worn for Ebirah, Horror of the Deep and Invasion of Astro-Monster was used for these sequences. The much larger Seji Onaka instead played Godzilla in the film, although he was replaced midway through filming by Hiroshi Sekita after Onaka broke his fingers. 
Outside of the two monster suits, various marionettes and puppets were used to portray the island's gigantic inhabitants: the giant praying mantises known as Kamacuras and the huge spider Kumonga. Arikawa would usually have 20 puppeteers at a time working on the various marionettes. The massive Kumonga puppet needed two to three people to operate each leg. Filming took place in Guam and areas in Japan including Gotemba, Lake Yamana, the Fuji Five Lakes region, and Oshima. A sequence that shows Godzilla leaving Minilla behind on the freezing Sollgel Island and making it to shore before turning back was cut from the final film's ending. A portion of this sequence has been preserved in both the trailer and an outtake reel included with the Godzilla Final Box DVD collection as supplemental material. Release Theatrical Son of Godzilla
was distributed theatrically in Japan by Toho on December 16, 1967. The film was released theatrically in the United Kingdom in August 1969, as a double feature with Ebirah, Horror of the Deep. Son of Godzilla was never released theatrically in the United States, instead being released directly to television by Walter Reade Sterling, as well as by American International Pictures (AIP-TV) in some markets, in 1969. The American television version was cut to 84 minutes. Home media In 2005, the film was released on DVD by Sony Pictures in its original uncut length with the original Japanese audio and Toho's international English dub. In 2019, the Japanese version and the export English version were included in a Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era. Reception In a contemporary review, the Monthly Film Bulletin declared the film to be "out of the top drawer of the Toho Company's monster file, with the
Cast Production Amid the waning popularity of the Godzilla series, special effects director Sadamasa Arikawa noted that Toho was potentially going to end the series, as "Producer Tanaka figured that all the ideas had just run out." The film was written by Takeshi Kimura and Ishirō Honda, making it the first Godzilla film since Godzilla Raids Again not written by Shinichi Sekizawa. Takeshi Kimura is credited under the pen name Kaoru Mabuchi in the film's credits. Kimura and Honda's script developed the concept of Monsterland (referred to as Monster Island in future films). The earliest screenplay was titled Kaiju Chushingura (the word chushingura refers to a famous historical story in Japan about the rebellion of 47 samurai who took revenge after their master was unjustly forced to commit suicide). Written in 1967 by Kimura, this version of the film was to include "all of the monsters", according to Ishiro Honda in an interview. The story called for Godzilla, Minilla, Anguirus, Rodan, Mothra, Gorosaurus, Manda, Baragon, Kumonga, Varan, Magma, Kamacuras, Gaira, Sanda, and King Kong to appear in the film. When it was decided to adapt Two Godzillas!: Japan SOS (an earlier version of Son of Godzilla) instead, the script was shelved until the following year, by which time the rights to Kong had expired. Ishiro Honda also wanted to show lunar colonies and brand-new hybrid monsters, the results of interbreeding and genetic splicing. 
He also wanted to delve more deeply into undersea farming to feed the monsters, but budget constraints prevented him from showing all of this. In later scripts, the number of monsters was cut as well. Since the film features several monsters who return repeatedly in the series, the setting was developed as a faraway island where the monsters are pacified. This tied films not otherwise related to the Godzilla series into its universe, as creatures such as Manda (from Atragon) and Varan (from Varan the Unbelievable) appear. The film features footage from Ghidorah, the Three-Headed Monster (1964), specifically King Ghidorah's fiery birth scene. New monster suits for Godzilla and Anguirus were constructed for the film, while the Rodan, Kumonga, Minilla, Gorosaurus, Manda, Baragon, Mothra, and King Ghidorah suits were modified from previous films, with King Ghidorah having less detail than in previous films. Release Destroy All Monsters was released in Japan on 1 August 1968, where it was distributed by Toho. It was released on a double bill with a reissue of the film Atragon. The film was reissued theatrically in Japan in 1972, re-edited by Honda to a 74-minute running time and released under the title Gojira: Dengeki Taisakusen (Godzilla: Lightning Fast Strategy). Destroy All Monsters continued the decline in Japanese ticket sales for the Godzilla series, selling 2.6 million tickets. In comparison, Invasion of Astro-Monster brought in 3.8 million and Son of Godzilla collected 2.5 million. The film was released in the United States by American International Pictures with an English-language dub on 23 May 1969, premiering in Cincinnati. American International Pictures hired Titra Studios to dub the film into English. The American version of the film remains relatively close to the Japanese original. 
Among the more notable removed elements are Akira Ifukube's title theme and a brief shot of Minilla shielding his eyes and ducking when King Ghidorah drops Anguirus from the sky. Destroy All Monsters was shown on American television until the early 1980s. It resurfaced on cable broadcast on the Sci-Fi Channel in 1996. Home media Destroy All Monsters was released on VHS by ADV Films in 1998, featuring English-dubbed dialogue from Toho's own international version of the film. In 2011, Tokyo Shock released the film on DVD and Blu-ray, and in 2014 the company re-released it in both formats. In 2019, the Japanese version and the export English version were included in a Blu-ray box set released by the Criterion Collection, which included all 15 films from the franchise's Shōwa era. Critical reception From contemporary reviews, both Variety and Monthly
May 23, 1969. Contemporary American reviews were mixed, with praise mainly held for the climactic monster battle. Retrospectively, the film has received more praise, and is considered a favorite among Godzilla fans for its "audacious and simple story", "innovative action sequences", and a "memorably booming" score by Akira Ifukube. Plot At the close of the 20th century (1999 in the dub), all of the Earth's kaiju have been collected by the United Nations Science Committee and confined in an area known as Monster Island, located in the Ogasawara island chain. A special control center is constructed underneath the island to ensure that the monsters stay secure and to serve as a research facility to study them. When communications with Monster Island are suddenly and mysteriously severed, and all of the monsters begin attacking world capitals, Dr. Yoshida of the UNSC orders Captain Yamabe and the crew of his spaceship, Moonlight SY-3, to investigate Ogasawara. There, they discover that the scientists, led by Dr. Otani, have become mind-controlled slaves of a feminine alien race identifying themselves as the Kilaaks, who reveal that they are in control of the monsters. Their leader demands that the human race surrender, or face total annihilation. Godzilla attacks New York City, Rodan invades Moscow, Mothra lays waste to Beijing, Gorosaurus destroys Paris (although Baragon was credited for its destruction), and Manda attacks London. These attacks were set in motion to draw attention away from Japan, so that the aliens could establish an underground stronghold near Mount Fuji. The Kilaaks then turn their next major attack onto Tokyo and, without serious opposition, become arrogant in their aims until the UNSC discovers, after recovering the Kilaaks' monster mind-control devices from around the world, that they have switched to broadcasting the control signals from their base under the Moon's surface. 
In a desperate battle, the crew of the SY-3 destroys the Kilaaks' lunar outpost and returns the alien control system to Earth. With all of the monsters under the control of the UNSC, the Kilaaks call King Ghidorah. The three-headed space dragon is dispatched to protect the alien stronghold at Mount Fuji, and battles Godzilla, Minilla, Mothra, Rodan, Gorosaurus, Anguirus, and Kumonga. While seemingly invincible, King Ghidorah is eventually overpowered by the combined strength of the Earth monsters and is killed. Refusing to admit defeat, the Kilaaks produce their ace, a burning monster they call the Fire Dragon, which begins to torch Tokyo and destroys the control center on Ogasawara. Suddenly, Godzilla attacks and destroys the Kilaaks' underground base, revealing that the Earth's monsters instinctively know who their enemies are. Captain Yamabe then pursues the Fire Dragon in the SY-3 and narrowly achieves victory for the human race. The Fire Dragon is revealed to be a flaming Kilaak saucer and is destroyed. With the Kilaaks defeated, Godzilla and the other monsters eventually return to Monster Island to live in peace.
less pointed, and the buzzsaw didn't move, since it was made of static pieces. This suit also has different-sized back fins, a more circular visor, scales running up the back and sides of the neck, and longer legs compared to the original version. Teruyoshi Nakano recalled how rushed the film was, taking only three weeks to shoot: "It went into production without enough preparation. There was no time to ask Mr. Sekizawa to write the script, so Mr. Sekizawa kind of thought up the general story and director Fukuda wrote the screenplay. The screenplay was completed right before crank-in". Filming Like previous Godzilla films, Godzilla vs. Megalon heavily employs stock footage from previous films such as Mothra vs. Godzilla (1964), The War of the Gargantuas (1966), Ebirah, Horror of the Deep (1966), Destroy All Monsters (1968), Godzilla vs. Hedorah (1971), and Godzilla vs. Gigan (1972). English versions In 1976, Cinema Shares gave Godzilla vs. Megalon a wide theatrical release in the United States and launched a massive marketing campaign for the film; along with the poster, buttons each bearing one of the four monsters' faces were released. A comic that told a simplified version of the film was given away at theatrical showings; it incorrectly named Jet Jaguar as "Robotman" and Gigan as "Borodan". These incorrect names were also featured in the U.S. trailer. Initially, Cinema Shares screened Toho's international English version, but to ensure a G rating, several cuts were made, which resulted in the film running three minutes shorter than the original version. Godzilla vs. Megalon is the first Godzilla film to receive an American prime-time network television premiere, broadcast nationwide at 9:00 PM on NBC on March 15, 1977. However, to accommodate commercials, the film was only given a one-hour time slot, which resulted in it being cut down to 48 minutes. 
John Belushi hosted the broadcast, performing some skits in a Godzilla suit. Mel Maron (who was president of Cinema Shares at the time) chose to release Godzilla vs. Megalon because he saw Godzilla as a heroic figure by that point and felt the timing was right to show children a hero who was a friendly monster and not Superman. The U.S. rights for the film eventually fell into the public domain in the late 80s, which resulted in companies releasing poorly-cropped, fullscreen VHS tapes mastered from pan and scan sources. This also led to the film being featured in Mystery Science Theater 3000. In 1988, New World Video intended to release the original uncut version of the English dub, but declined the project for lack of the budget required for a full release. Despite this, the film was released uncut and in widescreen in 1992 by the UK company Polygram Ltd as a double feature with Godzilla vs. Gigan. In 1998 the film was again released by the UK company 4 Front Video. These appear to be the only two VHS releases of the film that are unedited and of high quality. It was also released on DVD by Power Multimedia in 1999 in Taiwan. Originally the Sci-Fi Channel (now SyFy) showed the cut version until 2002, when Toho regained ownership of the title, alongside Godzilla vs. Gigan and Godzilla vs. Mechagodzilla (both of which were also released by Cinema Shares), and the film was broadcast fully uncut for the first time in the U.S. Release Box office In Japan, Godzilla vs. Megalon sold approximately 980,000 tickets. It was the first Godzilla film to sell less than one million admissions. It earned ¥220 million in Japan distribution income (rentals). The film was a success in American theaters, earning $383,744 in its first three days in Texas and Louisiana alone. The film grossed about worldwide. Critical reception Godzilla vs.
Megalon was released theatrically in America on May 9, 1976, though the San Francisco Chronicle indicates that it opened there in June, and The New York Times indicates that it opened in New York City on July 11. The New York Times film critic Vincent Canby, who a decade before had given a negative review to Ghidorah, the Three-Headed Monster, gave Godzilla vs. Megalon a generally positive review. In his review on July 12, 1976, Canby said, "Godzilla vs. Megalon completes the canonization of Godzilla...It's been a remarkable transformation of character - the dragon has become St. George...It's wildly preposterous, imaginative and funny (often intentionally). It demonstrates the rewards of friendship, between humans as well as monsters, and it is gentle." Godzilla vs. Megalon has attracted the ire of many Godzilla fans in the decades since its original release. The film contributed to the reputation of Godzilla films in the United States as cheap children's entertainment that should not be taken seriously. It has been described as "incredibly, undeniably, mind-numbingly bad" and one of the "poorer moments" in the history of kaiju films. Author Stephen Mark Rainey published a strongly negative critique of the film in issue four of Japanese Giants (1977), edited and published by Bradford G. Boyle. In particular, the special effects of the film have been heavily criticized. One review described the Godzilla costume as appearing to be "crossed with Kermit the Frog" and another sneeringly compared it to Godzilla vs. Gigan, stating that it did "everything wrong that Gigan did, and then some." However, most of the criticism is of the lack of actual special effects work, as most of it consists of stock footage from previous films, including Godzilla vs. Gigan and Ghidorah, the Three-Headed Monster, but a few pieces of effects work have garnered praise, specifically a scene where Megalon breaks through a dam and the draining of the lake.
The other aspects of the film have been similarly skewered. The acting is usually described as flat and generally poor, failing to improve, and sometimes worsening, the already weak script. One part of the film, on the other hand, has garnered almost universal praise: Godzilla's final attack on Megalon, a flying kick. It has been called the saving grace of the film, and was made famous by the mock exclamations of shock and awe during Godzilla vs. Megalon's appearance on Mystery Science Theater 3000. From the end of season three through the middle of season five, that clip was shown at the opening of each show. Despite all this, the film is also one of the most widely seen Godzilla films in the United States — it was popular in its initial theatrical release, largely due to an aggressive marketing campaign, including elaborate posters of the two title monsters battling atop New York City's World Trade Center towers, presumably to capitalize on the hype surrounding the Dino De Laurentiis remake of King Kong, which used a similar image for its own poster.
monster. Eventually, the trio of heroes manage to escape their situation with the Seatopians and reunite to devise a plan to send Jet Jaguar to get Godzilla's help using Jet Jaguar's secondary control system. After uniting with Japan's Defense Force, Goro manages to regain control of Jet Jaguar and sends the robot to Monster Island to bring Godzilla to fight Megalon. Without a guide to control its actions, Megalon flails around relentlessly and aimlessly, fighting with the Defense Force and destroying the outskirts of Tokyo. The Seatopians learn of Jet Jaguar's turn and send a distress call to their allies, the Space Hunter Nebula M aliens (from the previous film), asking them to send the alien monster Gigan to assist. As Godzilla journeys to fight Megalon, Jet Jaguar starts acting on its own and ignoring commands to the surprise of its inventors, and grows to gigantic proportions to face Megalon until Godzilla arrives. The battle between robot and cyborg is roughly at a standstill until Gigan arrives, and Megalon and Gigan double-team Jet Jaguar. Godzilla finally arrives to assist Jet Jaguar and the odds become even. After a long and brutal fight, Gigan and Megalon both retreat, and Godzilla and Jet Jaguar shake hands on a job well done. Jet Jaguar bids Godzilla farewell and Godzilla returns to its home on Monster Island. Jet Jaguar turns back to its human size, and returns home with Goro and Rokuro. Cast Production Development The origins of Megalon can be traced back to 1969's All Monsters Attack, as the original working idea for the film's antagonist Gabara was initially envisioned as a giant mole cricket called Gebara. The character was later reworked into Kaoru Mabuchi's 1971 treatment for Godzilla vs. the Space Monsters: Earth Defense Directive, a precursor to 1972's Godzilla vs. Gigan.
The proposal called for Megalon to be paired with Gigan and King Ghidorah under the command of the hostile alien invader Miko, only to be defeated and driven off by the combined might of Godzilla, Anguirus, and a brand new monster called Majin Tuol. The next draft of the script, titled The Return of King Ghidorah!, retained the core villain cast of Gigan, King Ghidorah, and Megalon, but replaced Anguirus and Majin Tuol with Varan and Rodan. However, most of the proposed monsters were cut, leading to the final version of Godzilla vs. Gigan. Contrary to popular belief, there is no evidence Godzilla vs. Megalon was originally planned as a Jet Jaguar solo film, and no Japanese sources have surfaced which claim otherwise. Rather, the creation of Jet Jaguar was the result of a contest Toho held for children in mid-to-late 1972. The winner of the contest was an elementary school student, who submitted a drawing of a robot called Red Arone. Red Arone was turned into a monster suit, but when the child was shown the suit, he became upset because it did not resemble his original design: his drawing was white, but the costume was colored red, blue and yellow. Red Arone was used for publicity, but Toho renamed the character Jet Jaguar and had special effects director Teruyoshi Nakano redesign it, keeping only the colors from the Red Arone suit, which had a different head and wings. According to Teruyoshi Nakano, Godzilla vs. Megalon was a replacement project for another film that was cancelled at the last minute, and evidence suggests this cancelled film was Godzilla vs. Red Moon, slated for 1973. As a result, the project was postponed during pre-production. Screenwriter Shinichi Sekizawa had no time to write out a full script, and instead thought out a general story. Director Jun Fukuda ultimately ended up writing the screenplay. To make up for lost production time, the film was shot in a hasty three weeks.
The production time totaled nearly six months from planning to finish. The film had three early treatments, each written by Shinichi Sekizawa. The first, titled Godzilla vs. The Megalon Brothers: The Undersea Kingdom's Annihilation Strategy, was completed in September 1972. The second, titled Insect Monster Megalon vs. Godzilla: Undersea Kingdom's Annihilation Strategy, was turned in on September 5, 1972, and the third draft was submitted on September 7, 1972. Creature design According to Teruyoshi Nakano, the Godzilla suit used in this film (nicknamed "MegaroGoji", メガロゴジ)
Miki Saegusa. He openly admitted that directing a Godzilla film was secondary to his desire to make a James Bond movie, and thus added elements of the spy film genre into the plot. Unlike the case with later, more committee-driven Godzilla films, Ōmori was given considerable leeway in writing and directing the film, which Toho staff later judged to have been an error resulting in a movie with a very narrow audience. Special effects Koichi Kawakita, who had previously worked for Tsuburaya Productions, replaced Teruyoshi Nakano as head of the series' special effects unit after Toho became impressed at his work in Gunhed. Kawakita made use of Gunhed's special effects team Studio OX, and initially wanted to make Godzilla more animal-like, using crocodiles as references, but was berated by Tanaka, who declared Godzilla to be "a monster" rather than an animal. Kenpachiro Satsuma returned to portray Godzilla, hoping to improve his performance by making it less anthropomorphic than in previous films. Suitmaker Noboyuki Yasamaru created a Godzilla suit made specifically with Satsuma's measurements in mind, unlike the previous one which was initially built for another performer and caused Satsuma discomfort. The resulting 242 lb suit proved more comfortable than the last, having a lower center of gravity and more mobile legs. A second 176 lb suit was built for outdoor underwater scenes. The head's size was reduced, and the whites around the eyes removed. On the advice of story finalist Shinichiro Kobayashi, a double row of teeth was incorporated in the jaws. As with the previous film, animatronic models were used for close-up shots. These models were an improvement over the last, as they were made from the same molds used for the main costume, and included an articulated tongue and intricate eye motion. 
The suit's dorsal plates were filled with light bulbs for scenes in which Godzilla uses his atomic ray, thus lessening reliance on optical animation, though they electrocuted Satsuma the first time they were activated. Satsuma was also obliged to wear protective goggles when in the suit during scenes in which Godzilla battles the JSDF, as real explosives were used on set. The film was mainly shot at the Toho lot, although some filming occurred on location at the East Fuji Maneuver Area. Designing and building the Biollante props proved problematic, as traditional suitmation techniques made realizing the requested design of the creature's first form difficult, and the resulting cumbersome model for Biollante's final form was met with disbelief from the special effects team. Biollante's first form was performed by Masao Takegami, who sat within the model's trunk area on a platform just above water level. While the creature's head movements were simple to operate, its vines were controlled by an intricate array of overhead wires which proved difficult for Satsuma to react to during combat scenes, as they offered no tension, requiring Satsuma to feign receiving blows from them despite not being able to perceive them. Biollante's final form was even more difficult to operate, as its vine network took hours to rig up on set. Visibility in both the Godzilla and final form Biollante suits was poor, causing difficulties for Takegami in aiming the creature's head when firing sap, which permanently stained anything it landed on. While it was initially decided to incorporate stop motion animation into the film, the resulting sequences were scrapped, as Kawakita felt they failed to blend in with the live-action footage effectively. The film however became the first of its kind to use CGI, though its usage was limited to scenes involving computer generated schematics.
The original cut of the movie had the first battle culminating in Biollante's spores falling around the hills surrounding Lake Ashino and blooming into fields of flowers, though this was removed as the flowers were out of scale. Music Unlike the previous film, Godzilla vs. Biollante incorporates themes from Akira Ifukube's original Godzilla theme, though the majority of the soundtrack was composed of original themes by Koichi Sugiyama. The score was conducted by David Howell with the Kansai Philharmonic, though Howell himself had never viewed the movie, and thus was left to interpret what the scenes would consist of when conducting the orchestra. English version After the film was released in Japan, Toho commissioned a Hong Kong company named Omni Productions to dub the film into English. In early 1990, Toho entered discussions with Miramax to distribute the film. When talks broke off, Toho filed a lawsuit in Los Angeles Federal Court, accusing Miramax of entering an oral agreement in June to pay Toho $500,000 to distribute the film. This lawsuit delayed the film's release for two years. An out of court settlement was reached, with Miramax buying the rights to the film for an unreported figure. Miramax entertained thoughts of releasing the film in theaters, but in the end decided to release the film straight to home video instead. HBO released the film on VHS in 1992 and Laserdisc in 1993. Miramax utilized the uncut English international version of the film for this release. Release Home media Godzilla vs. Biollante was released on VHS by HBO Home Video on November 25, 1992. It was later relicensed by Miramax and released on Blu-ray and DVD by Echo Bridge on December 4, 2012. It was released as a double feature and 8-disc movie pack on both Blu-ray and DVD with Mega Shark Versus Giant Octopus (2009) by Echo Bridge Home Entertainment in 2013. It was last released by Lionsgate on Blu-ray and DVD on October 7, 2014.
It is likely that Miramax's rights have since reverted to Toho, as this release has gone out of print. Reception Box office In Japan, the film sold approximately 2 million tickets, grossing . Critical reaction Godzilla vs. Biollante has received positive reviews, with praise for the story, music and visuals. Ed Godziszewski of Monster Zero said the film is "by no means a classic" but felt that "for the first time in well over 20 years, a [Godzilla] script is presented with some fresh, original ideas and themes." Joseph Savitski of Beyond Hollywood said the film's music is "a major detraction", but added that it's "not only one of the most
the film had been of little financial benefit to Toho, and the failure of King Kong Lives the following year convinced him that audiences were not ready for a continuation of the Godzilla series. He relented after the success of Little Shop of Horrors, and proceeded to hold a public story-writing contest for a possible script. In consideration of The Return of Godzilla's marginal success in Japan, Tanaka insisted that the story focus on a classic monster vs. monster theme. Tanaka handed the five finalist entries to director Kazuki Ōmori, despite the two's initially hostile relationship; the latter had previously held Tanaka responsible for the decline in the Godzilla series' quality during the 1970s. Ōmori chose the entry of dentist Shinichiro Kobayashi, who wrote his story with the hypothetical death of his daughter in mind. Kobayashi's submission was notable for its emphasis on dilemmas concerning biotechnology rather than nuclear energy, and revolved around a scientist grieving for his deceased daughter and attempting to keep her soul alive by merging her genes with those of a plant. The scientist's initial experiments would have resulted in the creation of a giant rat-like amphibian called Deutalios, which would have landed in Tokyo Bay and been killed by Godzilla. A female reporter investigating the scientist's activities would have suffered from psychic visions of plants with humanoid faces compelling her to infiltrate the scientist's laboratory. The scientist would have later confessed his intentions, and the finale would have had Godzilla battling a human-faced Biollante who defeats him by searing his flesh with acid. Ōmori proceeded to modify the story into a workable script over a period of three years, using his background as a biologist to create a plausible plot involving genetic engineering and botany.
In order to preserve the series' anti-nuclear message, he linked the creation of Biollante to the use of Godzilla cells, and replaced Kobayashi's journalist character with Miki Saegusa.
also released the film to television in late 1978, this time under Toho's international title, Terror of Mechagodzilla. Unlike The Terror of Godzilla, the television version remained mostly uncut, with only the shot of Katsura's naked breasts excised. Saperstein's editors also added a 10-minute prologue that served as a brief history of Godzilla, with footage from Saperstein's English versions of Invasion of Astro-Monster and All Monsters Attack (the latter of which utilized stock footage from both Ebirah, Horror of the Deep and Son of Godzilla). In the mid-1980s, the U.S. television version, Terror of Mechagodzilla, was replaced by the theatrical edit, The Terror of Godzilla, on television and home video. For some reason, the title was also changed to Terror of Mechagodzilla. The 1994 Paramount release of Terror of Mechagodzilla listed a running time of 89 minutes on the slipcase, implying that this release would be the longer version first shown on American TV. The actual video cassette featured the edited theatrical version. In a 1995 interview with G-Fan magazine, Saperstein was surprised to hear about this mistake. In 1997 on Channel 4 in the U.K., three Godzilla movies were shown back to back late at night, starting with Godzilla vs. Megalon, Godzilla vs. Gigan and then Terror of Mechagodzilla; all were dubbed versions. This showing was uncut, including the Katsura nudity scene, but it did not have the Western-made prologue. In the mid-2000s, the television version showed up again on Monsters HD, and in 2007, it made its home video debut as the U.S. version on the Classic Media DVD. Although the added prologue was originally framed for fullscreen television, it was cropped and shown in widescreen on the disc. The rest of the movie featured the audio from Saperstein's television version synced to video from the Japanese version. The first article about the movie's storyline was published in Japanese Giants #4 in 1977, edited and published by Bradford G. 
Boyle, and was written by Richard H. Campbell, creator of The Godzilla Fan News Letter (a.k.a. "The Gang"). Box office In Japan, the film sold 980,000 tickets. Despite earning positive reviews, it would be the least-attended Godzilla film in Japan and also one of only two Godzilla films to sell less than 1 million tickets. This was part of a decline in attendance for monster movies as a whole and Toho put the production of monster movies on hold. Toho had no intention of permanently ending the Godzilla series. Throughout the remainder of the 1970s, several new Godzilla stories were submitted by various writers and producers. None of these films, however, were ultimately made. It was not until 1984 and Godzilla'''s 30th anniversary that Toho would
Godzilla, Mechagodzilla 2, and Titanosaurus, respectively. The film was released theatrically in Japan on March 15, 1975. It received a limited release in the United States in 1978 by Bob Conn Enterprises under the title The Terror of Godzilla. The film remains the least financially successful entry in the Godzilla franchise to this day. Plot Following the events of Godzilla vs. Mechagodzilla, Interpol agents search for Mechagodzilla's remains at the bottom of the Okinawan Sea in the hopes of gathering information on the robot's builders, the alien Simeons. However, their submarine is attacked by a giant, aquatic dinosaur called Titanosaurus and the crew vanishes. Interpol launches an investigation into the incident. With the help of marine biologist Akira Ichinose, they trace Titanosaurus to a reclusive, mad scientist named Shinzô Mafune, who wants to destroy mankind. While the group is visiting the scientist's old home, they meet Mafune's daughter, Katsura, who claims her father is dead and that she burned his notes about Titanosaurus at his request. Unbeknownst to Interpol, the living Mafune is visited by Tsuda, aide to the Simeon leader Mugal, who is leading a project to rebuild Mechagodzilla. Mugal offers the Simeons' services to Mafune so that their respective monsters can wipe out mankind and allow them to rebuild the world for themselves. Complicating matters, Ichinose falls in love with Katsura and unwittingly gives her Interpol's information on the Simeons, Mechagodzilla, and Titanosaurus. She is also revealed to be a cyborg, having undergone cybernetic surgery after she was nearly killed during one of her father's experiments as a child, and implanted with Mechagodzilla's control device. Additionally, an impatient Mafune releases Titanosaurus on Yokosuka without the aliens' permission. While Interpol discovers the dinosaur is vulnerable to supersonic waves, Katsura destroys their supersonic wave oscillator. 
However, Godzilla arrives and easily defeats Titanosaurus, causing the latter to retreat. When Ichinose visits Katsura, the Simeons capture him and force him to watch as they unleash Mechagodzilla 2 and Titanosaurus on Tokyo while Interpol struggles to repair their wave oscillator and the Japanese armed forces struggle to fend off the monsters. Godzilla arrives, but is initially outmatched until Interpol distracts Titanosaurus with the repaired wave oscillator, allowing Godzilla to focus on Mechagodzilla 2. Interpol agents infiltrate the aliens' hideout, rescue Ichinose, and kill Mafune and many of the aliens. The remaining Simeons attempt to escape, but Godzilla shoots down their ships with its atomic breath. The wounded Katsura shoots herself to destroy Mechagodzilla 2's control device and dies in Ichinose's arms. With the robot non-functional, Godzilla tosses it into a chasm before blasting it with its atomic breath, causing it to explode and get buried. With help from Interpol, Godzilla then defeats Titanosaurus, who returns to the sea. Cast Production Development The original screenplay that Yukiko Takayama created after winning Toho's story contest for the next installment in the Godzilla series was picked by assistant producer Kenji Tokoro and was submitted for approval on July 1, 1974, less than four months after Godzilla vs. Mechagodzilla was released. The original concept is similar to the finished version of Terror of Mechagodzilla, with many of the changes being budgetary in nature. The most obvious alteration is the removal of the two dinosaurs called the Titans, which merged to become Titanosaurus in the first draft. It was an interesting concept, though also under-explained, considering the magnitude of such an occurrence of
Godzilla breaks from its restraints and causes Ghidorah to send both crashing into the ocean. Emmy then returns to the future, but not before informing Terasawa that she is his descendant. At the bottom of the ocean, Godzilla awakens and roars over Mecha-King Ghidorah's remains before swimming away. Cast Production Conception Although the previously filmed Godzilla vs. Biollante had been the most expensive Godzilla film produced at the time, its low audience attendance and loss of revenue convinced executive producer and Godzilla series creator Tomoyuki Tanaka to revitalize the series by bringing back iconic monsters from pre-1984 Godzilla movies, specifically Godzilla's archenemy King Ghidorah. Godzilla vs. Biollante director and writer Kazuki Ōmori had initially hoped to start a standalone series centered on Mothra, and was in the process of rewriting a 1990 script for the unrealized film Mothra vs. Bagan. The film was ultimately scrapped by Toho, under the assumption that, unlike Godzilla, Mothra would have been a difficult character to market overseas. The planning stages for a sequel to Godzilla vs. Biollante were initially hampered by Tanaka's deteriorating health, thus prompting the takeover of Shōgo Tomiyama as producer. The new producer felt that the financial failure of Godzilla vs. Biollante was due to the plot being too sophisticated for child audiences, and thus intended to return some of the fantasy elements of the pre-1984 Godzilla films to the series. Ōmori himself blamed the lackluster performance of Godzilla vs. Biollante on competition with Back to the Future Part II, and thus concluded that audiences wanted plots involving time travel. His approach to the film also differed from Godzilla vs. Biollante in his greater emphasis on developing the personalities of the monsters rather than the human characters. 
Akira Ifukube agreed to compose the film's score at the insistence of his daughter, after having been dissatisfied with the way his compositions were treated in Godzilla vs. Biollante. Special effects The Godzilla suits used in Godzilla vs. Biollante were reused in Godzilla vs. King Ghidorah, though with slight modifications. The original suit used for land-based and full body shots had its head replaced with a wider and flatter one, and the body cut in half. The upper half was used in scenes where Godzilla emerges from the sea and during close-ups during the character's first fight with King Ghidorah. The suit used previously for scenes set at sea was modified with rounder shoulders, a more prominent chest, and an enhanced face, and was used throughout the majority of the film's Godzilla scenes. The redesigned King Ghidorah featured much more advanced wirework puppetry than its predecessors, and effects team leader Koichi Kawakita designed the "Godzillasaurus" as a more paleontologically accurate-looking dinosaur than Godzilla itself, as a nod to American filmmakers aspiring to direct their own Godzilla films with the intention of making the monster more realistic. Ōmori's original draft specified that the dinosaur that would become Godzilla was a Tyrannosaurus, though this was rejected by creature designer Shinji Nishikawa, who stated that he "couldn't accept that a tyrannosaur could become Godzilla". The final suit combined features of Tyrannosaurus with Godzilla, and real octopus blood was used during the bombardment scene. Because the Godzillasaurus' arms were much smaller than Godzilla's, suit performer Wataru Fukuda had to operate them with levers within the costume. The creature's distress calls were recycled Gamera cries. Home media The Columbia/TriStar Home Video DVD version was released in 1998 as a single disc double feature with Godzilla vs. Mothra. The picture was full frame (1.33:1) [NTSC] and the audio in English (2.0). There were no subtitles.
Extras included the trailer for Godzilla vs. King Ghidorah and Godzilla vs. Mothra. The Sony Blu-ray version was released on May 6, 2014 as a two-disc double feature with Godzilla vs. Mothra. The picture was MPEG-4 AVC (1.85:1) [1080p] and the audio was in Japanese and English (DTS-HD Master Audio 2.0). Subtitles were added in English, English SDH and French. Extras included the theatrical trailer and three teasers in HD with English subtitles. Reception Joseph Savitski of Beyond Hollywood said "This entry in the popular monster series is a disappointing and flawed effort unworthy of the “Godzilla” name." Film historian and critic David Kalat wrote "Despite its shortcomings, illogic, and overpopulated cast, Godzilla vs. King Ghidorah is crammed full of ideas, richly visualized innovations, a genuine spirit of fun, and some of the most complex emotional manipulation ever to grace the series." Controversy The film was considered controversial at the time of its release, being contemporary to a period of economic tension between America and Japan, but mainly due to its fictional World War II depictions. Gerald Glaubitz of the Pearl Harbor Survivors Association appeared alongside director Kazuki Ōmori on Entertainment Tonight and condemned the film as being in "very poor taste" and detrimental to American-Japanese relations. Ishirō Honda also criticized Ōmori, stating that the scene in which Godzilla attacks and crushes American G.I.s went "too far". Conversely, Godzilla historian Steve Ryfle said American media reports of supposed anti-Americanism "weren't really thought-provoking or insightful." Ōmori has denied all such allegations, stating that the American extras in the film had been "happy about being crushed and squished by Godzilla." 
Commenting on the controversy in 2006, Ōmori stated:
dinosaur from Lagos Island before the island is irradiated in 1954, thus preventing the mutation of the creature into Godzilla. As proof of their story, Emmy presents a copy of Terasawa's book, which has not yet been completed in the present. The Futurians, along with Terasawa, Miki Saegusa, and Professor Mazaki, board a time shuttle and travel back to 1944 to Lagos Island. There, as American forces land and engage the Japanese forces commanded by Shindo, the dinosaur attacks and kills the American soldiers. The American navy then bombards the dinosaur from the sea and gravely wounds it. After Shindo and his men leave the island, M-11 teleports the dinosaur from Lagos Island to the Bering Strait. Before returning to 1992, the Futurians secretly leave three small creatures called Dorats on Lagos Island, which are exposed to radiation from the hydrogen bomb test in 1954 and merge to become King Ghidorah. After returning to 1992, the Futurians use King Ghidorah to subjugate Japan and issue an ultimatum, but Japan refuses to surrender. Feeling sympathy for the Japanese people, Emmy reveals to Terasawa the truth behind the Futurians' mission: in the future, Japan is an economic superpower that has surpassed the United States, Russia, and China. The Futurians traveled back in time in order to change history and prevent Japan's future economic dominance by creating King Ghidorah and using it to destroy present-day Japan. At the same time, they also planned to erase Godzilla from history so that it would not pose a threat to their plans. After M-11 brings Emmy back to the UFO, she reprograms the android so it will help her. Shindo plans to send his nuclear submarine to the Bering Strait and irradiate the dinosaur in order to recreate Godzilla. However, Terasawa discovers too late that a Russian nuclear submarine sank there in the 1970s and released enough radiation to mutate the dinosaur into Godzilla. 
En route to the Bering Strait, Shindo's submarine is destroyed by Godzilla, who absorbs its radiation, recovers from the ANEB and becomes larger. Godzilla arrives in Japan and is met by King Ghidorah. They fight at equal strength, each immune to the other's attacks. With M-11 and Terasawa's aid, Emmy sabotages the UFO's control over King Ghidorah, causing the three-headed monster to lose focus during the battle. Godzilla eventually ends the battle by blasting off Ghidorah's middle head. Before sending King Ghidorah crashing into the ocean, Godzilla destroys the UFO, killing Wilson and Grenchiko. It then turns its attention to Tokyo, destroying the city and killing Shindo. Emmy travels to the future with M-11 and returns to the present day with Mecha-King Ghidorah, a cybernetic version of King Ghidorah. The cybernetic Ghidorah blasts Godzilla with beams, which prove useless. Godzilla then counters by relentlessly blasting Ghidorah with its atomic breath before Ghidorah launches clamps to restrain Godzilla. Ghidorah carries Godzilla out of Japan, but Godzilla breaks free of its restraints, sending both monsters crashing into the ocean. Emmy then returns to the future, but not before informing Terasawa that she is his descendant. At the bottom of the ocean, Godzilla awakens and roars over Mecha-King Ghidorah's remains before swimming away. Cast Production Conception Although the previously filmed Godzilla vs. Biollante had been the most expensive Godzilla film produced at the time, its low audience attendance and loss of revenue convinced executive producer and Godzilla series creator Tomoyuki Tanaka to revitalize the series by bringing back iconic monsters from pre-1984 Godzilla movies, specifically Godzilla's archenemy King Ghidorah. Godzilla vs. Biollante director and writer Kazuki Ōmori had initially hoped to start a standalone series centered on Mothra, and was in the process of rewriting a 1990 script for the unrealized film Mothra vs. Bagan. 
The film was ultimately scrapped by Toho, under the assumption that, unlike Godzilla, Mothra would have been a difficult character to market overseas. The planning stages for a sequel to Godzilla vs. Biollante were initially hampered by Tanaka's deteriorating health, prompting Shōgo Tomiyama to take over as producer. The new producer felt that the financial failure of Godzilla vs. Biollante was due to the plot being too sophisticated for child audiences, and thus intended to return some of the fantasy elements of the pre-1984 Godzilla films to the series. Ōmori himself blamed the lackluster performance of Godzilla vs. Biollante on competition with Back to the Future Part II, and thus concluded that audiences wanted plots involving time travel. His approach to the film also differed from Godzilla vs. Biollante in his greater emphasis on developing the personalities of the monsters rather than the human characters. 
a dark twin to Mothra. The character was later renamed Battra (a portmanteau of "battle" and "Mothra"), as the first name was disharmonious in Japanese. Tomiyama had intended to feature Mothra star Frankie Sakai, but was unable to because of scheduling conflicts. The final battle between Godzilla, Mothra and Battra was originally meant to have a more elaborate conclusion; as in the final product, Godzilla would have been transported to sea, only to kill Battra and plunge into the ocean. However, the site of their fall would have been the submerged, Stonehenge-like ruins of the Cosmos civilization, which would have engulfed and trapped Godzilla with a forcefield activated by Mothra. Ishirō Honda, who directed the first Godzilla film and many others, visited the set shortly before dying. Special effects Koichi Kawakita continued his theme of giving Godzilla's opponents the ability to metamorphose, and had initially intended to have Mothra killed off, only to be reborn as the cybernetic moth MechaMothra, though this was scrapped early in production, thus making Godzilla vs. Mothra the first post-1984 Godzilla movie to not feature a mecha contraption. The underwater scenes were filmed through an aquarium filled with fish set between the performers and the camera. Kawakita's team constructed a new Godzilla suit from previously used molds, though it was made slimmer than previous suits, the neck given more prominent ribbing, and the arrangement of the character's dorsal plates was changed so that the largest plate was placed on the middle of the back. The arms were more flexible at the biceps, and the face was given numerous cosmetic changes; the forehead was reduced and flattened, the teeth scaled down, and the eyes given a golden tint. The head was also electronically modified to allow more vertical mobility. Filming the Godzilla scenes was hampered when the suit previously used for Godzilla vs. Biollante and Godzilla vs. 
King Ghidorah, which was needed for some stunt-work, was stolen from Toho studios, only to be recovered at Lake Okutama in bad condition. The remains of the suit were recycled for the first battle sequence. Godzilla's roar was reverted to the high-pitched shriek from pre-1984 Godzilla films, while Battra's sound effects were recycled from those of Rodan. In designing Battra, which the script described as a "black Mothra", artist Shinji Nishikawa sought to distance its design from Mothra's by making its adult form more similar to its larval one than is the case with Mothra, and combining Mothra's two eyes into one. Release Godzilla vs. Mothra was released in Japan by Toho on December 12, 1992. The film sold approximately 4,200,000 tickets in Japan, becoming the number one Japanese film on the domestic market in the period that included the year 1993. It earned ¥2.22 billion in distribution income, and grossed in total. The film was released in the United States as Godzilla and Mothra: The Battle for Earth on April 28, 1998 on home video by Columbia TriStar Home Video. Critical reaction On review aggregation website Rotten Tomatoes, the film has a 75% approval rating from critics, based on 8 reviews with an average score of 6.3/10. Ed Godziszewski of Monster Zero said, "Rushed into production but a few months after Godzilla vs. King Ghidorah, this film is unable to hide its hurried nature [but] effects-wise, the film makes up for the story's shortcomings and then some." Japan Hero said, "While this movie is not the best of the Heisei series, it is still a really interesting movie. The battles are cool, and Battra was an interesting idea. If you have never seen this movie, I highly recommend it." Stomp Tokyo said the film is "one of the better Godzilla movies in that the scenes in which monsters
Godzilla from attacking Yokohama. Originally conceived as a standalone Mothra film entitled Mothra vs. Bagan, the film is notable for its return to a more fantasy-based, family-oriented atmosphere, evocative of older Godzilla films. Although he did not return as director, Ōmori continued his trend of incorporating Hollywood elements into his screenplay, in this case nods to the Indiana Jones franchise. Godzilla vs. Mothra was released theatrically in Japan on December 12, 1992, and was followed by Godzilla vs. Mechagodzilla II the following year. Godzilla vs. Mothra was released direct-to-video in the United States in 1998 by Columbia TriStar Home Video under the title Godzilla and Mothra: The Battle for Earth. The film was the second highest-grossing film in Japan in 1993, with Jurassic Park being the highest-grossing. Plot In mid-1992, following the events of Godzilla vs. King Ghidorah, a meteoroid crashes in the Ogasawara Trench and awakens Godzilla. Six months later, explorer Takuya Fujito is detained after stealing an ancient artifact. Later, a representative of the Japanese Prime Minister offers to have Takuya's charges dropped if he explores Infant Island with his ex-wife, Masako Tezuka, and Kenji Ando, the secretary of the rapacious Marutomo company. After the trio arrives on the island, they find a cave containing a depiction of two giant insects in battle. Further exploration leads them to a giant egg and a pair of diminutive humanoids called the Cosmos, who identify the egg as belonging to Mothra. The Cosmos tell of an ancient civilization that tried to control the Earth's climate 12,000 years ago, thus provoking the Earth into creating Battra. Battra, a male divine moth similar to Mothra, but much more fearsome in appearance, destroyed the civilization and their weather-controlling device but then became uncontrollable, and started to harm the very planet that created him. Mothra was then sent by the Earth to fight Battra, who eventually lost. 
The Cosmos explain how the meteoroid uncovered Mothra's egg, and may have awoken Battra, who is still embittered over humanity's interference in the Earth's natural order. The Marutomo company sends a freighter to Infant Island to pick up the egg, ostensibly to protect it. As they are sailing, Godzilla surfaces and heads toward the newly hatched Mothra larva. Battra, also as a larva, soon appears and joins the fight, allowing Mothra to retreat. The battle between Godzilla and Battra is eventually taken underwater, where the force of the battle causes a giant crack on the Philippine Sea Plate that swallows the two. Masako and Takuya later discover Ando's true intentions when he kidnaps the Cosmos and takes them to Marutomo headquarters, where the CEO intends to use them for publicity purposes. Mothra enters Tokyo in an attempt to rescue the Cosmos, but is attacked by the JSDF. The wounded Mothra heads for the National Diet Building and starts constructing a cocoon around herself. Meanwhile, Godzilla surfaces from Mount Fuji, while Battra frees himself from the Earth's crust and continues towards Japan. Both Mothra and Battra attain their imago forms and converge at Yokohama Cosmo World where they begin to fight once more. Godzilla interrupts the battle and attacks Mothra, but Battra comes to her aid and briefly incapacitates Godzilla. Regrouping, the two moths decide to join forces against Godzilla, determining him to be the greater threat to the planet. Eventually, Mothra and Battra overwhelm Godzilla and carry it over the ocean. Godzilla bites Battra's neck and fires its atomic breath into the wound, killing him. A tired Mothra drops Godzilla and the lifeless Battra into the water below, sealing Godzilla below the surface by creating a mystical glyph with scales from her wings. The next morning, the Cosmos explain that Battra had been waiting many years to destroy an even larger asteroid that would threaten the Earth in 1999. 
Mothra had promised she would stop the future collision if Battra were to die, and she and the Cosmos leave Earth as the humans bid farewell. Cast Production The idea of shooting a movie featuring a revamped Mothra dated back to a screenplay
Production Crew Ishirō Honda – director, co-writer Eiji Tsuburaya – special effects director Kōji Kajita – assistant director Teruo Maki – production manager Choshiro Ishii – lighting Takeo Kita – chief art director Satoshi Chuko – art director Akira Watanabe – special effects art director Kuichirō Kishida – special effects lighting Teizō Toshimitsu – monster builder Hisashi Shimonaga – sound recording Ichiro Minawa – sound and musical effects Personnel taken from Japan's Favorite Mon-Star. Development In 1954, Toho originally planned to produce , a Japanese-Indonesian co-production about the aftermath of the Japanese occupation of Indonesia. However, anti-Japanese sentiment in Indonesia put political pressure on the government to deny visas for the Japanese filmmakers. The film was to be co-produced with Perfini, filmed on location in Jakarta in color (a first for a major Toho production), and was to open markets for Japanese films in Southeast Asia. Producer Tomoyuki Tanaka flew to Jakarta to renegotiate with the Indonesian government but was unsuccessful and, on the flight back to Japan, conceived the idea for a giant monster film inspired by the 1953 film The Beast from 20,000 Fathoms and the Daigo Fukuryū Maru incident of March 1954. The film's opening sequence is a direct reference to the incident. Tanaka felt the film had potential due to nuclear fears generating news and monster films becoming popular, due to the financial success of The Beast from 20,000 Fathoms and the 1952 re-release of King Kong, the latter of which earned more money than previous releases. During his flight, Tanaka wrote an outline with the working title and pitched it to executive producer Iwao Mori. Mori approved the project in mid–April 1954 after special effects director Eiji Tsuburaya agreed to do the film's effects and confirmed that the film was financially feasible. 
Mori also felt the project was a perfect vehicle for Tsuburaya and an opportunity to test the storyboarding system he had instituted at the time. Mori also approved Tanaka's choice to have Ishirō Honda direct the film and shortened the title of the production to Project G (G for Giant); he also gave the production classified status and ordered Tanaka to minimize his attention on other films and focus mainly on Project G. Toho originally intended for Senkichi Taniguchi to direct the film, as he had been attached to direct In the Shadow of Glory; however, Taniguchi declined the assignment. Honda was not Toho's first choice for the film's director; however, his wartime experience made him an ideal candidate for the film's anti-nuclear themes. Several other directors passed on the project, feeling the idea was "stupid"; however, Honda accepted the assignment due to his interest in science and "unusual things", stating, "I had no problem taking it seriously." It was during the production of Godzilla that Honda worked with assistant director Kōji Kajita for the first time. Afterwards, Kajita would go on to collaborate with Honda as his chief assistant director for 17 films over the course of 10 years. Due to sci-fi films lacking respect from film critics, Honda, Tanaka, and Tsuburaya agreed on depicting a monster attack as if it were a real event, with the serious tone of a documentary. Writing Tsuburaya submitted an outline of his own, written three years prior to Godzilla; it featured a giant octopus attacking ships in the Indian Ocean. In May 1954, Tanaka hired sci-fi writer Shigeru Kayama to write the story. Only 50 pages long and written in 11 days, Kayama's treatment depicted Dr. Yamane wearing dark shades, a cape and living in a European-style house from which he only emerged at night. Godzilla was portrayed as more animal-like by coming ashore to feed on animals, with an ostensibly gorilla-like interest in females. 
Kayama's story also featured less destruction and borrowed a scene from The Beast from 20,000 Fathoms by having Godzilla attack a lighthouse. Takeo Murata and Honda co-wrote the screenplay in three weeks, confining themselves in a Japanese inn in Tokyo's Shibuya ward. On writing the script, Murata stated, "Director Honda and I... racked our brains to make Mr. Kayama's original treatment into a full, working vision." Murata said that Tsuburaya and Tanaka pitched their ideas as well. Tanaka requested that they do not spend too much money, while Tsuburaya encouraged them to "do whatever it takes to make it work". Murata and Honda redeveloped key characters and elements by adding Emiko's love triangle. In Kayama's story, Serizawa was depicted as merely a colleague of Dr. Yamane's. Godzilla's full appearance was to be revealed during the Odo Island hurricane but Honda and Murata opted to show parts of the creature as the film built up to his full reveal. Honda and Murata also introduced the characters Hagiwara and Dr. Tanabe in their draft but the role of Shinkichi, who had a substantial role in Kayama's story, was cut down. A novelization, written by Kayama, was published on October 25, 1954 by Iwatani Bookstore as . Creature design Godzilla was designed by Teizō Toshimitsu and Akira Watanabe under Eiji Tsuburaya's supervision. Early on, Tanaka contemplated having the monster be gorilla-like or whale-like in design, due to the name "Gojira" (a combination of the Japanese words for "gorilla", , and "whale", ), but eventually settled on a dinosaur-like design. Kazuyoshi Abe was hired earlier to design Godzilla but his ideas were later rejected due to Godzilla looking too humanoid and mammalian, with a head shaped like a mushroom cloud; however, Abe was retained to help draw the film's storyboards. 
Toshimitsu and Watanabe decided to base Godzilla's design on dinosaurs and, by using dinosaur books and magazines as a reference, combined elements of a Tyrannosaurus, Iguanodon and the dorsal fins of a Stegosaurus. Despite wanting to use stop-motion animation, Tsuburaya reluctantly settled on suitmation. Toshimitsu sculpted three clay models on which the suit would be based. The first two were rejected but the third was approved by Tsuburaya, Tanaka, and Honda. The Godzilla suit was constructed by Kanji Yagi, Koei Yagi, and Eizo Kaimai, who used thin bamboo sticks and wire to build a frame for the interior of the suit and added metal mesh and cushioning over it to bolster its structure and finally applied coats of latex. Coats of molten rubber were additionally applied, followed by carved indentations and strips of latex glued onto the surface of the suit to create Godzilla's scaly hide. This first version of the suit weighed 100 kilograms (220 pounds). For close-ups, Toshimitsu created a smaller scale, mechanical, hand-operated puppet that sprayed streams of mist from its mouth to act as Godzilla's atomic breath. Haruo Nakajima and Katsumi Tezuka were chosen to perform in the Godzilla suit, due to their strength and endurance. At the first costume fitting, Nakajima fell down while inside the suit, due to the heavy latex and inflexible materials used to create the suit. This first version of the suit was cut in half and used for scenes requiring
sends paleontologist Kyohei Yamane to lead an investigation on the island, where giant radioactive footprints and a trilobite are discovered. The village alarm bell is rung and Yamane and the villagers rush to see the monster, retreating after seeing that it is a giant dinosaur. Yamane presents his findings in Tokyo, estimating that Godzilla is 50 m tall and evolved from an ancient sea creature that became a terrestrial animal. He concludes that Godzilla has been disturbed by underwater hydrogen bomb testing. Debate ensues over whether to notify the public about the danger of the monster. Meanwhile, 17 ships are lost at sea. Ten frigates are dispatched to attempt to kill the monster using depth charges. The mission disappoints Yamane, who wants Godzilla to be studied. When Godzilla survives the attack, officials appeal to Yamane for ideas to kill the monster, but Yamane tells them that Godzilla is unkillable, having survived H-bomb testing, and must be studied. Yamane's daughter, Emiko, decides to break off her arranged engagement to Yamane's colleague, Daisuke Serizawa, because of her love for Hideto Ogata, a salvage ship captain. When a reporter arrives and asks to interview Serizawa, Emiko escorts the reporter to Serizawa's home. After Serizawa refuses to divulge his current work to the reporter, he gives Emiko a demonstration of his recent project on the condition that she must keep it a secret. The demonstration horrifies her and she leaves without mentioning the engagement. Shortly after she returns home, Godzilla surfaces from Tokyo Bay and attacks Shinagawa. After attacking a passing train, Godzilla returns to the ocean. After consulting international experts, the Japanese Self-Defense Forces construct a 30 m tall and 50,000 volt electrified fence along the coast and deploy forces to stop and kill Godzilla. 
Dismayed that there is no plan to study Godzilla for its resistance to radiation, Yamane returns home, where Emiko and Ogata await, hoping to get his consent for them to wed. When Ogata disagrees with Yamane, arguing that the threat that Godzilla poses outweighs any potential benefits from studying the monster, Yamane tells him to leave. Godzilla resurfaces and breaks through the fence to Tokyo with its atomic breath, unleashing more destruction across the city. Further attempts to kill the monster with tanks and fighter jets fail and Godzilla returns to the ocean. The day after, hospitals and shelters are crowded with the maimed and the dead, with some survivors suffering from radiation sickness. Distraught by the devastation, Emiko tells Ogata about Serizawa's research, a weapon called the "Oxygen Destroyer", which disintegrates oxygen atoms and causes organisms to die of a rotting asphyxiation. Emiko and Ogata go to Serizawa to convince him to use the Oxygen Destroyer but he initially refuses, explaining that if he uses the device, the superpowers of the world will surely force him to construct more Oxygen Destroyers for use as a superweapon. After watching a program displaying the nation's current tragedy, Serizawa finally accepts their pleas. As Serizawa burns his notes, Emiko breaks down crying. A navy ship takes Ogata and Serizawa to plant the device in Tokyo Bay. After finding Godzilla, Serizawa unloads the device and cuts off his air support, taking the secret of the Oxygen Destroyer to his grave. Godzilla is destroyed, but many mourn Serizawa's death. Yamane believes that if nuclear weapons testing continues, another Godzilla may rise in the future. Cast Akira Takarada as Hideto Ogata Momoko Kōchi as Emiko Yamane Akihiko Hirata as Dr. Daisuke Serizawa Takashi Shimura as Dr. Kyohei Yamane Fuyuki Murakami as Dr. 
Tanabe Sachio Sakai as Hagiwara Ren Yamamoto as Masaji Yamada Toyoaki Suzuki as Shinkichi Yamada Toranosuke Ogawa as the President of the Nankai Shipping Company Hiroshi Hayashi as the Chairman of Diet Committee Seijiro Onda as Oyama, Diet Committee member Kin Sugai as Ozawa, Diet Committee member Kokuten Kōdō as the old fisherman Tadashi Okabe as the assistant of Dr. Tanabe Jiro Mitsuaki as an employee of the Nankai Salvage Company Ren Imaizumi as a radio officer of the Nankai Salvage Company Sokichi Maki as the chief at the Maritime Safety Agency Kenji Sahara as a partygoer Haruo Nakajima as Godzilla and a reporter Katsumi Tezuka as Godzilla and a newspaper deskman Cast taken from Japan's Favorite Mon-Star. Themes In the film, Godzilla symbolizes nuclear holocaust from Japan's perspective and has since been culturally identified as a strong metaphor for nuclear weapons. Producer Tomoyuki Tanaka stated that, "The theme of the film, from the beginning, was the terror of the bomb. Mankind had created the bomb, and now nature was going to take revenge on mankind." Director Ishirō Honda filmed Godzilla's Tokyo rampage to mirror the atomic bombings of Hiroshima and Nagasaki, stating, "If Godzilla had been a dinosaur or some other animal, he would have been killed by just one cannonball. But if he were equal to an atomic bomb, we wouldn't know what to do. So, I took the characteristics of an atomic bomb and applied them to Godzilla." On March 1, 1954, just a few months before the film was made, the Japanese fishing vessel Daigo Fukuryū Maru ("Lucky Dragon No. 5") had been showered with radioactive fallout from the U.S. military's 15-megaton "Castle Bravo" hydrogen bomb test at nearby Bikini Atoll. The boat's catch was contaminated, spurring a panic in Japan about the safety of eating fish, and the crew was sickened, with one crew member eventually dying from radiation sickness. 
This event led to the emergence of a large and enduring anti-nuclear movement that gathered 30 million signatures on an anti-nuclear petition by August 1955 and eventually became institutionalized as the Japan Council against Atomic and Hydrogen Bombs. The film's opening scene of Godzilla destroying a Japanese vessel is a direct reference to these events, and had a strong impact on Japanese viewers with this recent event still fresh in the mind of the public. Academics Anne Allison, Thomas Schnellbächer, and Steve Ryfle have said that Godzilla contains political and cultural undertones that can be attributed to what the Japanese had experienced in World War II and that Japanese audiences were able to connect emotionally to the monster. They theorized that these viewers saw Godzilla as a victim and felt that the creature's backstory reminded them of their experiences in World War II. These academics have also claimed that as the atomic bomb testing that woke Godzilla was carried out by the United States, the film in a way can be seen to blame the United States for the problems and struggles that Japan experienced after World War II had ended. They also felt that the film could have served as a cultural coping method to help the people of Japan move on from the events of the war. Brian Merchant from Motherboard called the film "a bleak, powerful metaphor for nuclear power that still endures today" and on its themes, he stated: "It's an unflinchingly bleak, deceptively powerful film about coping with and taking responsibility for incomprehensible, manmade tragedy. Specifically, nuclear tragedies. It's arguably the best window into post-war attitudes towards nuclear power we've got—as seen from the perspective of its greatest victims." Terrence Rafferty from The New York Times said Godzilla was "an obvious gigantic, unsubtle, grimly purposeful metaphor for the atomic bomb" and felt the film was "extraordinarily solemn, full of earnest discussions". 
Mark Jacobson from the website of New York magazine said that Godzilla "transcends humanist prattle. Very few constructs have so perfectly embodied the overriding fears of a particular era. He is the symbol of a world gone wrong, a work of man that once created cannot be taken back or deleted. He rears up out of the sea as a creature of no particular belief system, apart from even the most elastic version of evolution and taxonomy, a reptilian id that lives inside the deepest recesses of the collective unconscious that cannot be reasoned with, a merciless undertaker who broaches no deals." Regarding the film, Jacobson stated, "Honda's first Godzilla... is in line with these inwardly turned post-war films and perhaps the most brutally unforgiving of them. Shame-ridden self-flagellation was in order, and who better to supply the rubber-suited psychic punishment than the Rorschach-shaped big fella himself?" Tim Martin from The Daily Telegraph (London) said that the original 1954 film was "a far cry from its B-movie successors. It was a sober allegory of a film with ambitions as large as its thrice-normal budget, designed to shock and horrify an adult audience. Its roster of frightening images—cities in flames, overstuffed hospitals, irradiated children—would have been all too familiar to cinemagoers for whom memories of Hiroshima and Nagasaki were still less than a decade old, while its script posed deliberately inflammatory questions about the balance of postwar power and the development of nuclear energy." Martin also commented on how the film's themes were omitted in the American version, stating, "Its thematic preoccupation with nuclear energy proved even less acceptable to the American distributors who, after buying the film, began an extensive reshoot and recut for Western markets."
Production

Crew

Ishirō Honda – director, co-writer
Eiji Tsuburaya – special effects director
Kōji Kajita – assistant director
Teruo Maki – production manager
Choshiro Ishii – lighting
Takeo Kita – chief art director
Satoshi Chuko – art director
Akira Watanabe – special effects art director
Kuichirō Kishida – special effects lighting
Teizō Toshimitsu – monster builder
Hisashi Shimonaga – sound recording
Ichiro Minawa – sound and musical effects

Personnel taken from Japan's Favorite Mon-Star.

Development

In 1954, Toho originally planned to produce In the Shadow of Glory, a Japanese-Indonesian co-production about the aftermath of the Japanese occupation of Indonesia. However, anti-Japanese sentiment in Indonesia put political pressure on the government to deny visas to the Japanese filmmakers. The film was to be co-produced with Perfini, filmed on location in Jakarta in color (a first for a major Toho production), and was to open markets for Japanese films in Southeast Asia. Producer Tomoyuki Tanaka flew to Jakarta to renegotiate with the Indonesian government but was unsuccessful and, on the flight back to Japan, conceived the idea for a giant monster film inspired by the 1953 film The Beast from 20,000 Fathoms and the Daigo Fukuryū Maru incident of March 1954. The film's opening sequence is a direct reference to the incident. Tanaka felt the film had potential because nuclear fears were generating news and monster films were becoming popular, owing to the financial success of The Beast from 20,000 Fathoms and the 1952 re-release of King Kong, the latter of which earned more money than previous releases. During his flight, Tanaka wrote an outline under a working title and pitched it to executive producer Iwao Mori. Mori approved the project in mid-April 1954 after special effects director Eiji Tsuburaya agreed to do the film's effects and confirmed that the film was financially feasible.
Mori also felt the project was a perfect vehicle for Tsuburaya and a way to test the storyboarding system he had instituted at the time. Mori approved Tanaka's choice to have Ishirō Honda direct the film and shortened the title of the production to Project G (G for Giant), as well as giving the production classified status and ordering Tanaka to minimize his attention on other films and focus mainly on Project G. Toho originally intended for Senkichi Taniguchi to direct the film, as he had been attached to direct In the Shadow of Glory; however, Taniguchi declined the assignment. Honda was not Toho's first choice for the film's director, but his war-time experience made him an ideal candidate for the film's anti-nuclear themes. Several other directors passed on the project, feeling the idea was "stupid"; Honda, however, accepted the assignment due to his interest in science and "unusual things", stating, "I had no problem taking it seriously." It was during the production of Godzilla that Honda worked with assistant director Kōji Kajita for the first time. Afterwards, Kajita would go on to collaborate with Honda as his chief assistant director for 17 films over the course of 10 years. Because sci-fi films lacked respect from film critics, Honda, Tanaka, and Tsuburaya agreed to depict a monster attack as if it were a real event, with the serious tone of a documentary.

Writing

Tsuburaya submitted an outline of his own, written three years prior to Godzilla; it featured a giant octopus attacking ships in the Indian Ocean. In May 1954, Tanaka hired sci-fi writer Shigeru Kayama to write the story. Only 50 pages long and written in 11 days, Kayama's treatment depicted Dr. Yamane wearing dark shades and a cape and living in a European-style house from which he only emerged at night. Godzilla was portrayed as more animal-like by coming ashore to feed on animals, with an ostensibly gorilla-like interest in females.
Kayama's story also featured less destruction and borrowed a scene from The Beast from 20,000 Fathoms by having Godzilla attack a lighthouse. Takeo Murata and Honda co-wrote the screenplay in three weeks, confining themselves to a Japanese inn in Tokyo's Shibuya ward. On writing the script, Murata stated, "Director Honda and I... racked our brains to make Mr. Kayama's original treatment into a full, working vision." Murata said that Tsuburaya and Tanaka pitched their ideas as well. Tanaka requested that they not spend too much money, while Tsuburaya encouraged them to "do whatever it takes to make it work". Murata and Honda redeveloped key characters and elements by adding Emiko's love triangle. In Kayama's story, Serizawa was depicted as merely a colleague of Dr. Yamane's. Godzilla's full appearance was to be revealed during the Odo Island hurricane, but Honda and Murata opted to show parts of the creature as the film built up to his full reveal. Honda and Murata also introduced the characters Hagiwara and Dr. Tanabe in their draft, but the role of Shinkichi, who had a substantial part in Kayama's story, was cut down. A novelization, written by Kayama, was published on October 25, 1954 by Iwatani Bookstore.

Creature design
like King Kong, Invasion of the Body Snatchers, Alien and The Thing. A draft story entitled The Resurrection of Godzilla was submitted by Akira Murao in 1980, and had Godzilla pitted against a shape-shifting monster called Bakan against the backdrop of an illegal nuclear waste disposal site, though the project was cancelled due to budgetary concerns. In 1983, American director Steve Miner proposed directing a Godzilla film at his own expense. Toho approved of the project, and Miner hired Fred Dekker to write the screenplay and paleosculptor Steve Czerkas to redesign the monster. The project was, however, hampered by Miner's insistence on using prohibitively costly stop-motion animation and shooting the film in 3D, and was thus rejected by major American movie studios. Under pressure from a 10,000-member group of Japanese Godzilla fans calling themselves the "Godzilla Resurrection Committee", Tanaka decided to helm a Japanese film for "strictly domestic consumption" to be released jointly alongside Miner's movie. In an effort to disavow Godzilla's increasingly heroic and anthropomorphic depiction in previous films, Tanaka insisted on making a direct sequel to the original 1954 movie. He hired screenwriter Shuichi Nagahara, who wrote a screenplay combining elements of the previously cancelled The Resurrection of Godzilla and Miner's still unproduced film, including an intensification of hostilities during the Cold War and a flying fortress which fires missiles into Godzilla's mouth. Koji Hashimoto was hired as director after Ishirō Honda declined the offer, as he was assisting Akira Kurosawa with Kagemusha and Ran, and felt that the franchise should have been discontinued after the death of Eiji Tsuburaya. Composer Akira Ifukube was invited to score the film but respectfully declined. At the time, it was rumored that Ifukube refused to participate in the film due to the changes made to Godzilla, stating, "I do not write music for 80-meter monsters".
However, this quote was later clarified, by Ifukube's biographer Erik Homenick and Japanese Giants editor Ed Godziszewski, as a joke spread by fans which was later misinterpreted as fact. Ifukube declined to score the film because, at the time, his priority was teaching composition at the Tokyo College of Music.

Special effects

The special effects were directed by Teruyoshi Nakano, who had directed the special effects of several previous Godzilla films. The decision was made by Tanaka to increase the apparent height of Godzilla from to so that Godzilla would not be dwarfed by the contemporary skyline of Tokyo. This meant that the miniatures had to be built to a th scale, and this contributed to an increase in the budget of the film to $6.25 million. Tanaka and Nakano supervised suit-maker Nobuyuki Yasumaru in constructing a new Godzilla design, incorporating ears and four toes, features not seen since Godzilla Raids Again. Nakano insisted on infusing elements into the design that suggested sadness, such as downward-slanting eyes and sloping shoulders. Suit construction took two months, and consisted of separately casting body-part molds with urethane on a pre-built, life-size statue of the final design. Yasumaru personally took charge of all phases of suit-building, unlike in previous productions wherein the different stages of suit-production were handled by different craftsmen. The final suit was constructed to accommodate stuntman Hiroshi Yamawaki, but he declined suddenly, and was replaced by veteran suit actor Kenpachiro Satsuma, who had portrayed Hedorah and Gigan in the Shōwa era. Because the suit wasn't built to his measurements, Satsuma had difficulty performing, being able to last only ten minutes within it, and losing 12 pounds during filming. Hoping to avoid having Godzilla move in an overly human fashion, Nakano instructed Satsuma to base his actions on Noh, a traditional Japanese dance.
Taking inspiration from the publicity surrounding the 40-foot-tall King Kong model from Dino De Laurentiis's 1976 film of the same name, Toho spent a reported ¥52,146 (approximately $475.00) on a 16-foot-high robotic Godzilla (dubbed "Cybot") for use in close-up shots of the creature's head. The Cybot consisted of a hydraulically powered mechanical endoskeleton covered in urethane skin, containing 3,000 computer-operated parts which permitted it to tilt its head and move its lips and arms. Unlike previous Godzilla suits, whose lower jaws consisted of wire-operated flaps, the Cybot's jaws were hinged like those of an actual animal, and slid back as they opened. A life-size, crane-operated foot was also built for close-up shots of city destruction scenes.
produced in the Showa era. In Japan, the film was followed by Godzilla vs. Biollante in 1989. The Return of Godzilla stars Ken Tanaka, Yasuko Sawaguchi, Yosuke Natsuki, and Keiju Kobayashi, with Kenpachiro Satsuma as Godzilla. The film serves as both a sequel to the original 1954 film and a reboot of the franchise that ignores the events of every Shōwa era film aside from the original Godzilla, placing itself in line with the darker tone and themes of the original film and returning Godzilla to his destructive, antagonistic roots. The film was released theatrically in Japan on December 15, 1984. The following year, in the United States, New World Pictures released Godzilla 1985, a heavily re-edited American adaptation of the film which includes additional footage, and features Raymond Burr reprising his role from the 1956 film Godzilla, King of the Monsters!. Plot The Japanese fishing vessel Yahata-Maru is caught in strong currents off the shores of Daikoku Island. As the boat drifts into shore, the island begins to erupt, and a giant monster lifts itself out of the volcano. A few days later, reporter Goro Maki is sailing in the area and finds the vessel intact but deserted. As he explores the vessel, he finds all the crew dead except for Hiroshi Okumura, who has been badly wounded. Suddenly a giant Shockirus sea louse attacks him but he is saved by Okumura. In Tokyo, Okumura realizes by looking at pictures that the monster he saw was a new Godzilla. Maki writes an article about the account, but the news of Godzilla's return is kept secret and his article is withheld. Maki visits Professor Hayashida, whose parents were lost in the 1954 Godzilla attack. Hayashida describes Godzilla as a living, invincible nuclear weapon able to cause mass destruction. At Hayashida's laboratory, Maki meets Okumura's sister, Naoko, and informs her that her brother is alive and at the police hospital. A Soviet submarine is destroyed in the Pacific. 
The Soviets believe the attack was perpetrated by the Americans, and a diplomatic crisis ensues, which threatens to escalate into nuclear war. The Japanese intervene and reveal that Godzilla was behind the attacks. The Japanese cabinet meets to discuss Japan's defense. A new weapon is revealed, the Super X, a specially-armored flying fortress that will defend the capital. The Japanese military is put on alert. Godzilla attacks the Ihama nuclear power plant in Shizuoka Prefecture. While feeding off the reactor, he is distracted by a flock of birds and leaves the facility. Hayashida believes that Godzilla was distracted instinctively by a homing signal from the birds. Hayashida, together with geologist Minami, proposes to the Japanese Cabinet that Godzilla could be lured back to Mount Mihara on Ōshima Island by a similar signal, and that a volcanic eruption could be started, capturing Godzilla. Prime Minister Mitamura meets with Soviet and American envoys and declares that nuclear weapons will not be used on Godzilla, even if he were to attack the Japanese mainland. Meanwhile, the Soviets have their own plans to counter the threat posed by Godzilla, and a Soviet control ship disguised as a freighter in Tokyo Harbor prepares to launch a nuclear missile from one of their orbiting satellites should Godzilla attack. Godzilla is sighted at dawn in Tokyo Bay heading towards Tokyo, causing mass evacuations. The JASDF attacks Godzilla but fails to stop his advance on the city. Godzilla soon emerges and makes short work of the JSDF stationed there. The battle causes damage to the Soviet ship and starts a missile launch countdown. The captain dies as he attempts to stop the missile from launching. Godzilla proceeds towards Shinjuku, wreaking havoc along the way. Godzilla is confronted by four laser-armed trucks and the Super X.
Because Godzilla's heart is similar to a nuclear reactor, the cadmium shells that are fired into his mouth by the Super X seal and slow down his heart, knocking Godzilla unconscious. The countdown ends and the Soviet missile is launched, but it is destroyed by an American counter-missile. Hayashida and Okumura are extracted from Tokyo via helicopter and taken to Mt. Mihara to set up the homing device before the two missiles collide above Tokyo. The destruction of the nuclear missile produces an electrical storm and an EMP, which revives Godzilla once more and temporarily disables the Super X. An enraged Godzilla bears down on the Super X just as it manages to get airborne again. The Super X's weapons prove ineffective against the kaiju, resulting in even more destruction in the city as Godzilla chases it through several skyscrapers. Godzilla finally destroys the Super X by dropping a skyscraper on top of it and continues his rampage, until Hayashida uses the homing device to distract him. Godzilla leaves Tokyo and swims across Tokyo Bay, following the homing device to Mount Mihara. There, Godzilla follows the device and falls into the mouth of the volcano. Okumura activates detonators at the volcano, creating a controlled
as member states and became part of a French protectorship, Fichte delivered the famous Addresses to the German Nation (Reden an die deutsche Nation, 1807–1808), which attempted to define the German nation and guided the uprising against Napoleon. He became a professor at the new University of Berlin, founded in 1810. By the votes of his colleagues, Fichte was unanimously elected its rector in the succeeding year. But, once more, his impetuosity and reforming zeal led to friction, and he resigned in 1812. The campaign against Napoleon began, and the hospitals at Berlin were soon full of patients. Fichte's wife devoted herself to nursing and caught a virulent fever. Just as she was recovering, he became sick with typhus and died in 1814 at the age of 51. His son, Immanuel Hermann Fichte (18 July 1796 – 8 August 1879), also made contributions to philosophy.

Philosophical work

Fichte's critics argued that his mimicry of Kant's difficult style produced works that were barely intelligible. "He made no hesitation in pluming himself on his great skill in the shadowy and obscure, by often remarking to his pupils, that 'there was only one man in the world who could fully understand his writings; and even he was often at a loss to seize upon his real meaning.'" On the other hand, Fichte acknowledged the difficulty, but argued that his works were clear and transparent to those who made the effort to think without preconceptions and prejudices. Fichte did not endorse Kant's argument for the existence of noumena, of "things in themselves", the supra-sensible reality beyond direct human perception. Fichte saw the rigorous and systematic separation of "things in themselves" (noumena) and things "as they appear to us" (phenomena) as an invitation to skepticism. Rather than invite skepticism, Fichte made the radical suggestion that we should throw out the notion of a noumenal world and accept that consciousness does not have a grounding in a so-called "real world".
In fact, Fichte achieved fame for originating the argument that consciousness is not grounded in anything outside of itself. The phenomenal world as such arises from consciousness, the activity of the I, and moral awareness. His student (and critic), Arthur Schopenhauer, wrote: Søren Kierkegaard was also a student of the writings of Fichte:

Central theory

In Foundations of Natural Right (1797), Fichte argued that self-consciousness was a social phenomenon — an important step and perhaps the first clear step taken in this direction by modern philosophy. For Fichte, a necessary condition of every subject's self-awareness is the existence of other rational subjects. These others call or summon (fordern auf) the subject or self out of its unconsciousness and into an awareness of itself as a free individual. Fichte proceeds from the general principle that the I (das Ich) must posit itself as an individual in order to posit (setzen) itself at all, and that in order to posit itself as an individual, it must recognize itself as subject to a calling or summons (Aufforderung) by other free individual(s) — a call to limit its own freedom out of respect for the freedom of the others. The same condition applies to the others in their development. Mutual recognition (gegenseitig anerkennen) of rational individuals is a condition necessary for the individual I. The argument for intersubjectivity is central to the conception of selfhood developed in the Foundations of the Science of Knowledge (Grundlage der gesamten Wissenschaftslehre, 1794/1795). Fichte's consciousness of the self depends upon resistance or a check by something that is understood as not part of the self yet is not immediately ascribable to a particular sensory perception.
In his later 1796–99 lectures (his Nova methodo), Fichte incorporated this into his revised presentation of the foundations of his system, where the summons takes its place alongside original feeling, which takes the place of the earlier Anstoss (see below) as a limit on the absolute freedom and a condition for the positing of the I. The I posits this situation for itself. To posit does not mean to 'create' the objects of consciousness. The principle in question simply states that the essence of an I lies in the assertion of self-identity, i.e., that consciousness presupposes self-consciousness. Such immediate self-identity cannot be understood as a psychological fact, or an act or accident of some previously existing substance or being. It is an action of the I, but one that is identical with the very existence of this same I. In Fichte's technical terminology, the original unity of self-consciousness is an action and the product of the same I, as a "fact and/or act" (Thathandlung; Modern German: Tathandlung), a unity that is presupposed by and contained within every fact and every act of empirical consciousness, although it never appears as such. The I can posit itself only as limited. Moreover, it cannot even posit its own limitations, in the sense of producing or creating these limits. The finite I cannot be the ground of its own passivity. Instead, for Fichte, if the I is to posit itself, it must simply discover itself to be limited, a discovery that Fichte characterizes as an "impulse," "repulse," or "resistance" (Anstoss; Modern German: Anstoß) to the free practical activity of the I. Such an original limitation of the I is, however, a limit for the I only insofar as the I posits it as a limit. The I does this, according to Fichte's analysis, by positing its own limitation, first, as only a feeling, then as a sensation, then as an intuition of a thing, and finally as a summons of another person.
The Anstoss thus provides the essential impetus that first sets in motion the entire complex train of activities that finally result in our conscious experience both of ourselves and others as empirical individuals and of the world around us. Although Anstoss plays a similar role as the thing in itself does in Kantian philosophy, unlike Kant, Fichte's Anstoss is not something foreign to the I. Instead, it denotes the original encounter of the I with its own finitude. Rather than claim that the not-I (das Nicht-Ich) is the cause or ground of the Anstoss, Fichte argues that the not-I is posited by the I in order to explain the Anstoss to itself and thereby become conscious of it. The Wissenschaftslehre demonstrates that Anstoss must occur if self-consciousness is to come about but is unable to explain the actual occurrence of Anstoss. There are limits to what can be expected from an a priori deduction of experience, and this, for Fichte, equally applies to Kant's transcendental philosophy. According to Fichte, transcendental philosophy can explain that the world must have space, time, and causality, but it can never explain why objects have the particular sensible properties they happen to have or why I am this determinate individual rather than another. This is something that the I simply has to discover at the same time that it discovers its own freedom, and indeed, is a condition for the latter. Dieter Henrich (1966) proposed that Fichte was able to move beyond a "reflective theory of consciousness". According to Fichte, the self must already have some prior acquaintance with itself, independent of the act of reflection ("no object comes to consciousness except under the condition that I am aware of myself, the conscious subject [jedes Object kommt zum Bewusstseyn lediglich unter der Bedingung, dass ich auch meiner selbst, des bewusstseyenden Subjects mir bewusst sey]"). This idea is what Henrich called Fichte's original insight.
Nationalism

Between December 1807 and March 1808, Fichte gave a series of lectures concerning the "German nation" and its culture and language, projecting the kind of national education he hoped would raise it from the humiliation of its defeat at the hands of the French. Having been a supporter of Revolutionary France, Fichte became disenchanted by 1804 as Napoleon's armies advanced through Europe, occupying German territories, stripping them of their raw materials and subjugating them to foreign rule. He came to believe Germany would be responsible for carrying the virtues of the French Revolution into the future. Furthermore, his nationalism was not aroused by Prussian military defeat and humiliation, for these had not yet occurred, but resulted from his own humanitarian philosophy. Disappointed in the French, he turned to the German nation as the instrument of fulfilling it. These lectures, entitled the Addresses to the German Nation, coincided with a period of reform in the Prussian government, under the chancellorship of Baron vom Stein. The Addresses display Fichte's interest during that period in language and culture as vehicles of human spiritual development. Fichte built upon earlier ideas of Johann Gottfried Herder and attempted to unite them with his approach. The aim of the German nation, according to Fichte, was to "found an empire of spirit and reason, and to annihilate completely the crude physical force that rules the world." Like Herder's German nationalism, Fichte's was cultural, and grounded in the aesthetic, literary, and moral. However, Fichte's belief in a "Closed Commercial State", a state-dominated economy and society, should be noted – as should its kinship with certain 20th-century governments in Germany and elsewhere. The nationalism propounded by Fichte in the Addresses would be used over a century later by the Nazi Party in Germany, which saw in Fichte a forerunner to its own nationalist ideology.
As with Nietzsche, the association of Fichte with the Nazi regime came to colour readings of his German nationalism in the post-war period. This reading of Fichte was often bolstered through reference to an unpublished letter from 1793, Contributions to the Correction of the Public's Judgment concerning the French Revolution, wherein Fichte expressed anti-semitic sentiments, such as arguing against extending civil rights to Jews and calling them a "state within a state" that could "undermine" the German nation. However, attached to the letter is a footnote in which Fichte provides an impassioned plea for permitting Jews to practice their religion without hindrance. Furthermore, the final act of Fichte's academic career was to resign as rector of the University of Berlin in protest when his colleagues refused to punish the harassment of Jewish students. While recent scholarship has sought to dissociate Fichte's writings on nationalism from their adoption by the Nazi Party, the association continues to blight his legacy, although Fichte, as if to exclude all ground of doubt, clearly and distinctly prohibits, in his reworked version of The Science of Ethics as Based on the Science of Knowledge (see § Final period in Berlin), genocide and other crimes against humanity: If you say that it is your conscience's command to exterminate peoples for their sins, [...] we can confidently tell you that you are wrong; for such things can never be commanded against the free and moral force.

Economics

Fichte's 1800 economic treatise The Closed Commercial State had a profound influence on the economic theories of German Romanticism. In it, Fichte argues the need for the strictest, purely guild-like regulation of industry. The "exemplary rational state" (Vernunftstaat), Fichte argues, should not allow any of its "subjects" to engage in this or that production without first passing a preliminary test and being certified by government agents in their professional skills and agility.
According to Vladimir Mikhailovich Shulyatikov, "this kind of demand was typical of the Mittelstand, the German petty middle class, the class of artisans, hoping by creating artificial barriers to stop the victorious march of big capital and thus save themselves from inevitable death. The same demand was imposed on the state, as is evident from Fichte's treatise, by the German "factory" (Fabrike), more precisely, the manufacture of the early 19th century". Fichte opposed free trade and unrestrained capitalist industrial growth, stating: "There is an endless war of all against all ... And this war is becoming more fierce, unjust, more dangerous in its consequences, the more the world's population grows, the more acquisitions the trading state makes, the more production and art (industry) develops and, together with thus, the number of circulating goods increases, and with them the needs become more and more diversified. What, with the simple way of life of nations, was done before without great injustices and oppression, turns, thanks to increased needs, into flagrant injustice, into a source of great evils. The buyer tries to take the goods away from the seller; therefore he demands freedom of trade, i.e. freedom for the seller to wander around the markets, freedom not to find a sale for goods and sell them significantly below their value. Therefore, he requires strong competition between manufacturers (Fabrikanten) and merchants." The only means that could save the modern world, which would destroy evil at the root, is, according to Fichte, to split the "world state" (the global market) into separate self-sufficient bodies. Each such body, each "closed trading state" will be able to regulate its internal economic relations. It will be able to both extract and process everything that is needed to meet the needs of its citizens. It will carry out the ideal organization of production.
Fichte argued for government regulation of industrial growth, writing "Only by limitation does a certain industry become the property of the class that deals with it". Vladimir Mikhailovich Shulyatikov considers the economics of German idealists and Romantics as representing the compromise of the German bourgeoisie of the early 19th century with the monarchical State: The French physiocrats proclaimed the principle: "Laissez faire!" On the other hand, the German capitalists of the 1800s, whose ideologists were the objective idealists, professed a belief in the saving effect of government tutelage. Women Fichte believed that "active citizenship, civic freedom and even property rights should be withheld from women, whose calling was to subject themselves utterly to the authority of their fathers and husbands." Final period in Berlin Fichte gave a wide range of public and private lectures in Berlin from the last decade of his life. These form some of his best known work, and are the basis of a revived German-speaking scholarly interest in his work. The lectures include two works from 1806. In The Characteristics of the Present Age (Die Grundzüge des gegenwärtigen Zeitalters), Fichte outlines his theory of different historical and cultural epochs. His mystic work The Way Towards the Blessed Life (Die Anweisung zum seligen Leben oder auch die Religionslehre) gave his fullest thoughts on religion. In 1807-1808 he gave a series of speeches in French-occupied Berlin, Addresses to the German Nation. In 1810, the new University of Berlin was established, designed along ideas put forward by Wilhelm von Humboldt. Fichte was made its rector and also the first Chair of Philosophy. This was in part because of educational themes in the Addresses, and in part because of his earlier work at Jena University. Fichte lectured on further versions of his Wissenschaftslehre. 
Of these, he only published a brief work from 1810, The Science of Knowledge in its General Outline (Die Wissenschaftslehre, in ihrem allgemeinen Umrisse dargestellt; also translated as Outline of the Doctrine of Knowledge). His son published some of these thirty years after his death. Most only became public in the last decades of the twentieth century, in his collected works. This included reworked versions of the Doctrine of Science (Wissenschaftslehre, 1810–1813), The Science of Rights (Das System der Rechtslehre, 1812), and The Science of Ethics as Based on the Science of Knowledge (Das System der Sittenlehre nach den Principien der Wissenschaftslehre, 1812; 1st ed. 1798). Bibliography Selected works in German Wissenschaftslehre Ueber den Begriff der Wissenschaftslehre oder der sogenannten Philosophie (1794) Grundlage der gesamten Wissenschaftslehre (1794/1795) Wissenschaftslehre nova methodo (1796–1799: "Halle Nachschrift," 1796/1797 and "Krause Nachschrift," 1798/1799) Versuch einer neuen Darstellung der Wissenschaftslehre (1797/1798) Darstellung der Wissenschaftslehre (1801) Die Wissenschaftslehre (1804, 1812, 1813) Die Wissenschaftslehre, in ihrem allgemeinen Umrisse dargestellt (1810) Other works in German Versuch einer Critik aller Offenbarung (1792) Beitrag zur Berichtigung der Urteile des Publikums über die französische Revolution (1793) Einige Vorlesungen über die Bestimmung des Gelehrten (1794) Grundlage des Naturrechts (1796) Das System der Sittenlehre nach den Principien der Wissenschaftslehre (1798) "Ueber den Grund unsers Glaubens an eine göttliche Weltregierung" (1798) "Appellation an das Publikum über die durch Churf. Sächs. Confiscationsrescript ihm beigemessenen atheistischen Aeußerungen. Eine Schrift, die man zu lesen bittet, ehe man sie confsicirt" (1799) Der geschlossene Handelsstaat. 
Ein philosophischer Entwurf als Anhang zur Rechtslehre und Probe einer künftig zu liefernden Politik (1800) Die Bestimmung des Menschen (1800) Friedrich Nicolais Leben und sonderbare Meinungen (1801) Philosophie der Maurerei. Briefe an Konstant (1802/03) Die Grundzüge des gegenwärtigen Zeitalters (1806) Die Anweisung zum seligen Leben oder auch die Religionslehre (1806) Reden an die deutsche Nation (1807/1808) Das System der Rechtslehre (1812) Correspondence Jacobi an Fichte, German Text (1799/1816), with Introduction and Critical Apparatus by Marco Ivaldo and Ariberto Acerbi (Introduction, German Text, Italian Translation, 3 Appendices with Jacobi's and Fichte's complementary Texts, Philological Notes, Commentary, Bibliography, Index): Istituto Italiano per gli Studi Filosofici Press, Naples 2011, . Collected works in German The new standard edition of Fichte's works in German, which supersedes all previous editions, is the Gesamtausgabe ("Collected Works" or "Complete Edition", commonly abbreviated as GA), prepared by the Bavarian Academy of Sciences: Gesamtausgabe der Bayerischen Akademie der Wissenschaften, 42 volumes, edited by , Hans Gliwitzky, Erich Fuchs and Peter Schneider, Stuttgart-Bad Cannstatt: Frommann-Holzboog, 1962–2012. It is organized into four parts:
us" (phenomena) as an invitation to skepticism. Rather than invite skepticism, Fichte made the radical suggestion that we should throw out the notion of a noumenal world and accept that consciousness does not have a grounding in a so-called "real world". In fact, Fichte achieved fame for originating the argument that consciousness is not grounded in anything outside of itself. The phenomenal world as such arises from consciousness: the activity of the I and moral awareness. His student (and critic), Arthur Schopenhauer, wrote: Søren Kierkegaard was also a student of the writings of Fichte: Central theory In Foundations of Natural Right (1797), Fichte argued that self-consciousness was a social phenomenon — an important step and perhaps the first clear step taken in this direction by modern philosophy. For Fichte, a necessary condition of every subject's self-awareness is the existence of other rational subjects. These others call or summon (fordern auf) the subject or self out of its unconsciousness and into an awareness of itself as a free individual. Fichte proceeds from the general principle that the I (das Ich) must posit itself as an individual in order to posit (setzen) itself at all, and that in order to posit itself as an individual, it must recognize itself as answering a calling or summons (Aufforderung) by other free individual(s) — a call to limit its own freedom out of respect for the freedom of the others. The same condition applies to the development of the others. Mutual recognition (gegenseitig anerkennen) of rational individuals is a condition necessary for the individual I. The argument for intersubjectivity is central to the conception of selfhood developed in the Foundations of the Science of Knowledge (Grundlage der gesamten Wissenschaftslehre, 1794/1795). Fichte's consciousness of the self depends upon resistance or a check by something that is understood as not part of the self, yet not immediately ascribable to a particular sensory perception.
In his later 1796–99 lectures (his Nova methodo), Fichte incorporated this into his revised presentation of the foundations of his system, where the summons takes its place alongside original feeling, which takes the place of the earlier Anstoss (see below) as a limit on the absolute freedom and a condition for the positing of the I. The I posits this situation for itself. To posit does not mean to 'create' the objects of consciousness. The principle in question simply states that the essence of an I lies in the assertion of self-identity, i.e., that consciousness presupposes self-consciousness. Such immediate self-identity cannot be understood as a psychological fact, or an act or accident of some previously existing substance or being. It is an action of the I, but one that is identical with the very existence of this same I. In Fichte's technical terminology, the original unity of self-consciousness is both an action and the product of the same I, as a "fact and/or act" (Thathandlung; Modern German: Tathandlung), a unity that is presupposed by and contained within every fact and every act of empirical consciousness, although it never appears as such. The I can posit itself only as limited. Moreover, it cannot even posit its own limitations, in the sense of producing or creating these limits. The finite I cannot be the ground of its own passivity. Instead, for Fichte, if the I is to posit itself, it must simply discover itself to be limited, a discovery that Fichte characterizes as an "impulse," "repulse," or "resistance" (Anstoss; Modern German: Anstoß) to the free practical activity of the I. Such an original limitation of the I is, however, a limit for the I only insofar as the I posits it as a limit. The I does this, according to Fichte's analysis, by positing its own limitation, first, as only a feeling, then as a sensation, then as an intuition of a thing, and finally as a summons of another person.
The Anstoss thus provides the essential impetus that first sets in motion the entire complex train of activities that finally result in our conscious experience both of ourselves and others as empirical individuals and of the world around us. Although the Anstoss plays a role similar to that of the thing-in-itself in Kantian philosophy, unlike Kant's, Fichte's Anstoss is not something foreign to the I. Instead, it denotes the original encounter of the I with its own finitude. Rather than claim that the not-I (das Nicht-Ich) is the cause or ground of the Anstoss, Fichte argues that the not-I is posited by the I in order to explain the Anstoss to itself, that is, in order to become conscious of it. The Wissenschaftslehre demonstrates that the Anstoss must occur if self-consciousness is to come about, but it is unable to explain the actual occurrence of the Anstoss. There are limits to what can be expected from an a priori deduction of experience, and this, for Fichte, equally applies to Kant's transcendental philosophy. According to Fichte, transcendental philosophy can explain that the world must have space, time, and causality, but it can never explain why objects have the particular sensible properties they happen to have or why I am this determinate individual rather than another. This is something that the I simply has to discover at the same time that it discovers its own freedom, and indeed, is a condition for the latter. Dieter Henrich (1966) proposed that Fichte was able to move beyond a "reflective theory of consciousness". According to Fichte, the self must already have some prior acquaintance with itself, independent of the act of reflection ("no object comes to consciousness except under the condition that I am aware of myself, the conscious subject [jedes Object kommt zum Bewusstseyn lediglich unter der Bedingung, dass ich auch meiner selbst, des bewusstseyenden Subjects mir bewusst sey]"). This idea is what Henrich called Fichte's original insight.
Nationalism Between December 1807 and March 1808, Fichte gave a series of lectures concerning the "German nation" and its culture and language, projecting the kind of national education he hoped would raise it from the humiliation of its defeat at the hands of the French. Having been a supporter of Revolutionary France, Fichte became disenchanted by 1804 as Napoleon's armies advanced through Europe, occupying German territories, stripping them of their raw materials and subjugating them to foreign rule. He came to believe Germany would be responsible for carrying the virtues of the French Revolution into the future. Furthermore, his nationalism was not aroused by Prussian military defeat and humiliation, for these had not yet occurred, but resulted from his own humanitarian philosophy. Disappointed in the French, he turned to the German nation as the instrument of its fulfilment. These lectures, entitled the Addresses to the German Nation, coincided with a period of reform in the Prussian government, under the chancellorship of Baron vom Stein. The Addresses display Fichte's interest during that period in language and culture as vehicles of human spiritual development. Fichte built upon earlier ideas of Johann Gottfried Herder and attempted to unite them with his approach. The aim of the German nation, according to Fichte, was to "found an empire of spirit and reason, and to annihilate completely the crude physical force that rules the world." Like Herder's German nationalism, Fichte's was cultural, and grounded in the aesthetic, literary, and moral. However, Fichte's belief in a "Closed Commercial State", a state-dominated economy and society, should be noted – as should its kinship with certain 20th-century governments in Germany and elsewhere. The nationalism propounded by Fichte in the Addresses would be used over a century later by the Nazi Party in Germany, which saw in Fichte a forerunner to its own nationalist ideology.
Like Nietzsche, the association of Fichte with the Nazi regime came to colour readings of Fichte's German nationalism in the post-war period. This reading of Fichte was often bolstered through reference to an unpublished letter from 1793, Contributions to the Correction of the Public's Judgment concerning the French Revolution, wherein Fichte expressed antisemitic sentiments, such as arguing against extending civil rights to Jews and calling them a "state within a state" that could "undermine" the German nation. However, attached to the letter is a footnote in which Fichte provides an impassioned plea for permitting Jews to practice their religion without hindrance. Furthermore, the final act of Fichte's academic career was to resign as rector of the University of Berlin in protest when his colleagues refused to punish the harassment of Jewish students. While recent scholarship has sought to dissociate Fichte's writings on nationalism from their adoption by the Nazi Party, the association continues to blight his legacy, although Fichte, as if to exclude all ground of doubt, clearly and distinctly prohibits, in his reworked version of The Science of Ethics as Based on the Science of Knowledge (see § Final period in Berlin), genocide and other crimes against humanity: If you say that it is your conscience's command to exterminate peoples for their sins, [...] we can confidently tell you that you are wrong; for such things can never be commanded against the free and moral force. Economics Fichte's 1800 economic treatise The Closed Commercial State had a profound influence on the economic theories of German Romanticism. In it, Fichte argues for the strictest, purely guild-like regulation of industry. The "exemplary rational state" (Vernunftstaat), Fichte argues, should not allow any of its "subjects" to engage in a given branch of production without first passing a preliminary test and certifying their professional skill and dexterity to government agents.
According to Vladimir Mikhailovich Shulyatikov, "this kind of demand was typical of the Mittelstand, the German petty middle class, the class of artisans, hoping by creating artificial barriers to stop the victorious march of big capital and thus save themselves from inevitable death. The same demand was imposed on the state, as is evident from Fichte's treatise, by the German "factory" (Fabrik), more precisely, the manufacture of the early 19th century". Fichte opposed free trade and unrestrained capitalist industrial growth, stating: "There is an endless war of all against all ... And this war is becoming more fierce, unjust, more dangerous in its consequences, the more the world's population grows, the more acquisitions the trading state makes, the more production and art (industry) develops and, together with this, the number of circulating goods increases, and with them the needs become more and more diversified. What, with the simple way of life of nations, was done before without great injustices and oppression, turns, thanks to increased needs, into flagrant injustice, into a source of great evils. The buyer tries to take the goods away from the seller; therefore he demands freedom of trade, i.e. freedom for the seller to wander around the markets, freedom not to find a sale for goods and to sell them significantly below their value. Therefore, he requires strong competition between manufacturers (Fabrikanten) and merchants." The only means that could save the modern world, destroying evil at the root, is, according to Fichte, to split the "world state" (the global market) into separate self-sufficient bodies. Each such body, each "closed trading state", will be able to regulate its internal economic relations. It will be able to both extract and process everything that is needed to meet the needs of its citizens. It will carry out the ideal organization of production.
Fichte argued for government regulation of industrial growth, writing "Only by limitation does a certain industry become the property of the class that deals with it". Vladimir Mikhailovich Shulyatikov considers the economics of the German idealists and Romantics as representing the compromise of the German bourgeoisie of the early 19th century with the monarchical State: The French physiocrats proclaimed the principle "Laissez faire!" The German capitalists of the 1800s, on the other hand, whose ideologists were the objective idealists, professed a belief in the saving effect of government tutelage. Women Fichte believed that "active citizenship, civic freedom and even property rights should be withheld from women, whose calling was to subject themselves utterly to the authority of their fathers and husbands." Final period in Berlin Fichte gave a wide range of public and private lectures in Berlin in the last decade of his life. These form some of his best-known work, and are the basis of a revived German-speaking scholarly interest in his work. The lectures include two works from 1806. In The Characteristics of the Present Age (Die Grundzüge des gegenwärtigen Zeitalters), Fichte outlines his theory of different historical and cultural epochs. His mystic work The Way Towards the Blessed Life (Die Anweisung zum seligen Leben oder auch die Religionslehre) gave his fullest thoughts on religion. In 1807–1808 he gave a series of speeches in French-occupied Berlin, the Addresses to the German Nation. In 1810, the new University of Berlin was established, designed along ideas put forward by Wilhelm von Humboldt. Fichte was made its rector and also the first Chair of Philosophy. This was in part because of educational themes in the Addresses, and in part because of his earlier work at Jena University. Fichte lectured on further versions of his Wissenschaftslehre.
Of these, he only published a brief work from 1810, The Science of Knowledge in its General Outline (Die Wissenschaftslehre, in ihrem allgemeinen Umrisse dargestellt; also translated as Outline of the Doctrine of Knowledge). His son published some of these thirty years after his death. Most only became public in the last decades of the twentieth century, in his collected works. These included reworked versions of the Doctrine of Science (Wissenschaftslehre, 1810–1813), The Science of Rights (Das System der Rechtslehre, 1812), and The Science of Ethics as Based on the Science of Knowledge (Das System der Sittenlehre nach den Principien der Wissenschaftslehre, 1812; 1st ed. 1798). Bibliography Selected works in German Wissenschaftslehre Ueber den Begriff der Wissenschaftslehre oder der sogenannten Philosophie (1794) Grundlage der gesamten Wissenschaftslehre (1794/1795) Wissenschaftslehre nova methodo (1796–1799: "Halle Nachschrift," 1796/1797 and "Krause Nachschrift," 1798/1799) Versuch einer neuen Darstellung der Wissenschaftslehre (1797/1798) Darstellung der Wissenschaftslehre (1801) Die Wissenschaftslehre (1804, 1812, 1813) Die Wissenschaftslehre, in ihrem allgemeinen Umrisse dargestellt (1810) Other works in German Versuch einer Critik aller Offenbarung (1792) Beitrag zur Berichtigung der Urteile des Publikums über die französische Revolution (1793) Einige Vorlesungen über die Bestimmung des Gelehrten (1794) Grundlage des Naturrechts (1796) Das System der Sittenlehre nach den Principien der Wissenschaftslehre (1798) "Ueber den Grund unsers Glaubens an eine göttliche Weltregierung" (1798) "Appellation an das Publikum über die durch Churf. Sächs. Confiscationsrescript ihm beigemessenen atheistischen Aeußerungen. Eine Schrift, die man zu lesen bittet, ehe man sie confiscirt" (1799) Der geschlossene Handelsstaat.
Ein philosophischer Entwurf als Anhang zur Rechtslehre und Probe einer künftig zu liefernden Politik (1800) Die Bestimmung des Menschen (1800) Friedrich Nicolais Leben und sonderbare Meinungen (1801) Philosophie der Maurerei. Briefe an Konstant (1802/03) Die Grundzüge des gegenwärtigen Zeitalters (1806) Die Anweisung zum seligen Leben oder auch die Religionslehre (1806) Reden an die deutsche Nation (1807/1808) Das System der Rechtslehre (1812) Correspondence Jacobi an Fichte, German Text (1799/1816), with Introduction and Critical Apparatus by Marco Ivaldo and Ariberto Acerbi (Introduction, German Text, Italian Translation, 3 Appendices with Jacobi's and Fichte's complementary Texts, Philological Notes, Commentary, Bibliography, Index): Istituto Italiano per gli Studi Filosofici Press, Naples 2011, . Collected works in German The new standard edition of Fichte's works in German, which supersedes all previous editions, is the Gesamtausgabe ("Collected Works" or "Complete Edition", commonly abbreviated as GA), prepared by the Bavarian Academy of Sciences: Gesamtausgabe der Bayerischen Akademie der Wissenschaften, 42 volumes, edited by , Hans Gliwitzky, Erich Fuchs and Peter Schneider, Stuttgart-Bad Cannstatt: Frommann-Holzboog, 1962–2012. It is organized into four parts: Part I: Published Works Part II: Unpublished Writings Part III: Correspondence Part IV: Lecture Transcripts Fichte's works are quoted and cited from GA, followed by a combination of Roman and Arabic numbers, indicating the series and volume, respectively, and the page number(s). Another edition is Johann Gottlieb Fichtes sämmtliche Werke (abbrev. SW), ed. I. H. Fichte. Berlin: de Gruyter, 1971. Selected works in English Concerning the Conception of the Science of Knowledge Generally (Ueber den Begriff der Wissenschaftslehre oder der sogenannten Philosophie, 1794), translated by Adolph Ernst Kroeger. In The Science of Knowledge, pp. 331–336. Philadelphia: J.B. Lippincott & Co., 1868. 
Rpt., London: Trübner & Co., 1889. Attempt at a Critique of All Revelation. Trans. Garrett Green. New York: Cambridge University Press, 1978. (Translation of Versuch einer Critik aller Offenbarung, 1st ed. 1792, 2nd ed. 1793.) Early Philosophical Writings. Trans. and ed. Daniel Breazeale. Ithaca: Cornell University Press, 1988. (Contains Selections from Fichte's Writings and Correspondence from the Jena period, 1794–1799). Foundations of the Entire Science of Knowledge. Translation of: Grundlage der gesammten Wissenschaftslehre (1794/95, 2nd ed. 1802), Fichte's first major exposition of the Wissenschaftslehre. In: Foundations of Natural Right. Trans. Michael Baur. Ed. Frederick Neuhouser. Cambridge: Cambridge University Press, 2000. (Translation of Grundlage des Naturrechts, 1796/97.) Foundations of Transcendental Philosophy (Wissenschaftslehre) Nova Methodo [FTP]. Trans. and ed. Daniel Breazeale. Ithaca, NY: Cornell University
lakes. This corresponds to thinking of lakes Erie and Ontario as "down south" and the others as "up north". Vessels sailing north on Lake Michigan are considered "upbound" even though they are sailing toward its effluent current. Primary connecting waterways The Chicago River and Calumet River systems connect the Great Lakes Basin to the Mississippi River System through man-made alterations and canals. The St. Marys River, including the Soo Locks, connects Lake Superior to Lake Huron, via the North Channel. The Straits of Mackinac connect Lake Michigan to Lake Huron (which are hydrologically one). The St. Clair River connects Lake Huron to Lake St. Clair. The Detroit River connects Lake St. Clair to Lake Erie. The Niagara River, including Niagara Falls, connects Lake Erie to Lake Ontario. The Welland Canal, bypassing the Niagara River, connects Lake Erie to Lake Ontario. The Saint Lawrence River and the Saint Lawrence Seaway connect Lake Ontario to the Gulf of Saint Lawrence, which connects to the Atlantic Ocean. Lake Michigan–Huron Lakes Huron and Michigan are sometimes considered a single lake, called Lake Michigan–Huron, because they are one hydrological body of water connected by the Straits of Mackinac. The straits are wide and deep; the water levels rise and fall together, and the flow between Michigan and Huron frequently reverses direction. Large bays and related significant bodies of water Lake Nipigon, connected to Lake Superior by the Nipigon River, is surrounded by sill-like formations of mafic and ultramafic igneous rock hundreds of meters high. The lake lies in the Nipigon Embayment, a failed arm of the triple junction (centered beneath Lake Superior) in the Midcontinent Rift System event, estimated at 1.1 billion years ago. Green Bay is an arm of Lake Michigan along the south coast of the Upper Peninsula of Michigan and the east coast of Wisconsin. 
It is separated from the rest of the lake by the Door Peninsula in Wisconsin, the Garden Peninsula in Michigan, and the chain of islands between them, all of which were formed by the Niagara Escarpment. Lake Winnebago, connected to Green Bay by the Fox River, serves as part of the Fox–Wisconsin Waterway and is part of a larger system of lakes in Wisconsin known as the Winnebago Pool. Grand Traverse Bay is an arm of Lake Michigan on Michigan's west coast and is one of the largest natural harbors in the Great Lakes. The bay has one large peninsula and one major island known as Power Island. Its name is derived from Jacques Marquette's crossing of the bay from Norwood to Northport, which he called La Grande Traversee. Georgian Bay is an arm of Lake Huron, extending northeast from the lake entirely within Ontario. The bay, along with its narrow westerly extensions of the North Channel and Mississagi Strait, is separated from the rest of the lake by the Bruce Peninsula, Manitoulin Island, and Cockburn Island, all of which were formed by the Niagara Escarpment. Lake Nipissing, connected to Georgian Bay by the French River, contains two volcanic pipes, which are the Manitou Islands and Callander Bay. These pipes were formed by a violent, supersonic eruption of deep origin. The lake lies in the Ottawa-Bonnechere Graben, a Mesozoic rift valley that formed 175 million years ago. Lake Simcoe, connected to Georgian Bay by the Severn River, serves as part of the Trent–Severn Waterway, a canal route traversing Southern Ontario between Lakes Ontario and Huron. Lake St. Clair, connected with Lake Huron to its north by the St. Clair River and with Lake Erie to its south by the Detroit River.
Although it is only about one-seventeenth the area of Lake Ontario and only rarely included in the listings of the Great Lakes, proposals for its official recognition as a Great Lake are occasionally made, which would affect its inclusion in scientific research projects designated as related to "The Great Lakes". Saginaw Bay, an extension of Lake Huron into the Lower Peninsula of Michigan, fed by the Saginaw and other rivers, has the largest contiguous freshwater wetland in the United States. Islands Dispersed throughout the Great Lakes are approximately 35,000 islands. The largest among them is Manitoulin Island in Lake Huron, the largest island in any inland body of water in the world. The second-largest island is Isle Royale in Lake Superior. Both of these islands are large enough to contain multiple lakes themselves—for instance, Manitoulin Island's Lake Manitou is the world's largest lake on a freshwater island. Some of these lakes even have their own islands, like Treasure Island in Lake Mindemoya on Manitoulin Island. Peninsulas The Great Lakes also have several peninsulas between them, including the Door Peninsula, the Peninsulas of Michigan, and the Ontario Peninsula. Some of these peninsulas even contain smaller peninsulas, such as the Keweenaw Peninsula, the Thumb Peninsula, the Bruce Peninsula, and the Niagara Peninsula. Population centers on the peninsulas include Grand Rapids, Flint, and Detroit in Michigan along with London, Hamilton, Brantford, and Toronto in Ontario. Shipping connection to the ocean Although the Saint Lawrence Seaway and Great Lakes Waterway make the Great Lakes accessible to ocean-going vessels, shifts in shipping to wider ocean-going container ships—which do not fit through the locks on these routes—have limited container shipping on the lakes. Most Great Lakes trade is of bulk material, and bulk freighters of Seawaymax-size or less can move throughout the entire lakes and out to the Atlantic.
Larger ships are confined to working within the lakes. Only barges can access the Illinois Waterway system, which provides access to the Gulf of Mexico via the Mississippi River. Despite their vast size, large sections of the Great Lakes freeze over in winter, interrupting most shipping from January to March. Some icebreakers ply the lakes, keeping the shipping lanes open through other periods of ice. The Great Lakes are connected by the Chicago Sanitary and Ship Canal to the Gulf of Mexico via the Illinois River (from the Chicago River) and the Mississippi River. An alternate track is via the Illinois River (from Chicago) to the Mississippi, up the Ohio, and then through the Tennessee–Tombigbee Waterway (a combination of a series of rivers, lakes, and canals) to Mobile Bay and the Gulf of Mexico. Commercial tug-and-barge traffic on these waterways is heavy. Pleasure boats can enter or exit the Great Lakes by way of the Erie Canal and Hudson River in New York. The Erie Canal connects to the Great Lakes at the east end of Lake Erie (at Buffalo, New York) and at the south side of Lake Ontario (at Oswego, New York). Water levels The lakes were originally fed by both precipitation and meltwater from glaciers which are no longer present. In modern times, only about 1% of volume per year is "new" water, originating from rivers, precipitation, and groundwater springs. In the post-glacial period, evaporation and drainage have generally been balanced, making the levels of the lakes relatively constant. Intensive human population growth began in the region in the 20th century and continues today. At least two human water-use activities have been identified as having the potential to affect the lakes' levels: diversion (the transfer of water to other watersheds) and consumption (substantially done today by the use of lake water to power and cool electric generation plants, resulting in evaporation).
Outflow through the Chicago Sanitary and Ship Canal is more than balanced by artificial inflows via the Ogoki River and Long Lake/Kenogami River diversions. Fluctuation of the water levels in the lakes has been observed since records began in 1918. The water level of Lake Michigan–Huron remained fairly constant over the 20th century. Recent lake levels include record low levels in 2013 in Lakes Superior, Erie, and Michigan–Huron, followed by record high levels in 2020 in the same lakes. The water level in Lake Ontario has remained relatively constant in the same period, hovering around the historical average level. The lake levels are affected primarily by changes in regional meteorology and climatology. The outflows from Lakes Superior and Ontario are regulated, while the outflows of Michigan–Huron and Erie are not regulated at all. Ontario is the most tightly regulated, with its outflow controlled by the Moses-Saunders Power Dam, which explains its consistent historical levels. Etymology Lake Erie From the Erie tribe, a shortened form of the Iroquoian word for 'long tail'. Lake Huron Named for the inhabitants of the area, the Wyandot (or "Hurons"), by the first French explorers. The Wyandot originally referred to the lake by a name which has been variously translated as "Freshwater Sea", "Lake of the Hurons", or simply "lake". Lake Michigan From the Ojibwe word for "great water" or "large lake". Lake Ontario From the Wyandot word for "lake of shining waters". Lake Superior English translation of the French term "upper lake", referring to its position north of Lake Huron. The indigenous Ojibwe name derives from Ojibwe words for "big, large, great" and "water, lake, sea". It was popularized in French-influenced transliteration as Gitchigumi, as in Gordon Lightfoot's 1976 story song "The Wreck of the Edmund Fitzgerald", or Gitchee Gumee, as in Henry Wadsworth Longfellow's 1855 epic poem The Song of Hiawatha.
Statistics The Great Lakes contain 21% of the world's surface fresh water: 6.0×10^15 (6 quadrillion) U.S. gallons, or 2.3×10^16 liters. The lakes contain about 84% of the surface freshwater of North America; if the water were evenly distributed over the entire continent's land area, it would reach a depth of 5 feet (1.5 meters). This is enough water to cover the 48 contiguous U.S. states to a uniform depth of several feet. Although the lakes contain a large percentage of the world's fresh water, the Great Lakes supply only a small portion of U.S. drinking water on a national basis. The total surface area of the lakes is approximately the same as that of the United Kingdom, and larger than the U.S. states of New York, New Jersey, Connecticut, Rhode Island, Massachusetts, Vermont, and New Hampshire combined. The Great Lakes coast is extensive, but the length of a coastline is impossible to measure exactly and is not a well-defined measure. The coastline is divided between Canada and the United States. Michigan has the longest shoreline of the United States, followed by Wisconsin, New York, and Ohio. Traversing the shoreline of all the lakes would cover a distance roughly equivalent to travelling half-way around the world at the equator. A notable modern phenomenon is the formation of ice volcanoes over the lakes during wintertime. Storm-generated waves carve the lakes' ice sheet and create conical mounds through the eruption of water and slush. The process is only well-documented in the Great Lakes, and has been credited with sparing the southern shorelines from worse rocky erosion.
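The volume figures above can be cross-checked with a few lines of arithmetic (a minimal sketch; the gallon-to-liter factor is the standard US definition, and the cubic-kilometer figure is derived here rather than quoted from the text):

```python
# Cross-check: 6.0e15 US gallons should be consistent with 2.3e16 liters.
US_GALLON_IN_LITERS = 3.785411784  # definition of the US gallon

gallons = 6.0e15
liters = gallons * US_GALLON_IN_LITERS
cubic_km = liters / 1e12  # 1 cubic kilometer = 1e12 liters

print(f"{liters:.2e} liters")   # ~2.27e16, which rounds to the quoted 2.3e16
print(f"{cubic_km:,.0f} km^3")
```

The two quoted figures agree to within rounding.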
Geology It has been estimated that the foundational geology that created the conditions shaping the present day upper Great Lakes was laid from 1.1 to 1.2 billion years ago, when two previously fused tectonic plates split apart and created the Midcontinent Rift, which crossed the Great Lakes Tectonic Zone. A valley was formed providing a basin that eventually became modern day Lake Superior. When a second fault line, the Saint Lawrence rift, formed approximately 570 million years ago, the basis for Lakes Ontario and Erie was created, along with what would become the Saint Lawrence River. The Great Lakes are estimated to have been formed at the end of the Last Glacial Period (the Wisconsin glaciation ended 10,000 to 12,000 years ago), when the Laurentide Ice Sheet receded. The retreat of the ice sheet left behind a large amount of meltwater (Lake Algonquin, Lake Chicago, Glacial Lake Iroquois, and Champlain Sea) that filled up the basins that the glaciers had carved, thus creating the Great Lakes as we know them today. Because of the uneven nature of glacier erosion, some higher hills became Great Lakes islands. The Niagara Escarpment follows the contour of the Great Lakes between New York and Wisconsin. Land below the glaciers "rebounded" as it was uncovered. Since the glaciers covered some areas longer than others, this glacial rebound occurred at different rates. Climate The Great Lakes have a humid continental climate, Köppen climate classification Dfa (in southern areas) and Dfb (in northern parts) with varying influences from air masses from other regions including dry, cold Arctic systems, mild Pacific air masses from the west, and warm, wet tropical systems from the south and the Gulf of Mexico. The lakes have a moderating effect on the climate; they can also increase precipitation totals and produce lake effect snowfall. Lake effect The Great Lakes can have an effect on regional weather called lake-effect snow, which is sometimes very localized. 
Even late in winter, the lakes often have no icepack in the middle. The prevailing winds from the west pick up air and moisture from the lake surface, which is slightly warmer than the cold air above. As the slightly warmer, moist air passes over the colder land surface, the moisture often produces concentrated, heavy snowfall that sets up in bands or "streamers". This is similar to the effect of warmer air dropping snow as it passes over mountain ranges. During freezing weather with high winds, the "snowbelts" receive regular snowfall from this localized weather pattern, especially along the eastern shores of the lakes. Snowbelts are found in Wisconsin, Michigan, Ohio, Pennsylvania, New York, and Ontario. Related to the lake effect is the regular occurrence of fog, particularly along the shorelines of the lakes. This is most noticeable along Lake Superior's shores. The lakes tend to moderate seasonal temperatures to some degree, but not with as large an influence as large oceans; they absorb heat and cool the air in summer, then slowly radiate that heat in autumn. They protect against frost during transitional weather and keep summertime temperatures cooler than farther inland. This effect can be very localized and overridden by offshore wind patterns. This temperature buffering produces areas known as "fruit belts", where fruit typically grown much farther south can be produced. For instance, western Michigan has apple orchards, and cherry orchards are cultivated adjacent to the lake shore as far north as Grand Traverse Bay. Near Collingwood, Ontario, commercial fruit orchards, including a few wineries,
forest Habitats of the Indiana Dunes Plant lists include: List of Michigan flowers List of Minnesota wild flowers List of Minnesota trees Logging Logging of the extensive forests in the Great Lakes region removed riparian and adjacent tree cover over rivers and streams, which had provided shade, moderating water temperatures in fish spawning grounds. Removal of trees also destabilized the soil, with greater volumes washed into stream beds, causing siltation of gravel beds and more frequent flooding. Running cut logs down the tributary rivers into the Great Lakes also dislocated sediments. In 1884, the New York Fish Commission determined that the dumping of sawmill waste (chips and sawdust) had impacted fish populations. Pollution The first U.S. Clean Water Act, passed by a Congressional override after being vetoed by U.S. President Richard Nixon in 1972, was a key piece of legislation, along with the bi-national Great Lakes Water Quality Agreement signed by Canada and the U.S. A variety of steps taken to process industrial and municipal pollution discharges into the system greatly improved water quality by the 1980s, and Lake Erie in particular is significantly cleaner. Discharge of toxic substances has been sharply reduced. Federal and state regulations control substances like PCBs. The first of 43 "Great Lakes Areas of Concern" to be formally "de-listed" through successful cleanup was Ontario's Collingwood Harbour in 1994; Ontario's Severn Sound followed in 2003. Presque Isle Bay in Pennsylvania is formally listed as in recovery, as is Ontario's Spanish Harbour. Dozens of other Areas of Concern have received partial cleanups, such as the Rouge River (Michigan) and Waukegan Harbor (Illinois). Phosphate detergents were historically a major source of nutrient loading behind the Great Lakes algae blooms, particularly in the warmer and shallower portions of the system such as Lake Erie, Saginaw Bay, Green Bay, and the southernmost portion of Lake Michigan.
By the mid-1980s, most jurisdictions bordering the Great Lakes had controlled phosphate detergents. Blue-green algae, or cyanobacteria, blooms have been problematic on Lake Erie since 2011. "Not enough is being done to stop fertilizer and phosphorus from getting into the lake and causing blooms," said Michael McKay, executive director of the Great Lakes Institute for Environmental Research (GLIER) at the University of Windsor. The largest Lake Erie bloom to date occurred in 2015, with a severity index of 10.5, exceeding the 2011 bloom's index of 10. In early August 2019, satellite images depicted a bloom stretching up to 1,300 square kilometres on Lake Erie, with the heaviest concentration near Toledo, Ohio. "A large bloom does not necessarily mean the cyanobacteria ... will produce toxins," said McKay. Water quality testing was underway in August 2019. Mercury Until 1970, mercury was not listed as a harmful chemical, according to the United States Federal Water Quality Administration. In the 21st century, mercury has become more apparent in water tests. Mercury compounds have been used in paper mills to prevent slime from forming during production, and chemical companies have used mercury to separate chlorine from brine solutions. Studies conducted by the Environmental Protection Agency have shown that when mercury comes in contact with many of the bacteria and compounds in fresh water, it forms the compound methylmercury, which has a much greater impact on human health than elemental mercury due to a higher propensity for absorption. This form of mercury is not detrimental to a majority of fish types, but is very detrimental to people and other animals who consume the fish. Mercury has been linked to health problems such as birth defects in humans and animals, and to the near extinction of eagles in the Great Lakes region.
Sewage The amount of raw sewage dumped into the waters was the primary focus of both the first Great Lakes Water Quality Agreement and federal laws passed in both countries during the 1970s. Implementation of secondary treatment of municipal sewage by major cities greatly reduced the routine discharge of untreated sewage during the 1970s and 1980s. The International Joint Commission in 2009 summarized the change: "Since the early 1970s, the level of treatment to reduce pollution from waste water discharges to the Great Lakes has improved considerably. This is a result of significant expenditures to date on both infrastructure and technology, and robust regulatory systems that have proven to be, on the whole, quite effective." The commission reported that all urban sewage treatment systems on the U.S. side of the lakes had implemented secondary treatment, as had all on the Canadian side except for five small systems. Though contrary to federal laws in both countries, those treatment system upgrades have not yet eliminated combined sewer overflow events. This describes when older sewerage systems, which combine storm water with sewage into single sewers heading to the treatment plant, are temporarily overwhelmed by heavy rainstorms. Local sewage treatment authorities then must release untreated effluent, a mix of rainwater and sewage, into local water bodies. While enormous public investments such as the Deep Tunnel projects in Chicago and Milwaukee have greatly reduced the frequency and volume of these events, they have not been eliminated. The number of such overflow events in Ontario, for example, is flat according to the International Joint Commission. Reports about this issue on the U.S. side highlight five large municipal systems (those of Detroit, Cleveland, Buffalo, Milwaukee and Gary) as being the largest current periodic sources of untreated discharges into the Great Lakes. 
Impacts of climate change on algae Algae such as diatoms, along with other phytoplankton, are photosynthetic primary producers supporting the food web of the Great Lakes, and have been affected by global warming. Changes in the size or in the function of the primary producers may have a direct or an indirect impact on the food web. Photosynthesis carried out by diatoms constitutes about one fifth of the total photosynthesis. By taking CO2 out of the water to photosynthesize, diatoms help to stabilize the pH of the water, as CO2 would otherwise react with water to produce carbonic acid. Diatoms acquire inorganic carbon through passive diffusion of CO2 and bicarbonate, and use carbonic anhydrase-mediated active transport to speed up this process. Large diatoms require more carbon uptake than smaller diatoms. There is a positive correlation between the surface area and the chlorophyll concentration of diatom cells. History Several Native American populations (Paleo-Indians) inhabited the region around 10,000 BC, after the end of the Wisconsin glaciation. The peoples of the Great Lakes traded with the Hopewell culture from around 1000 AD, as copper nuggets have been extracted from the region and fashioned into ornaments and weapons in the mounds of Southern Ohio. The Rush–Bagot Treaty, signed in 1818 after the War of 1812, and the later Treaty of Washington eventually led to a complete disarmament of naval vessels in the Great Lakes. Nonetheless, both nations maintained coast guard vessels in the Great Lakes. The brigantine Le Griffon, which was commissioned by René-Robert Cavelier, Sieur de La Salle, was built at Cayuga Creek, near the southern end of the Niagara River, and became the first known sailing ship to travel the upper Great Lakes on August 7, 1679. During settlement, the Great Lakes and its rivers were the only practical means of moving people and freight.
Barges from middle North America were able to reach the Atlantic Ocean from the Great Lakes when the Erie Canal opened in 1825, followed by the Welland Canal in 1829. By 1848, with the opening of the Illinois and Michigan Canal at Chicago, direct access to the Mississippi River was possible from the lakes. With these two canals an all-inland water route was provided between New York City and New Orleans. The main business of many of the passenger lines in the 19th century was transporting immigrants. Many of the larger cities owe their existence to their position on the lakes as a freight destination as well as for being a magnet for immigrants. After railroads and surface roads developed, the freight and passenger businesses dwindled and, except for ferries and a few foreign cruise ships, have now vanished. The immigration routes still have an effect today. Immigrants often formed their own communities, and some areas have a pronounced ethnicity, such as Dutch, German, Polish, Finnish, and many others. Since many immigrants settled for a time in New England before moving westward, many areas on the U.S. side of the Great Lakes also have a New England feel, especially in home styles and accent. Since general freight these days is transported by railroads and trucks, domestic ships mostly move bulk cargoes, such as iron ore, coal and limestone for the steel industry. The domestic bulk freight developed because of the nearby mines. It was more economical to transport the ingredients for steel to centralized plants rather than to make steel on the spot. Grain exports are also a major cargo on the lakes. In the 19th and early 20th centuries, iron and other ores such as copper were shipped south on downbound ships, and supplies, food, and coal were shipped north (upbound).
Because of the location of the coal fields in Pennsylvania and West Virginia, and the general northeast track of the Appalachian Mountains, railroads naturally developed shipping routes that went due north to ports such as Erie, Pennsylvania and Ashtabula, Ohio. Because the lake maritime community largely developed independently, it has some distinctive vocabulary. Ships, no matter the size, are called boats. When the sailing ships gave way to steamships, they were called steamboats—the same term used on the Mississippi. The ships also have a distinctive design; ships that primarily trade on the lakes are known as lakers. Foreign boats are known as salties. Since about 1950, one of the more common sights on the lakes has been the 1,000-by-105-foot (305-by-32-meter) self-unloader, a laker with a conveyor belt system that can unload itself by swinging a crane over the side. Today, the Great Lakes fleet is much smaller in numbers than it once was because of the increased use of overland freight, and a few larger ships replacing many small ones. During World War II, the risk of submarine attacks against coastal training facilities motivated the United States Navy to operate two aircraft carriers on the Great Lakes. Both served as training ships to qualify naval aviators in carrier landing and takeoff. Lake Champlain briefly became the sixth Great Lake of the United States on March 6, 1998, when President Clinton signed Senate Bill 927. This bill, which reauthorized the National Sea Grant Program, contained a line declaring Lake Champlain to be a Great Lake. Not coincidentally, this status allows neighboring states to apply for additional federal research and education funds allocated to these national resources. Following a small uproar, the Senate voted to revoke the designation on March 24 (although New York and Vermont universities would continue to receive funds to monitor and study the lake). Alan B.
McCullough has written that the fishing industry of the Great Lakes got its start "on the American side of Lake Ontario in Chaumont Bay, near the Maumee River on Lake Erie, and on the Detroit River at about the time of the War of 1812". The region was sparsely populated until the 1830s, so there was not much local demand and transporting fish was prohibitively costly; even so, there were economic and infrastructure developments promising for the future of the fishing industry going into the 1830s, particularly the 1825 opening of the Erie Canal and, a few years later, the Welland Canal. The fishing industry expanded particularly in the waters associated with the fur trade that connect Lake Erie and Lake Huron. In fact, two major suppliers of fish in the 1830s were the fur trading companies Hudson's Bay Company and the American Fur Company. The catch from these waters was sent to the growing market for salted fish in Detroit, where merchants involved in the fur trade had already gained some experience handling salted fish. One such merchant was John P. Clark, a shipbuilder and merchant who began selling fish in the area of Manitowoc, Wisconsin, where whitefish was abundant. Another operation cropped up in Georgian Bay, Canadian waters plentiful with trout as well as whitefish. In 1831, Alexander MacGregor from Goderich, Ontario, found whitefish and herring in abundant supply around the Fishing Islands. A contemporary account by Methodist missionary John Evans describes the fish as resembling a "bright cloud moving rapidly through the water". From 1844 through 1857, palace steamers carried passengers and cargo around the Great Lakes. In the first half of the 20th century, large luxurious passenger steamers sailed the lakes in opulence. The Detroit and Cleveland Navigation Company had several vessels at the time and hired workers from all walks of life to help operate these vessels.
Several ferries currently operate on the Great Lakes to carry passengers to various islands. As of 2007, four car ferry services cross the Great Lakes: two on Lake Michigan (a steamer from Ludington, Michigan, to Manitowoc, Wisconsin, and a high-speed catamaran from Milwaukee to Muskegon, Michigan), one on Lake Erie (a boat from Kingsville, Ontario, or Leamington, Ontario, to Pelee Island, Ontario, then on to Sandusky, Ohio), and one on Lake Huron (the M.S. Chi-Cheemaun, which runs between Tobermory and South Baymouth, Manitoulin Island, operated by the Owen Sound Transportation Company). An international ferry across Lake Ontario from Rochester, New York, to Toronto ran during 2004 and 2005 but is no longer in operation. Shipwrecks The large size of the Great Lakes increases the risk of water travel; storms and reefs are common threats. The lakes are prone to sudden and severe storms, particularly in the autumn, from late October until early December. Hundreds of ships have met their end on the lakes. The greatest concentration of shipwrecks lies near Thunder Bay (Michigan), beneath Lake Huron, near the point where eastbound and westbound shipping lanes converge. The Lake Superior shipwreck coast from Grand Marais, Michigan, to Whitefish Point became known as the "Graveyard of the Great Lakes". More vessels have been lost in the Whitefish Point area than in any other part of Lake Superior. The Whitefish Point Underwater Preserve serves as an underwater museum to protect the many shipwrecks in this area. The first ship to sink in Lake Michigan was Le Griffon, also the first ship to sail the Great Lakes. Caught in a 1679 storm while trading furs between Green Bay and Michilimackinac, she was lost with all hands aboard. Its wreck may have been found in 2004, but a wreck subsequently discovered in a different location was also claimed in 2014 to be Le Griffon.
The largest and last major freighter wrecked on the lakes was the SS Edmund Fitzgerald, which sank on November 10, 1975, just offshore from Whitefish Point on Lake Superior.
Germany (of or related to) Germania (historical use) Germans, citizens of Germany, people of German ancestry, or native speakers of the German language For citizens of Germany, see also German nationality law Germanic peoples (Roman times) German language any of the Germanic languages German cuisine, traditional foods of Germany People German (given name) German (surname) Germán, a Spanish name Places German (parish), Isle of Man German, Albania, or Gërmej German, Bulgaria German, Iran German, North Macedonia German, New York, U.S. Agios Germanos, Greece Other uses German (mythology), a South Slavic mythological being Germans (band), a Canadian rock band "German" (song),
The girth of a graph is the length of its shortest cycle; if the graph contains no cycles, its girth is defined to be infinity. For example, a 4-cycle (square) has girth 4. A grid has girth 4 as well, and a triangular mesh has girth 3. A graph with girth four or more is triangle-free. Cages A cubic graph (all vertices have degree three) of girth g that is as small as possible is known as a g-cage (or as a (3,g)-cage). The Petersen graph is the unique 5-cage (it is the smallest cubic graph of girth 5), the Heawood graph is the unique 6-cage, the McGee graph is the unique 7-cage, and the Tutte eight-cage is the unique 8-cage. There may exist multiple cages for a given girth. For instance there are three nonisomorphic 10-cages, each with 70 vertices: the Balaban 10-cage, the Harries graph and the Harries–Wong graph. Girth and graph coloring For any positive integers g and k, there exists a graph with girth at least g and chromatic number at least k; for instance, the Grötzsch graph is triangle-free and has chromatic number 4, and repeating the Mycielskian construction used to form the Grötzsch graph produces triangle-free graphs of arbitrarily large chromatic number. Paul Erdős was the first to prove the general result, using the probabilistic method. More precisely, he showed that a random graph on n vertices, formed by choosing independently whether to include each edge with a suitably chosen small probability, has, with probability tending to 1 as n goes to infinity, at most n/2 cycles of length g or less, but has no independent set of size n/(2k). Therefore, removing one vertex from each short cycle leaves a smaller graph with girth greater than g in which each color class of a coloring must be small and which therefore requires at least k colors in any coloring. Explicit, though large, graphs with high girth and chromatic number can
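The definitions above lend themselves to a short computation. The sketch below (plain Python; the function and the adjacency-list construction are my own, not from any named library) computes girth with the classic breadth-first-search method and checks that the Petersen graph, the unique 5-cage, has girth 5:

```python
from collections import deque

def girth(adj):
    """Length of the shortest cycle in a simple undirected graph.

    adj: dict mapping vertex -> list of neighbours.
    Returns float('inf') for an acyclic graph (a forest).
    Classic O(V*E) method: BFS from every vertex; a non-tree edge between
    u and w closes a cycle of length dist[u] + dist[w] + 1, and the
    minimum over all start vertices is the girth.
    """
    best = float('inf')
    for root in adj:
        dist = {root: 0}
        parent = {root: None}
        q = deque([root])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    parent[w] = u
                    q.append(w)
                elif parent[u] != w:  # non-tree edge: a cycle closes here
                    best = min(best, dist[u] + dist[w] + 1)
    return best

# Petersen graph: outer 5-cycle on 0..4, spokes, inner pentagram on 5..9.
edges = [(i, (i + 1) % 5) for i in range(5)]           # outer cycle
edges += [(i, i + 5) for i in range(5)]                # spokes
edges += [(5 + i, 5 + (i + 2) % 5) for i in range(5)]  # inner pentagram
adj = {v: [] for v in range(10)}
for a, b in edges:
    adj[a].append(b)
    adj[b].append(a)

print(girth(adj))  # prints 5: the Petersen graph has girth 5
```

A triangle returns 3 and a path returns infinity under the same function, matching the definition above.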
gun, you are making the gun more dangerous." Critics also point out that a trigger lock will increase the time it takes an owner to respond to a self-defense emergency. In 2008, the U.S. Supreme Court overturned a Washington, D.C. law that required handguns to be locked or otherwise kept inoperative within the home, saying that this "makes it impossible for citizens to use them for the core lawful purpose of self-defense." Although there are no universal standards for the design or testing of trigger locks, some jurisdictions, such as the state of California, maintain a list of approved trigger lock devices. In Canada, a trigger lock is one of the methods prescribed by law to secure a firearm during transport or storage. Chamber locks Chamber locks aim to block ammunition from being chambered, since most firearms cannot be discharged unless the ammunition is in the correct position. They are used to prevent live ammunition from being loaded into a firearm by blocking the chamber with a dummy cartridge or a chamber plug, which is sometimes wedged into place with the use of a tool, in essence jamming the firearm. Another type is one in which a steel rod locks into a safety cartridge with a key; as long as the rod and safety cartridge are engaged, the dummy round cannot be ejected, nor can live ammunition be loaded into the firearm. Chamber locks work with most firearm types, including revolvers, pistols, rifles and shotguns. They are available in any caliber and length, and may include such features as unique keying, rapid removal, and rigorous testing and certification by major state departments such as the California Department of Justice. Some shooting ranges require the handler to insert a temporary chamber plug, which often has a brightly colored external tag, to signal that the chamber is devoid of ammunition and blocked whenever the firearm is not in use. These are called empty chamber indicators, or chamber flags.
Cable locks Cable locks are a popular type of lock that usually threads into the receiver through the ejection port of repeating firearms. These locks physically obstruct the movements of the bolt, thereby preventing the cycling of the action, and deny the return to "battery" and the closure of the breech. In many designs of pistol and rifle, they also thread through the magazine well of the firearm to prevent the proper insertion of a magazine. Smart gun Personalized firearms, or smart guns, are intended to prevent unauthorized use with built-in locks that are released by RFID chips or other proximity devices, fingerprint recognition, magnetic rings, or a microchip implant. Secondary dangers While a firearm's primary danger lies in the discharge of ammunition, there are other ways a firearm may be detrimental to the health of the handler and bystanders. Noise When a firearm is discharged it emits a very loud noise, typically close to the handler's ears. This can cause temporary or permanent hearing damage such as tinnitus. Hearing protection such as earplugs, or earmuffs, or both, can reduce the risk of hearing damage. Some earmuffs or headphones made for shooting and similar loud situations use active noise control. Firearms may also have silencers which reduce the sound intensity from the barrel. Hot gases and debris A firearm emits hot gases, powder, and other debris when discharged. Some firearms, such as semi-automatic and fully automatic firearms, typically eject spent cartridge casings at high speed. Casings are also dangerously hot when ejected. Revolvers store spent casings in the chamber, but may emit a stream of hot gases and possible fine particulate debris laterally from the interface between the revolving chamber and the barrel. Any of these may hurt the handler or bystanders through burning or impact damage. Because eyes are particularly vulnerable to this type of damage, eye protection should be worn to reduce the risk of injury. 
Prescription lenses and various tints to suit different light conditions are available. Some eye protection products are rated to withstand impact from birdshot loads, which offers protection against irresponsible firearms use by other game bird shooters. Toxins and pollutants In recent years the toxic effects of ammunition and firearm cleaning agents have been highlighted. Lead ammunition left in nature may become mobilized by acid rain. Older ammunition may have mercury-based primers. Lead accumulates in shooting range backstops. Indoor ranges require good ventilation to remove pollutants such as powder, smoke, and lead dust from the air around the shooters. Indoor and outdoor ranges typically require extensive decontamination when they are decommissioned to remove all traces
lead poisoning from bullets, and pollution from other hazardous materials in propellants and cartridges. There were 47,000 unintentional firearm deaths worldwide in 2013. History Accidental explosions of stored gunpowder date to the 13th century in Yangzhou, China. Early handheld muskets using matchlock or wheel lock mechanisms were limited by poor reliability and the risk of accidental discharge, which was improved somewhat by the introduction of the flintlock, though unintentional firing continued to be a serious drawback. Percussion caps, introduced in the 1820s, were more reliable, and by 1830 inventors added security pins to their designs to prevent accidental discharges. Trigger guards and grip safeties were further steps leading to the various safeties built into modern firearms. Malfunctions Storage Proper storage prevents unauthorized use or theft of firearms and ammunition, or damage to them. A gun safe or gun cabinet is commonly used to physically prevent access to a firearm. Local laws may require particular standards for the lock, for the strength and burglar resistance of the cabinet, and may even require weapons and ammunition to be stored separately. Rifle or shotgun safes that are a lighter version of true safes are generally the norm for hunters or multiple-firearm owners. Various safety standards, such as the RSC standard and the CDOJ safety standard in the US, set the minimum requirements for a container to qualify as a firearm safety storage device. Similarly, small handgun safes of various sizes and capacities are preferred for storing small numbers of handguns, although independent researchers and professional hackers have found most of them to be unreliable. The locking mechanism plays an important role in the overall security of a small safe; generally, simplex mechanical locks are found to be the most secure and reliable. For ammunition, some experts recommend storage in secure locations away from firearms.
Ammunition should be kept in cool, dry conditions free from contaminating vapors to prevent deterioration of the propellant and cartridge. Handloaders must take special precautions for storing primers and loose gunpowder. Training, habits and mindset Gun safety training teaches a safety mindset, habits, and rules. The mindset is that firearms are inherently dangerous and must always be stored carefully and handled with care. Handlers are taught to treat firearms with respect for their destructive capabilities, and are strongly discouraged from playing or toying with firearms, a common cause of accidents. The rules of gun safety follow from this mindset. In 1902, the English politician and game shooting enthusiast Mark Hanbury Beaufoy wrote some much-quoted verses on gun safety, meant to instill the safety mindset. Various similar sayings have since been popularized. Jeff Cooper, an influential figure in modern firearms training, formalized and popularized "Four Rules" of safe firearm handling. Prior lists of gun safety rules included as few as three basic safety rules or as many as ten rules, including gun safety and sporting etiquette rules. In addition to Cooper, other influential teachers of gun safety include Massad Ayoob, Clint Smith, Chuck Taylor, Jim Crews, Bob Munden and Ignatius Piazza. The National Rifle Association and other public safety websites provide a similar set of rules. Disassembly Locks There are several types of locks that serve to make it difficult to discharge a firearm. Locks are considered less effective than keeping firearms stored in a lockable safe, since locks are more easily defeated than approved safes: an unauthorized handler can bypass a locked firearm at their leisure. Some manufacturers, such as Taurus, build locks into the firearm itself. California enacted regulations in 2000 requiring locks to be approved by a firearm safety device laboratory via California Penal Code Section 12088.
All locks under this code must pass extensive testing, including saw, pick, pull, and other tests, in order to be approved for the state of California. A lock that meets the requirements is said to be California Department of Justice (CADOJ) approved. Trigger lock There is controversy surrounding the manufacturing standards, usage, and legislation of trigger locks. While supporters of trigger locks argue that they will save children by preventing accidents, critics point to demonstrations that some models can be removed by children
street. Music The song was made famous by Paul Robeson, whose deep voice was said by Robert O'Meally to have assumed "the might and authority of God." On February 7, 1958, the song was recorded in New York City and sung by Louis Armstrong with Sy Oliver's Orchestra. It was recorded by Doris Akers and the Sky Pilot Choir. The song has since become a jazz standard, having been recorded by Grant Green, Fats Waller, Archie Shepp, Hampton Hawes and many others. It is one of the five spirituals included in the oratorio A Child of Our Time, first performed in 1944, by the English classical composer Michael Tippett (1905–98). It is included in some seders in the United States, and is printed in Meyer Levin's An Israel Haggadah for Passover. The song was recorded by Deep River Boys in Oslo on September 26, 1960. It was released on the extended play Negro Spirituals No. 3 (HMV 7EGN 39). The song, or a modified version of it, has been used in the Roger Jones musical From Pharaoh to Freedom. The French singer Claude Nougaro used its melody for his tribute to Louis Armstrong in French, under the name Armstrong (1965). "Go Down Moses" has sometimes been called "Let My People Go" and has been performed by a variety of musical artists, including RebbeSoul. The song heavily influences "Get Down Moses" by Joe Strummer & the Mescaleros on their album Streetcore (2003). The song has been performed by the Russian Interior Ministry (MVD) Choir. Jazz singer Tony Vittia released a swing version under the name "Own The Night" (2013). The phrase "Go Down Moses" is featured in the chorus of the John Craigie song "Will Not Fight" (2009). The phrase "Go Down Moses" is sung by Pops Staples with the Staple Singers in the song "The Weight" in The Last Waltz film by The Band (1976). The usual lyric is actually "Go down Miss Moses".
Avant-garde singer-songwriter and composer Diamanda Galás recorded a version for her fifth album, You Must Be Certain of the Devil (1988), the final part of a trilogy about the AIDS epidemic that features songs influenced by American gospel music and biblical themes, and later in Plague Mass (1991) and The Singer (1992). Composer Nathaniel Dett used the text and melody of "Go Down Moses" throughout his oratorio, "The Ordering of Moses" (1937). In the first section, Dett sets the melody with added-note harmonies, quartal chords, modal harmonies, and chromaticism (especially French augmented sixth chords). Later in the oratorio, "Go Down Moses" is set as a fugue. Television The NBC television comedy The Fresh Prince of Bel-Air twice used the song for comedic effect. In the first instance, Will Smith's character sings the song after he and his cousin Carlton Banks are thrown into prison (Smith sings the first two lines, Banks sullenly provides the refrain, then a prisoner sings the final four lines in an operatic voice). In the second instance, Banks is preparing for an Easter service and attempts to show off his prowess by singing the last two lines of the chorus; Smith replies with his own version, in which he makes a joke about Carlton's height ("...Let my cousin grow!"). In Dr. Katz, Professional Therapist, the song is sung by Katz and Ben during the end credits of the episode "Thanksgiving" (Season 5, Episode 18). Della Reese sings it in Episode 424, "Elijah", of Touched by an Angel, in which Bruce Davison sings "Eliyahu". In series 2, episode 3 of Life on Mars, the lawyer sings it for his client's release. Recordings The Tuskegee Institute Singers recorded the song for Victor in 1914. The Kelly Family recorded the song twice: a live version is included on their album Live (1988) and a studio version on New World (1990). The latter also features on their compilation album The Very Best - Over 10 Years (1993). The Golden Gate Quartet (Duration: 3:05; recorded in
people who reported on the song presumed it was composed by them. It became the first spiritual known to have been recorded in sheet music, by Reverend Lewis Lockwood. While visiting Fortress Monroe in 1861, he heard runaway slaves singing this song, transcribed what he heard, and eventually published it in the National Anti-Slavery Standard. Sheet music was soon after published, titled "Oh! Let My People Go: The Song of the Contrabands", and arranged by Horace Waters. L.C. Lockwood, chaplain of the Contrabands, stated in the sheet music that the song was from Virginia, dating from about 1853. However, the song was not included in Slave Songs of the United States, despite being a very prominent spiritual among slaves. Furthermore, the original version of the song sung by slaves almost certainly sounded very different from what Lockwood transcribed by ear, especially after an arrangement by a person who had never heard the song as it was originally sung. The opening verse, as recorded by Lockwood, is: Sarah Bradford's authorized biography of Harriet Tubman, Scenes in the Life of Harriet Tubman (1869), quotes Tubman as saying she used "Go Down Moses" as one of two code songs fugitive slaves used to communicate when fleeing Maryland. Tubman began her underground railroad work in 1850 and continued until the beginning of the Civil War, so it is possible that Tubman's use of the song predates the origin claimed by Lockwood. Some even hypothesize that she herself may have written the spiritual, while others claim that Nat Turner, who led one of the most well-known slave revolts in history, either wrote the song or was its inspiration. In popular culture Films Al Jolson sings it in Alan Crosland's film Big Boy (1930). Used briefly in Kid Millions (1934). Jess Lee Brooks sings it in Preston Sturges' film Sullivan's Travels (1941). Gregory Miller (played by Sidney Poitier) sang the song in the film Blackboard Jungle (1955).
A reference is made to the song in the film Ferris Bueller's Day Off (1986), when a bedridden Cameron Frye sings, "When Cameron was in Egypt's land, let my Cameron go". Sergei Bodrov Jr. and Oleg Menshikov, who play the two main characters in Sergei Bodrov's film Кавказский пленник (1996; Prisoner of the Mountains), dance to the Louis Armstrong version. The teen comedy film Easy A (2010) remixed this song with a fast guitar and beats. The song was originally published as Original Soundtrack and is listed in IMDb. Literature William Faulkner titled his 1942 short-story collection Go Down, Moses after the song. Djuna Barnes, in her 1936 novel Nightwood, titled a chapter "Go Down, Matthew" as an allusion to the song's title. In Margaret Mitchell's 1936 novel Gone with the Wind, slaves from the Georgia plantation Tara are in Atlanta to dig breastworks for the soldiers, and they sing "Go Down, Moses" as they march down a street.
classical mechanics to general relativity General relativity can be understood by examining its similarities with and departures from classical physics. The first step is the realization that classical mechanics and Newton's law of gravity admit a geometric description. The combination of this description with the laws of special relativity results in a heuristic derivation of general relativity. Geometry of Newtonian gravity At the base of classical mechanics is the notion that a body's motion can be described as a combination of free (or inertial) motion, and deviations from this free motion. Such deviations are caused by external forces acting on a body in accordance with Newton's second law of motion, which states that the net force acting on a body is equal to that body's (inertial) mass multiplied by its acceleration. The preferred inertial motions are related to the geometry of space and time: in the standard reference frames of classical mechanics, objects in free motion move along straight lines at constant speed. In modern parlance, their paths are geodesics, straight world lines in curved spacetime. Conversely, one might expect that inertial motions, once identified by observing the actual motions of bodies and making allowances for the external forces (such as electromagnetism or friction), can be used to define the geometry of space, as well as a time coordinate. However, there is an ambiguity once gravity comes into play. According to Newton's law of gravity, and independently verified by experiments such as that of Eötvös and its successors (see Eötvös experiment), there is a universality of free fall (also known as the weak equivalence principle, or the universal equality of inertial and passive-gravitational mass): the trajectory of a test body in free fall depends only on its position and initial speed, but not on any of its material properties. 
A simplified version of this is embodied in Einstein's elevator experiment, illustrated in the figure on the right: for an observer in an enclosed room, it is impossible to decide, by mapping the trajectory of bodies such as a dropped ball, whether the room is stationary in a gravitational field and the ball accelerating, or in free space aboard a rocket that is accelerating at a rate equal to that of the gravitational field versus the ball which upon release has nil acceleration. Given the universality of free fall, there is no observable distinction between inertial motion and motion under the influence of the gravitational force. This suggests the definition of a new class of inertial motion, namely that of objects in free fall under the influence of gravity. This new class of preferred motions, too, defines a geometry of space and time—in mathematical terms, it is the geodesic motion associated with a specific connection which depends on the gradient of the gravitational potential. Space, in this construction, still has the ordinary Euclidean geometry. However, spacetime as a whole is more complicated. As can be shown using simple thought experiments following the free-fall trajectories of different test particles, the result of transporting spacetime vectors that can denote a particle's velocity (time-like vectors) will vary with the particle's trajectory; mathematically speaking, the Newtonian connection is not integrable. From this, one can deduce that spacetime is curved. The resulting Newton–Cartan theory is a geometric formulation of Newtonian gravity using only covariant concepts, i.e. a description which is valid in any desired coordinate system. In this geometric description, tidal effects—the relative acceleration of bodies in free fall—are related to the derivative of the connection, showing how the modified geometry is caused by the presence of mass. 
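The tidal effect just described, the relative acceleration of nearby bodies in free fall, can be made quantitative with a short Newtonian estimate. The sketch below is illustrative only: the Earth values and the 2 m separation are assumptions, not from the text.

```python
# Newtonian estimate of the tidal (relative) acceleration between two test
# particles in free fall, separated radially by d at distance r from a mass M.
# All numerical values are illustrative assumptions (Earth), not from the text.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24    # Earth's mass, kg
r = 6.371e6     # Earth's mean radius, m
d = 2.0         # separation between the particles, m

a_lower = G * M / (r - d / 2)**2     # acceleration of the particle closer to M
a_upper = G * M / (r + d / 2)**2     # acceleration of the particle farther from M
tidal_exact = a_lower - a_upper      # the particles pull apart at this relative rate
tidal_approx = 2 * G * M * d / r**3  # leading-order (gradient-of-the-field) estimate

print(tidal_exact, tidal_approx)     # ~ 6e-6 m/s^2 for these values
```

The leading-order formula is just the radial derivative of the field strength times the separation, which is how tidal effects end up tied to the derivative of the connection in the geometric picture.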
Relativistic generalization As intriguing as geometric Newtonian gravity may be, its basis, classical mechanics, is merely a limiting case of (special) relativistic mechanics. In the language of symmetry: where gravity can be neglected, physics is Lorentz invariant as in special relativity rather than Galilei invariant as in classical mechanics. (The defining symmetry of special relativity is the Poincaré group, which includes translations, rotations and boosts.) The differences between the two become significant when dealing with speeds approaching the speed of light, and with high-energy phenomena. With Lorentz symmetry, additional structures come into play. They are defined by the set of light cones (see image). The light-cones define a causal structure: for each event A, there is a set of events that can, in principle, either influence or be influenced by A via signals or interactions that do not need to travel faster than light (such as event B in the image), and a set of events for which such an influence is impossible (such as event C in the image). These sets are observer-independent. In conjunction with the world-lines of freely falling particles, the light-cones can be used to reconstruct the spacetime's semi-Riemannian metric, at least up to a positive scalar factor. In mathematical terms, this defines a conformal structure or conformal geometry. Special relativity is defined in the absence of gravity. For practical applications, it is a suitable model whenever gravity can be neglected. Bringing gravity into play, and assuming the universality of free fall motion, reasoning analogous to that in the previous section applies: there are no global inertial frames. Instead there are approximate inertial frames moving alongside freely falling particles.
Translated into the language of spacetime: the straight time-like lines that define a gravity-free inertial frame are deformed to lines that are curved relative to each other, suggesting that the inclusion of gravity necessitates a change in spacetime geometry. A priori, it is not clear whether the new local frames in free fall coincide with the reference frames in which the laws of special relativity hold—that theory is based on the propagation of light, and thus on electromagnetism, which could have a different set of preferred frames. But using different assumptions about the special-relativistic frames (such as their being earth-fixed, or in free fall), one can derive different predictions for the gravitational redshift, that is, the way in which the frequency of light shifts as the light propagates through a gravitational field (cf. below). The actual measurements show that free-falling frames are the ones in which light propagates as it does in special relativity. The generalization of this statement, namely that the laws of special relativity hold to good approximation in freely falling (and non-rotating) reference frames, is known as the Einstein equivalence principle, a crucial guiding principle for generalizing special-relativistic physics to include gravity. The same experimental data shows that time as measured by clocks in a gravitational field—proper time, to give the technical term—does not follow the rules of special relativity. In the language of spacetime geometry, it is not measured by the Minkowski metric. As in the Newtonian case, this is suggestive of a more general geometry. At small scales, all reference frames that are in free fall are equivalent, and approximately Minkowskian. Consequently, we are now dealing with a curved generalization of Minkowski space. 
The metric tensor that defines the geometry—in particular, how lengths and angles are measured—is not the Minkowski metric of special relativity, it is a generalization known as a semi- or pseudo-Riemannian metric. Furthermore, each Riemannian metric is naturally associated with one particular kind of connection, the Levi-Civita connection, and this is, in fact, the connection that satisfies the equivalence principle and makes space locally Minkowskian (that is, in suitable locally inertial coordinates, the metric is Minkowskian, and its first partial derivatives and the connection coefficients vanish). Einstein's equations Having formulated the relativistic, geometric version of the effects of gravity, the question of gravity's source remains. In Newtonian gravity, the source is mass. In special relativity, mass turns out to be part of a more general quantity called the energy–momentum tensor, which includes both energy and momentum densities as well as stress: pressure and shear. Using the equivalence principle, this tensor is readily generalized to curved spacetime. Drawing further upon the analogy with geometric Newtonian gravity, it is natural to assume that the field equation for gravity relates this tensor and the Ricci tensor, which describes a particular class of tidal effects: the change in volume for a small cloud of test particles that are initially at rest, and then fall freely. In special relativity, conservation of energy–momentum corresponds to the statement that the energy–momentum tensor is divergence-free. This formula, too, is readily generalized to curved spacetime by replacing partial derivatives with their curved-manifold counterparts, covariant derivatives studied in differential geometry. 
With this additional condition—the covariant divergence of the energy–momentum tensor, and hence of whatever is on the other side of the equation, is zero—the simplest set of equations are what are called Einstein's (field) equations:

G_{\mu\nu} \equiv R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8\pi G}{c^4} T_{\mu\nu}

On the left-hand side is the Einstein tensor, G_{\mu\nu}, which is symmetric and a specific divergence-free combination of the Ricci tensor R_{\mu\nu} and the metric. In particular, R = g^{\mu\nu} R_{\mu\nu} is the curvature scalar. The Ricci tensor itself is related to the more general Riemann curvature tensor as

R_{\mu\nu} = R^{\alpha}{}_{\mu\alpha\nu}

On the right-hand side, T_{\mu\nu} is the energy–momentum tensor. All tensors are written in abstract index notation. Matching the theory's prediction to observational results for planetary orbits or, equivalently, assuring that the weak-gravity, low-speed limit is Newtonian mechanics, the proportionality constant is found to be \kappa = 8\pi G/c^4, where G is the gravitational constant and c the speed of light in vacuum. When there is no matter present, so that the energy–momentum tensor vanishes, the results are the vacuum Einstein equations, R_{\mu\nu} = 0. In general relativity, the world line of a particle free from all external, non-gravitational force is a particular type of geodesic in curved spacetime. In other words, a freely moving or falling particle always moves along a geodesic. The geodesic equation is:

\frac{d^2 x^\mu}{ds^2} + \Gamma^\mu_{\alpha\beta} \frac{dx^\alpha}{ds} \frac{dx^\beta}{ds} = 0

where s is a scalar parameter of motion (e.g. the proper time), and \Gamma^\mu_{\alpha\beta} are Christoffel symbols (sometimes called the affine connection coefficients or Levi-Civita connection coefficients), which are symmetric in the two lower indices. Greek indices may take the values 0, 1, 2, 3, and the summation convention is used for repeated indices \alpha and \beta. The quantity on the left-hand side of this equation is the acceleration of a particle, and so this equation is analogous to Newton's laws of motion, which likewise provide formulae for the acceleration of a particle. This equation of motion employs the Einstein notation, meaning that repeated indices are summed (i.e. from zero to three).
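As a small sanity check on the geodesic equation, one can verify a classic result: for a circular equatorial orbit in the Schwarzschild geometry, the radial component of the geodesic equation reproduces Kepler's third law exactly in these coordinates. The sketch assumes the known closed-form Schwarzschild Christoffel symbols Gamma^r_tt = (GM/r^2)(1 - r_s/r) and Gamma^r_phiphi = -(r - r_s), which are not derived in the text.

```python
import math

# For a circular orbit (r'' = 0) in the equatorial plane, the radial geodesic
# equation reduces to
#   Gamma^r_tt * (dt/dtau)^2 + Gamma^r_phiphi * (dphi/dtau)^2 = 0,
# so the coordinate angular velocity obeys (dphi/dt)^2 = Gamma^r_tt / -Gamma^r_phiphi.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s
M = 1.989e30    # one solar mass, kg (illustrative)
r = 1.496e11    # orbital radius, 1 au in m (illustrative)

r_s = 2 * G * M / c**2                       # Schwarzschild radius
gamma_r_tt = (G * M / r**2) * (1 - r_s / r)  # assumed closed-form Christoffel symbol
gamma_r_phiphi = -(r - r_s)                  # assumed, for theta = pi/2

omega = math.sqrt(gamma_r_tt / -gamma_r_phiphi)

# The (1 - r_s/r) factors cancel, leaving omega^2 = GM/r^3: Kepler's third law
# holds exactly in Schwarzschild coordinates for circular orbits.
print(omega, math.sqrt(G * M / r**3))
```

For these values the orbital period 2π/ω comes out close to one year, as expected for a one-solar-mass central body at 1 au.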
The Christoffel symbols are functions of the four spacetime coordinates, and so are independent of the velocity or acceleration or other characteristics of a test particle whose motion is described by the geodesic equation. Total force in general relativity In general relativity, the effective gravitational potential energy of an object of mass m rotating around a massive central body M is given by

U(r) = -\frac{G M m}{r} + \frac{L^2}{2 m r^2} - \frac{G M L^2}{m c^2 r^3}

A conservative total force can then be obtained as its negative radial derivative,

F(r) = -\frac{G M m}{r^2} + \frac{L^2}{m r^3} - \frac{3 G M L^2}{m c^2 r^4}

where L is the angular momentum. The first term represents Newton's force of gravity, which is described by the inverse-square law. The second term represents the centrifugal force in the circular motion. The third term represents the relativistic effect. Alternatives to general relativity There are alternatives to general relativity built upon the same premises, which include additional rules and/or constraints, leading to different field equations. Examples are Whitehead's theory, Brans–Dicke theory, teleparallelism, f(R) gravity and Einstein–Cartan theory. Definition and basic applications The derivation outlined in the previous section contains all the information needed to define general relativity, describe its key properties, and address a question of crucial importance in physics, namely how the theory can be used for model-building. Definition and basic properties General relativity is a metric theory of gravitation. At its core are Einstein's equations, which describe the relation between the geometry of a four-dimensional pseudo-Riemannian manifold representing spacetime, and the energy–momentum contained in that spacetime. Phenomena that in classical mechanics are ascribed to the action of the force of gravity (such as free-fall, orbital motion, and spacecraft trajectories), correspond to inertial motion within a curved geometry of spacetime in general relativity; there is no gravitational force deflecting objects from their natural, straight paths.
Instead, gravity corresponds to changes in the properties of space and time, which in turn changes the straightest-possible paths that objects will naturally follow. The curvature is, in turn, caused by the energy–momentum of matter. Paraphrasing the relativist John Archibald Wheeler, spacetime tells matter how to move; matter tells spacetime how to curve. While general relativity replaces the scalar gravitational potential of classical physics by a symmetric rank-two tensor, the latter reduces to the former in certain limiting cases. For weak gravitational fields and slow speed relative to the speed of light, the theory's predictions converge on those of Newton's law of universal gravitation. As it is constructed using tensors, general relativity exhibits general covariance: its laws—and further laws formulated within the general relativistic framework—take on the same form in all coordinate systems. Furthermore, the theory does not contain any invariant geometric background structures, i.e. it is background independent. It thus satisfies a more stringent general principle of relativity, namely that the laws of physics are the same for all observers. Locally, as expressed in the equivalence principle, spacetime is Minkowskian, and the laws of physics exhibit local Lorentz invariance. Model-building The core concept of general-relativistic model-building is that of a solution of Einstein's equations. Given both Einstein's equations and suitable equations for the properties of matter, such a solution consists of a specific semi-Riemannian manifold (usually defined by giving the metric in specific coordinates), and specific matter fields defined on that manifold. Matter and geometry must satisfy Einstein's equations, so in particular, the matter's energy–momentum tensor must be divergence-free. The matter must, of course, also satisfy whatever additional equations were imposed on its properties. 
In short, such a solution is a model universe that satisfies the laws of general relativity, and possibly additional laws governing whatever matter might be present. Einstein's equations are nonlinear partial differential equations and, as such, difficult to solve exactly. Nevertheless, a number of exact solutions are known, although only a few have direct physical applications. The best-known exact solutions, and also those most interesting from a physics point of view, are the Schwarzschild solution, the Reissner–Nordström solution and the Kerr metric, each corresponding to a certain type of black hole in an otherwise empty universe, and the Friedmann–Lemaître–Robertson–Walker and de Sitter universes, each describing an expanding cosmos. Exact solutions of great theoretical interest include the Gödel universe (which opens up the intriguing possibility of time travel in curved spacetimes), the Taub-NUT solution (a model universe that is homogeneous, but anisotropic), and anti-de Sitter space (which has recently come to prominence in the context of what is called the Maldacena conjecture). Given the difficulty of finding exact solutions, Einstein's field equations are also solved frequently by numerical integration on a computer, or by considering small perturbations of exact solutions. In the field of numerical relativity, powerful computers are employed to simulate the geometry of spacetime and to solve Einstein's equations for interesting situations such as two colliding black holes. In principle, such methods may be applied to any system, given sufficient computer resources, and may address fundamental questions such as naked singularities. Approximate solutions may also be found by perturbation theories such as linearized gravity and its generalization, the post-Newtonian expansion, both of which were developed by Einstein. 
The latter provides a systematic approach to solving for the geometry of a spacetime that contains a distribution of matter that moves slowly compared with the speed of light. The expansion involves a series of terms; the first terms represent Newtonian gravity, whereas the later terms represent ever smaller corrections to Newton's theory due to general relativity. An extension of this expansion is the parametrized post-Newtonian (PPN) formalism, which allows quantitative comparisons between the predictions of general relativity and alternative theories. Consequences of Einstein's theory General relativity has a number of physical consequences. Some follow directly from the theory's axioms, whereas others have become clear only in the course of many years of research that followed Einstein's initial publication. Gravitational time dilation and frequency shift Assuming that the equivalence principle holds, gravity influences the passage of time. Light sent down into a gravity well is blueshifted, whereas light sent in the opposite direction (i.e., climbing out of the gravity well) is redshifted; collectively, these two effects are known as the gravitational frequency shift. More generally, processes close to a massive body run more slowly when compared with processes taking place farther away; this effect is known as gravitational time dilation. Gravitational redshift has been measured in the laboratory and using astronomical observations. Gravitational time dilation in the Earth's gravitational field has been measured numerous times using atomic clocks, while ongoing validation is provided as a side effect of the operation of the Global Positioning System (GPS). Tests in stronger gravitational fields are provided by the observation of binary pulsars. All results are in agreement with general relativity. 
However, at the current level of accuracy, these observations cannot distinguish between general relativity and other theories in which the equivalence principle is valid. Light deflection and gravitational time delay General relativity predicts that the path of light will follow the curvature of spacetime as it passes near a star. This effect was initially confirmed by observing the light of stars or distant quasars being deflected as it passes the Sun. This and related predictions follow from the fact that light follows what is called a light-like or null geodesic—a generalization of the straight lines along which light travels in classical physics. Such geodesics are the generalization of the invariance of lightspeed in special relativity. As one examines suitable model spacetimes (either the exterior Schwarzschild solution or, for more than a single mass, the post-Newtonian expansion), several effects of gravity on light propagation emerge. Although the bending of light can also be derived by extending the universality of free fall to light, the angle of deflection resulting from such calculations is only half the value given by general relativity. Closely related to light deflection is the gravitational time delay (or Shapiro delay), the phenomenon that light signals take longer to move through a gravitational field than they would in the absence of that field. There have been numerous successful tests of this prediction. In the parameterized post-Newtonian formalism (PPN), measurements of both the deflection of light and the gravitational time delay determine a parameter called γ, which encodes the influence of gravity on the geometry of space. Gravitational waves Predicted in 1916 by Albert Einstein, there are gravitational waves: ripples in the metric of spacetime that propagate at the speed of light. These are one of several analogies between weak-field gravity and electromagnetism, in that they are analogous to electromagnetic waves.
On February 11, 2016, the Advanced LIGO team announced that they had directly detected gravitational waves from a pair of black holes merging. The simplest type of such a wave can be visualized by its action on a ring of freely floating particles. A sine wave propagating through such a ring towards the reader distorts the ring in a characteristic, rhythmic fashion (animated image to the right). Since Einstein's equations are non-linear, arbitrarily strong gravitational waves do not obey linear superposition, making their description difficult. However, linear approximations of gravitational waves are sufficiently accurate to describe the exceedingly weak waves that are expected to arrive here on Earth from far-off cosmic events, which typically result in relative distances increasing and decreasing by 10^{-21} or less. Data analysis methods routinely make use of the fact that these linearized waves can be Fourier decomposed. Some exact solutions describe gravitational waves without any approximation, e.g., a wave train traveling through empty space or Gowdy universes, varieties of an expanding cosmos filled with gravitational waves. But for gravitational waves produced in astrophysically relevant situations, such as the merger of two black holes, numerical methods are presently the only way to construct appropriate models. Orbital effects and the relativity of direction General relativity differs from classical mechanics in a number of predictions concerning orbiting bodies. It predicts an overall rotation (precession) of planetary orbits, as
follow directly from the theory's axioms, whereas others have become clear only in the course of many years of research that followed Einstein's initial publication. Gravitational time dilation and frequency shift Assuming that the equivalence principle holds, gravity influences the passage of time. Light sent down into a gravity well is blueshifted, whereas light sent in the opposite direction (i.e., climbing out of the gravity well) is redshifted; collectively, these two effects are known as the gravitational frequency shift. More generally, processes close to a massive body run more slowly when compared with processes taking place farther away; this effect is known as gravitational time dilation. Gravitational redshift has been measured in the laboratory and using astronomical observations. Gravitational time dilation in the Earth's gravitational field has been measured numerous times using atomic clocks, while ongoing validation is provided as a side effect of the operation of the Global Positioning System (GPS). Tests in stronger gravitational fields are provided by the observation of binary pulsars. All results are in agreement with general relativity. However, at the current level of accuracy, these observations cannot distinguish between general relativity and other theories in which the equivalence principle is valid. Light deflection and gravitational time delay General relativity predicts that the path of light will follow the curvature of spacetime as it passes near a star. This effect was initially confirmed by observing the light of stars or distant quasars being deflected as it passes the Sun. This and related predictions follow from the fact that light follows what is called a light-like or null geodesic—a generalization of the straight lines along which light travels in classical physics. Such geodesics are the generalization of the invariance of lightspeed in special relativity. 
As one examines suitable model spacetimes (either the exterior Schwarzschild solution or, for more than a single mass, the post-Newtonian expansion), several effects of gravity on light propagation emerge. Although the bending of light can also be derived by extending the universality of free fall to light, the angle of deflection resulting from such calculations is only half the value given by general relativity. Closely related to light deflection is the gravitational time delay (or Shapiro delay), the phenomenon that light signals take longer to move through a gravitational field than they would in the absence of that field. There have been numerous successful tests of this prediction. In the parameterized post-Newtonian formalism (PPN), measurements of both the deflection of light and the gravitational time delay determine a parameter called γ, which encodes the influence of gravity on the geometry of space. Gravitational waves Predicted in 1916 by Albert Einstein, there are gravitational waves: ripples in the metric of spacetime that propagate at the speed of light. These are one of several analogies between weak-field gravity and electromagnetism in that, they are analogous to electromagnetic waves. On February 11, 2016, the Advanced LIGO team announced that they had directly detected gravitational waves from a pair of black holes merging. The simplest type of such a wave can be visualized by its action on a ring of freely floating particles. A sine wave propagating through such a ring towards the reader distorts the ring in a characteristic, rhythmic fashion (animated image to the right). Since Einstein's equations are non-linear, arbitrarily strong gravitational waves do not obey linear superposition, making their description difficult. 
However, linear approximations of gravitational waves are sufficiently accurate to describe the exceedingly weak waves that are expected to arrive here on Earth from far-off cosmic events, which typically result in relative distances increasing and decreasing by 10^−21 or less. Data analysis methods routinely make use of the fact that these linearized waves can be Fourier decomposed. Some exact solutions describe gravitational waves without any approximation, e.g., a wave train traveling through empty space or Gowdy universes, varieties of an expanding cosmos filled with gravitational waves. But for gravitational waves produced in astrophysically relevant situations, such as the merger of two black holes, numerical methods are presently the only way to construct appropriate models. Orbital effects and the relativity of direction General relativity differs from classical mechanics in a number of predictions concerning orbiting bodies. It predicts an overall rotation (precession) of planetary orbits, as well as orbital decay caused by the emission of gravitational waves and effects related to the relativity of direction. Precession of apsides In general relativity, the apsides of any orbit (the point of the orbiting body's closest approach to the system's center of mass) will precess; the orbit is not an ellipse, but akin to an ellipse that rotates on its focus, resulting in a rose curve-like shape (see image). Einstein first derived this result by using an approximate metric representing the Newtonian limit and treating the orbiting body as a test particle. For him, the fact that his theory gave a straightforward explanation of Mercury's anomalous perihelion shift, discovered earlier by Urbain Le Verrier in 1859, was important evidence that he had at last identified the correct form of the gravitational field equations.
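The ring-of-particles picture of a linearized wave, described earlier, can be sketched by applying a plus-polarized strain to a ring of test particles. The amplitude h used here is hugely exaggerated for visibility; real astrophysical strains at Earth are of order 10^−21:

```python
import math

# First-order effect of a plus-polarized gravitational wave on a ring of
# free test particles: dx = +h/2 * x * cos(phase), dy = -h/2 * y * cos(phase).
# The strain h = 0.1 is a hypothetical, exaggerated value for illustration.

def ring_positions(n, radius, h, phase):
    """Positions of n ring particles distorted by a plus-polarized wave."""
    pts = []
    for k in range(n):
        a = 2 * math.pi * k / n
        x, y = radius * math.cos(a), radius * math.sin(a)
        s = 0.5 * h * math.cos(phase)       # instantaneous strain factor
        pts.append((x * (1 + s), y * (1 - s)))
    return pts

# At phase 0 the ring is stretched along x and squeezed along y;
# half a cycle later the distortion reverses.
stretched = ring_positions(8, 1.0, 0.1, 0.0)
reversed_ = ring_positions(8, 1.0, 0.1, math.pi)
print(stretched[0])   # particle on the +x axis, moved outward
```

Animating `phase` through a full cycle reproduces the rhythmic distortion described in the text.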
The effect can also be derived by using either the exact Schwarzschild metric (describing spacetime around a spherical mass) or the much more general post-Newtonian formalism. It is due to the influence of gravity on the geometry of space and to the contribution of self-energy to a body's gravity (encoded in the nonlinearity of Einstein's equations). Relativistic precession has been observed for all planets that allow for accurate precession measurements (Mercury, Venus, and Earth), as well as in binary pulsar systems, where it is larger by five orders of magnitude. In general relativity the perihelion shift σ, expressed in radians per revolution, is approximately given by

σ = 24π³a² / (T²c²(1 − e²))

where:
a is the semi-major axis
T is the orbital period
c is the speed of light in vacuum
e is the orbital eccentricity

Orbital decay According to general relativity, a binary system will emit gravitational waves, thereby losing energy. Due to this loss, the distance between the two orbiting bodies decreases, and so does their orbital period. Within the Solar System or for ordinary double stars, the effect is too small to be observable. This is not the case for a close binary pulsar, a system of two orbiting neutron stars, one of which is a pulsar: from the pulsar, observers on Earth receive a regular series of radio pulses that can serve as a highly accurate clock, which allows precise measurements of the orbital period. Because neutron stars are immensely compact, significant amounts of energy are emitted in the form of gravitational radiation. The first observation of a decrease in orbital period due to the emission of gravitational waves was made by Hulse and Taylor, using the binary pulsar PSR1913+16 they had discovered in 1974. This was the first detection of gravitational waves, albeit indirect, for which they were awarded the 1993 Nobel Prize in Physics.
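Plugging Mercury's orbital elements into the perihelion-shift formula above reproduces the famous anomalous precession (the orbital elements are standard reference values, assumed here rather than taken from the text):

```python
import math

# Relativistic perihelion shift per revolution:
# sigma = 24 pi^3 a^2 / (T^2 c^2 (1 - e^2)), in radians per revolution.

def perihelion_shift(a, T, e, c=2.998e8):
    return 24 * math.pi**3 * a**2 / (T**2 * c**2 * (1 - e**2))

# Standard values for Mercury (semi-major axis, period, eccentricity).
a_mercury = 5.791e10    # m
T_mercury = 7.6005e6    # s (~87.97 days)
e_mercury = 0.2056

sigma = perihelion_shift(a_mercury, T_mercury, e_mercury)

# Convert to the traditional arcseconds per Julian century.
revs_per_century = 36525 * 86400 / T_mercury
arcsec = sigma * revs_per_century * (180 / math.pi) * 3600
print(f"{arcsec:.1f} arcsec/century")   # the classic figure is ~43
```

The result, about 43 arcseconds per century, is exactly the residual Le Verrier could not explain with Newtonian perturbations.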
Since then, several other binary pulsars have been found, in particular the double pulsar PSR J0737-3039, where both stars are pulsars and which was last reported to also be in agreement with general relativity in 2021 after 16 years of observations. Geodetic precession and frame-dragging Several relativistic effects are directly related to the relativity of direction. One is geodetic precession: the axis direction of a gyroscope in free fall in curved spacetime will change when compared, for instance, with the direction of light received from distant stars—even though such a gyroscope represents the way of keeping a direction as stable as possible ("parallel transport"). For the Moon–Earth system, this effect has been measured with the help of lunar laser ranging. More recently, it has been measured for test masses aboard the satellite Gravity Probe B to a precision of better than 0.3%. Near a rotating mass, there are gravitomagnetic or frame-dragging effects. A distant observer will determine that objects close to the mass get "dragged around". This is most extreme for rotating black holes where, for any object entering a zone known as the ergosphere, rotation is inevitable. Such effects can again be tested through their influence on the orientation of gyroscopes in free fall. Somewhat controversial tests have been performed using the LAGEOS satellites, confirming the relativistic prediction. The Mars Global Surveyor probe orbiting Mars has also been used. Interpretations Neo-Lorentzian Interpretation Examples of prominent physicists who support neo-Lorentzian explanations of general relativity are Franco Selleri and Antony Valentini. Astrophysical applications Gravitational lensing The deflection of light by gravity is responsible for a new class of astronomical phenomena. If a massive object is situated between the astronomer and a distant target object with appropriate mass and relative distances, the astronomer will see multiple distorted images of the target.
Such effects are known as gravitational lensing. Depending on the configuration, scale, and mass distribution, there can be two or more images, a bright ring known as an Einstein ring, or partial rings called arcs. The earliest example was discovered in 1979; since then, more than a hundred gravitational lenses have been observed. Even if the multiple images are too close to each other to be resolved, the effect can still be measured, e.g., as an overall brightening of the target object; a number of such "microlensing events" have been observed. Gravitational lensing has developed into a tool of observational astronomy. It is used to detect the presence and distribution of dark matter, provide a "natural telescope" for observing distant galaxies, and to obtain an independent estimate of the Hubble constant. Statistical evaluations of lensing data provide valuable insight into the structural evolution of galaxies. Gravitational-wave astronomy Observations of binary pulsars provide strong indirect evidence for the existence of gravitational waves (see Orbital decay, above). Detection of these waves is a major goal of current relativity-related research. Several land-based gravitational wave detectors are currently in operation, most notably the interferometric detectors GEO 600, LIGO (two detectors), TAMA 300 and VIRGO. Various pulsar timing arrays are using millisecond pulsars to detect gravitational waves in the 10^−9 to 10^−6 Hz frequency range, which originate from binary supermassive black holes. A European space-based detector, eLISA / NGO, is currently under development, with a precursor mission (LISA Pathfinder) having launched in December 2015. Observations of gravitational waves promise to complement observations in the electromagnetic spectrum.
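The characteristic angular scale of the lensing effects described above is the Einstein radius. As a sketch, for a point-mass lens it is θ_E = sqrt((4GM/c²) · D_LS/(D_L·D_S)); the lens mass and distances below are hypothetical round numbers, and D_LS is taken as D_S − D_L, which ignores cosmological subtleties:

```python
import math

# Einstein ring angular radius for a point-mass lens:
# theta_E = sqrt( (4 G M / c^2) * D_LS / (D_L * D_S) )

G = 6.674e-11
c = 2.998e8
M_sun = 1.989e30
kpc = 3.086e19      # metres per kiloparsec

def einstein_radius(M, D_L, D_S):
    """Angular Einstein radius in radians. Assumes D_LS = D_S - D_L,
    valid for nearby (non-cosmological) distances."""
    D_LS = D_S - D_L
    return math.sqrt(4 * G * M / c**2 * D_LS / (D_L * D_S))

# A solar-mass microlens halfway to a source 8 kpc away (invented scenario):
theta = einstein_radius(M_sun, 4 * kpc, 8 * kpc)
milliarcsec = theta * (180 / math.pi) * 3600 * 1000
print(f"{milliarcsec:.2f} mas")
```

The resulting milliarcsecond scale explains why stellar microlensing images cannot be resolved and only the overall brightening is observed.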
They are expected to yield information about black holes and other dense objects such as neutron stars and white dwarfs, about certain kinds of supernova implosions, and about processes in the very early universe, including the signature of certain types of hypothetical cosmic string. In February 2016, the Advanced LIGO team announced that they had detected gravitational waves from a black hole merger. Black holes and other compact objects Whenever the ratio of an object's mass to its radius becomes sufficiently large, general relativity predicts the formation of a black hole, a region of space from which nothing, not even light, can escape. In the currently accepted models of stellar evolution, neutron stars of around 1.4 solar masses, and stellar black holes with a few to a few dozen solar masses, are thought to be the final state for the evolution of massive stars. Usually a galaxy has one supermassive black hole with a few million to a few billion solar masses in its center, and its presence is thought to have played an important role in the formation of the galaxy and larger cosmic structures. Astronomically, the most important property of compact objects is that they provide a supremely efficient mechanism for converting gravitational energy into electromagnetic radiation. Accretion, the falling of dust or gaseous matter onto stellar or supermassive black holes, is thought to be responsible for some spectacularly luminous astronomical objects, notably diverse kinds of active galactic nuclei on galactic scales and stellar-size objects such as microquasars. In particular, accretion can lead to relativistic jets, focused beams of highly energetic particles that are being flung into space at almost light speed. General relativity plays a central role in modelling all these phenomena, and observations provide strong evidence for the existence of black holes with the properties predicted by the theory. 
Black holes are also sought-after targets in the search for gravitational waves (cf. Gravitational waves, above). Merging black hole binaries should lead to some of the strongest gravitational wave signals reaching detectors here on Earth, and the phase directly before the merger ("chirp") could be used as a "standard candle" to deduce the distance to the merger events–and hence serve as a probe of cosmic expansion at large distances. The gravitational waves produced as a stellar black hole plunges into a supermassive one should provide direct information about the supermassive black hole's geometry. Cosmology The current models of cosmology are based on Einstein's field equations, which include the cosmological constant Λ since it has important influence on the large-scale dynamics of the cosmos,

R_{μν} − ½R g_{μν} + Λ g_{μν} = (8πG/c⁴) T_{μν}

where g_{μν} is the spacetime metric. Isotropic and homogeneous solutions of these enhanced equations, the Friedmann–Lemaître–Robertson–Walker solutions, allow physicists to model a universe that has evolved over the past 14 billion years from a hot, early Big Bang phase. Once a small number of parameters (for example the universe's mean matter density) have been fixed by astronomical observation, further observational data can be used to put the models to the test. Predictions, all successful, include the initial abundance of chemical elements formed in a period of primordial nucleosynthesis, the large-scale structure of the universe, and the existence and properties of a "thermal echo" from the early cosmos, the cosmic background radiation. Astronomical observations of the cosmological expansion rate allow the total amount of matter in the universe to be estimated, although the nature of that matter remains mysterious in part. About 90% of all matter appears to be dark matter, which has mass (or, equivalently, gravitational influence), but does not interact electromagnetically and, hence, cannot be observed directly.
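As a sketch of how the FLRW models are actually used, the age of a spatially flat universe follows from integrating the Friedmann equation H(a) = H₀·sqrt(Ω_m/a³ + Ω_Λ). The density parameters below are typical published values, assumed here for illustration rather than taken from the text:

```python
import math

# Age of a flat Lambda-CDM universe:
# H(a) = H0 * sqrt(Omega_m / a^3 + Omega_L), age = integral_0^1 da / (a H(a)).
# Parameter values are typical published figures, assumed for illustration.

H0_km_s_Mpc = 67.7
omega_m, omega_l = 0.31, 0.69

Mpc = 3.086e22                       # metres
H0 = H0_km_s_Mpc * 1000 / Mpc        # Hubble constant in s^-1

def age_of_universe(n=50000):
    """Midpoint-rule integration of da / (a * H(a)) over a in (0, 1]."""
    total = 0.0
    for i in range(n):
        a = (i + 0.5) / n            # midpoint of each slice of the scale factor
        H = H0 * math.sqrt(omega_m / a**3 + omega_l)
        total += 1.0 / (a * H) * (1.0 / n)
    return total                     # seconds

gyr = age_of_universe() / 3.156e16   # ~3.156e16 seconds per gigayear
print(f"age ~ {gyr:.2f} Gyr")
```

With these parameters the integral gives roughly 13.8 billion years, consistent with the "past 14 billion years" quoted above.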
There is no generally accepted description of this new kind of matter, within the framework of known particle physics or otherwise. Observational evidence from redshift surveys of distant supernovae and measurements of the cosmic background radiation also show that the evolution of our universe is significantly influenced by a cosmological constant resulting in an acceleration of cosmic expansion or, equivalently, by a form of energy with an unusual equation of state, known as dark energy, the nature of which remains unclear. An inflationary phase, an additional phase of strongly accelerated expansion at cosmic times of around 10^−33 seconds, was hypothesized in 1980 to account for several puzzling observations that were unexplained by classical cosmological models, such as the nearly perfect homogeneity of the cosmic background radiation. Recent measurements of the cosmic background radiation have resulted in the first evidence for this scenario. However, there is a bewildering variety of possible inflationary scenarios, which cannot be restricted by current observations. An even larger question is the physics of the earliest universe, prior to the inflationary phase and close to where the classical models predict the big bang singularity. An authoritative answer would require a complete theory of quantum gravity, which has not yet been developed (cf. the section on quantum gravity, below). Exotic solutions: Time travel, Warp drives Kurt Gödel showed that solutions to Einstein's equations exist that contain closed timelike curves (CTCs), which allow for loops in time. The solutions require extreme physical conditions unlikely ever to occur in practice, and it remains an open question whether further laws of physics will eliminate them completely. Since then, other—similarly impractical—GR solutions containing CTCs have been found, such as the Tipler cylinder and traversable wormholes.
Stephen Hawking introduced the chronology protection conjecture, an assumption beyond those of standard general relativity, to prevent time travel. Some exact solutions in general relativity, such as the Alcubierre drive, present examples of a warp drive, but these solutions require an exotic matter distribution and generally suffer from semiclassical instability. Advanced concepts Asymptotic symmetries The spacetime symmetry group for special relativity is the Poincaré group, which is a ten-dimensional group of three Lorentz boosts, three rotations, and four spacetime translations. It is logical to ask what symmetries, if any, might apply in general relativity. A tractable case might be to consider the symmetries of spacetime as seen by observers located far away from all sources of the gravitational field. The naive expectation for asymptotically flat spacetime symmetries might be simply to extend and reproduce the symmetries of flat spacetime of special relativity, viz., the Poincaré group. In 1962 Hermann Bondi, M. G. van der Burg, A. W. Metzner and Rainer K. Sachs addressed this asymptotic symmetry problem in order to investigate the flow of energy at infinity due to propagating gravitational waves. Their first step was to decide on some physically
The terms "genealogy" and "family history" are often used synonymously, but some organizations draw a slight distinction between them. The Society of Genealogists, while also using the terms interchangeably, describes genealogy as the "establishment of a pedigree by extracting evidence, from valid sources, of how one generation is connected to the next" and family history as "a biographical study of a genealogically proven family and of the community and country in which they lived". Motivation Individuals conduct genealogical research for a number of reasons. Personal or medical interest Private individuals research genealogy out of curiosity about their heritage. This curiosity can be particularly strong among those whose family histories were lost or unknown due to, for example, adoption or separation from family through divorce, death, or other situations. In addition to simply wanting to know more about who they are and where they came from, individuals may research their genealogy to learn about any hereditary diseases in their family history. There is a growing interest in family history in the media as a result of advertising and television shows sponsored by large genealogy companies, such as Ancestry.com. This, coupled with easier access to online records and the affordability of DNA tests, has both inspired curiosity and allowed those who are curious to easily start investigating their ancestry. Community or religious obligation In communitarian societies, one's identity is defined as much by one's kin network as by individual achievement, and the question "Who are you?" would be answered by a description of father, mother, and tribe. New Zealand Māori, for example, learn whakapapa (genealogies) to discover who they are. Family history plays a part in the practice of some religious belief systems.
For example, The Church of Jesus Christ of Latter-day Saints (LDS Church) has a doctrine of baptism for the dead, which necessitates that members of that faith engage in family history research. In East Asian countries that were historically shaped by Confucianism, many people follow a practice of ancestor worship as well as genealogical record-keeping. Ancestors' names are inscribed on tablets and placed in shrines, where rituals are performed. Genealogies are also recorded in genealogy books. This practice is rooted in the belief that respect for one's family is a foundation for a healthy society. Establishing identity Royal families, both historically and in modern times, keep records of their genealogies in order to establish their right to rule and determine who will be the next sovereign. For centuries in various cultures, one's genealogy has been a source of political and social status. Some countries and indigenous tribes allow individuals to obtain citizenship based on their genealogy. In Ireland and in Greece, for example, an individual can become a citizen if one of their grandparents was born in that country, regardless of their own or their parents' birthplace. In societies such as Australia or the United States, by the 20th century, there was growing pride in the pioneers and nation-builders. Establishing descent from these was, and is, important to lineage societies, such as the Daughters of the American Revolution and The General Society of Mayflower Descendants. Modern family history explores new sources of status, such as celebrating the resilience of families that survived generations of poverty or slavery, or the success of families in integrating across racial or national boundaries. Some family histories even emphasize links to celebrity criminals, such as the bushranger Ned Kelly in Australia. Legal and forensic research Lawyers involved in probate cases do genealogy to locate heirs of property. 
Detectives may perform genealogical research using DNA evidence to identify victims of homicides or perpetrators of crimes. Scholarly research Historians and geneticists may carry out genealogical research to gain a greater understanding of specific topics in their respective fields, and some may employ professional genealogists in connection with specific aspects of their research. They also publish their research in peer-reviewed journals. The introduction of postgraduate courses in genealogy in recent years has given genealogy more of an academic focus, with the emergence of peer-reviewed journals in this area. Scholarly genealogy is beginning to emerge as a discipline in its own right, with an increasing number of individuals who have obtained genealogical qualifications carrying out research on a diverse range of topics related to genealogy, both within academic institutions and independently. History Historically, in Western societies, the focus of genealogy was on the kinship and descent of rulers and nobles, often arguing or demonstrating the legitimacy of claims to wealth and power. The term often overlapped with heraldry, in which the ancestry of royalty was reflected in their coats of arms. Modern scholars consider many claimed noble ancestries to be fabrications, such as the Anglo-Saxon Chronicle that traced the ancestry of several English kings to the god Woden. Some family trees have been maintained for considerable periods. The family tree of Confucius has been maintained for over 2,500 years and is listed in the Guinness Book of World Records as the largest extant family tree. The fifth edition of the Confucius Genealogy was printed in 2009 by the Confucius Genealogy Compilation Committee (CGCC). Modern times In modern times, genealogy has become more widespread, with commoners as well as nobility researching and maintaining their family trees. 
Genealogy received a boost in the late 1970s with the television broadcast of Roots: The Saga of an American Family by Alex Haley. His account of his family's descent from the African tribesman Kunta Kinte inspired many others to study their own lines. With the advent of the Internet, the number of resources readily accessible to genealogists has vastly increased, resulting in an explosion of interest in the topic. Genealogy is one of the most popular topics on the Internet. The Internet has become a major source not only of data for genealogists but also of education and communication. India Some notable places where traditional genealogy records are kept include Hindu genealogy registers at Haridwar (Uttarakhand), Varanasi and Allahabad (Uttar Pradesh), Kurukshetra (Haryana), Trimbakeshwar (Maharashtra), and Chintpurni (Himachal Pradesh). United States Genealogical research in the United States was first systematized in the early 19th century, especially by John Farmer (1789–1838). Before Farmer's efforts, tracing one's genealogy was seen as an attempt by the American colonists to secure a measure of social standing, an aim that was counter to the new republic's egalitarian, future-oriented ideals (as outlined in the Constitution). As Fourth of July celebrations commemorating the Founding Fathers and the heroes of the Revolutionary War became increasingly popular, however, the pursuit of "antiquarianism," which focused on local history, became acceptable as a way to honor the achievements of early Americans. Farmer capitalized on the acceptability of antiquarianism to frame genealogy within the early republic's ideological framework of pride in one's American ancestors. He corresponded with other antiquarians in New England, where antiquarianism and genealogy were well established, and became a coordinator, booster, and contributor to the growing movement. 
In the 1820s, he and fellow antiquarians began to produce genealogical and antiquarian tracts in earnest, slowly gaining a devoted audience among the American people. Though Farmer died in 1838, his efforts led to the creation of the New England Historic Genealogical Society (NEHGS), one of New England's oldest and most prominent organizations dedicated to the preservation of public records. NEHGS publishes the New England Historical and Genealogical Register. The Genealogical Society of Utah, founded in 1894, later became the Family History Department of The Church of Jesus Christ of Latter-day Saints. The department's research facility, the Family History Library, which Utah.com states is "the largest genealogical library in the world," was established to assist in tracing family lineages for special religious ceremonies which Latter-day Saints believe will seal family units together for eternity. Latter-day Saints believe that this fulfilled a biblical prophecy stating that the prophet Elijah would return to "turn the heart of the fathers to the children, and the heart of the children to their fathers." There is a network of church-operated Family History Centers all over the country and around the world, where volunteers assist the public with tracing their ancestors. ("Family History Centers," The Church of Jesus Christ of Latter-day Saints: Newsroom, accessed 2 Jul 2019.) Brigham Young University offers bachelor's degree, minor, and concentration programs in Family History and is the only school in North America to offer this. The American Society of Genealogists is the scholarly honorary society of the U.S. genealogical field. Founded by John Insley Coddington, Arthur Adams, and Meredith B. Colket, Jr., in December 1940, its membership is limited to 50 living fellows. ASG has semi-annually published The Genealogist, a scholarly journal of genealogical research, since 1980.
Fellows of the American Society of Genealogists, who bear the post-nominal acronym FASG, have written some of the most notable genealogical materials of the last half-century. Some of the most notable scholarly American genealogical journals are The American Genealogist, National Genealogical Society Quarterly, The New England Historical and Genealogical Register, The New York Genealogical and Biographical Record, and The Genealogist. (David L. Greene, "Scholarly Genealogical Journals in America," The American Genealogist 61 (1985–86): 116–20.) Research process Genealogical research is a complex process that uses historical records and sometimes genetic analysis to demonstrate kinship. Reliable conclusions are based on the quality of sources (ideally, original records), the information within those sources (ideally, primary or firsthand information), and the evidence that can be drawn (directly or indirectly) from that information. In many instances, genealogists must skillfully assemble indirect or circumstantial evidence to build a case for identity and kinship. All evidence and conclusions, together with the documentation that supports them, are then assembled to create a cohesive genealogy or family history. Genealogists begin their research by collecting family documents and stories. This creates a foundation for documentary research, which involves examining and evaluating historical records for evidence about ancestors and other relatives, their kinship ties, and the events that occurred in their lives. As a rule, genealogists begin with the present and work backwards in time. Historical, social, and family context is essential to achieving correct identification of individuals and relationships. Source citation is also important when conducting genealogical research. To keep track of collected material, family group sheets and pedigree charts are used. Formerly handwritten, these can now be generated by genealogical software.
Genetic analysis Because a person's DNA contains information that has been passed down relatively unchanged from early ancestors, analysis of DNA is sometimes used for genealogical research. Three DNA types are of particular interest. Mitochondrial DNA (mtDNA) is contained in the mitochondria of the egg cell and is passed down from a mother to all of her children, both male and female; however, only females pass it on to their children. Y-DNA is present only in males and is passed down from a father to his sons (direct male line) with only minor mutations occurring over time. Autosomal DNA (atDNA) is found in the 22 non-sex chromosomes (autosomes) and is inherited from both parents; thus, it can uncover relatives from any branch of the family. A genealogical DNA test allows two individuals to find the probability that they are, or are not, related within an estimated number of generations. Individual genetic test results are collected in databases to match people descended from a relatively recent common ancestor. See, for example, the Molecular Genealogy Research Project. Some tests are limited to either the patrilineal or the matrilineal line. Collaboration Most genealogy software programs can export information about persons and their relationships in a standardized format called a GEDCOM. In that format, it can be shared with other genealogists, added to databases, or converted into family web sites. Social networking service (SNS) websites allow genealogists to share data and build their family trees online. Members can upload their family trees and contact other family historians to fill in gaps in their research. In addition to SNS websites, there are other resources that encourage genealogists to connect and share information, such as rootsweb.ancestry.com and rsl.rootsweb.ancestry.com. Volunteerism Volunteer efforts figure prominently in genealogy. These range from the extremely informal to the highly organized.
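The GEDCOM format mentioned above is line-oriented: each line carries a level number, an optional cross-reference ID (such as @I1@), a tag, and a value. A minimal parsing sketch follows; the individual records in the sample are invented, and real GEDCOM files have many more tag types than shown here:

```python
# Minimal sketch of reading GEDCOM's "<level> [@xref@] <TAG> [value]" lines.
# The sample data is invented for illustration.

gedcom = """0 HEAD
1 GEDC
2 VERS 5.5.1
0 @I1@ INDI
1 NAME Jane /Doe/
1 BIRT
2 DATE 12 MAR 1890
0 @I2@ INDI
1 NAME John /Doe/
0 TRLR
"""

def parse_gedcom(text):
    """Parse GEDCOM lines into (level, xref, tag, value) tuples."""
    records = []
    for line in text.splitlines():
        parts = line.split(" ", 2)
        level = int(parts[0])
        if parts[1].startswith("@"):       # cross-reference id, e.g. @I1@
            xref, tag = parts[1], parts[2]
            value = ""
        else:
            xref, tag = None, parts[1]
            value = parts[2] if len(parts) > 2 else ""
        records.append((level, xref, tag, value))
    return records

rows = parse_gedcom(gedcom)
# Surnames are conventionally wrapped in slashes inside NAME values.
names = [v for (_, _, t, v) in rows if t == "NAME"]
print(names)
```

Because the format is this simple, GEDCOM files produced by one genealogy program can be re-imported by nearly any other.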
On the informal side are the many popular and useful message boards such as Rootschat and mailing lists on particular surnames, regions, and other topics. These forums can be used to try to find relatives, request record lookups, obtain research advice, and much more. Many genealogists participate in loosely organized projects, both online and off. These collaborations take numerous forms. Some projects prepare name indexes for records, such as probate cases, and publish the indexes, either online or off. These indexes can be used as finding aids to locate original records. Other projects transcribe or abstract records. Offering record lookups for particular geographic areas is another common service. Volunteers do record lookups or take photos in their home areas for researchers who are unable to travel. Those looking for a structured volunteer environment can join one of thousands of genealogical societies worldwide. Most societies have a unique area of focus, such as a particular surname, ethnicity, geographic area, or descendancy from participants in a given historical event. Genealogical societies are almost exclusively staffed by volunteers and may offer a broad range of services, including maintaining libraries for members' use, publishing newsletters, providing research assistance to the public, offering classes or seminars, and organizing record preservation or transcription projects. Software Genealogy software is used to collect, store, sort, and display genealogical data. At a minimum, genealogy software accommodates basic information about individuals, including births, marriages, and deaths. Many programs allow for additional biographical information, including occupation, residence, and notes, and most also offer a method for keeping track of the sources for each piece of evidence. 
Most programs can generate basic kinship charts and reports, allow for the import of digital photographs and the export of data in the GEDCOM format (short for GEnealogical Data COMmunication) so that data can be shared with those using other genealogy software. More advanced features include the ability to restrict the information that is shared, usually by removing information about living people out of privacy concerns; the import of sound files; the generation of family history books, web pages and other publications; the ability to handle same-sex marriages and children born out of wedlock; searching the Internet for data; and the provision of research guidance. Programs may be geared toward a specific religion, with fields relevant to that religion, or to specific nationalities or ethnic groups, with source types relevant for those groups. Online resources involve complex programming and large databases, such as censuses. Records and documentation Genealogists use a wide variety of records in their research. To effectively conduct genealogical research, it is important to understand how the records were created, what information is included in them, and how and where to access them. (David Hey, The Oxford Companion to Family and Local History, 2nd ed. 2008.) List of record types Records that are used in genealogy research include:
Vital records (birth, death, and marriage and divorce records)
Adoption records
Biographies and biographical profiles (e.g. Who's Who)
Cemetery lists
Census records
Church and religious records (baptism or christening, brit milah or baby naming certificates, confirmation, bar or bat mitzvah, marriage, funeral or death, membership)
City directories and telephone directories
Coroner's reports
Court records (criminal and civil)
Diaries, personal letters and family Bibles
DNA tests
Emigration, immigration and naturalization records
Hereditary and lineage organization records, e.g. Daughters of the American Revolution records
Land and property records, deeds
Medical records
Military and conscription records
Newspaper articles
Obituaries
Occupational records
Oral histories
Passports
Photographs
Poorhouse, workhouse, almshouse, and asylum records
School and alumni association records
Ship passenger lists
Social Security (within the US) and pension records
Tax records
Tombstones, cemetery records, and funeral home records
Voter registration records
Wills and probate records
To
by large genealogy companies, such as Ancestry.com. This, coupled with easier access to online records and the affordability of DNA tests, has both inspired curiosity and allowed those who are curious to easily start investigating their ancestry. Community or religious obligation In communitarian societies, one's identity is defined as much by one's kin network as by individual achievement, and the question "Who are you?" would be answered by a description of father, mother, and tribe. New Zealand Māori, for example, learn whakapapa (genealogies) to discover who they are. Family history plays a part in the practice of some religious belief systems. For example, The Church of Jesus Christ of Latter-day Saints (LDS Church) has a doctrine of baptism for the dead, which necessitates that members of that faith engage in family history research. In East Asian countries that were historically shaped by Confucianism, many people follow a practice of ancestor worship as well as genealogical record-keeping. Ancestors' names are inscribed on tablets and placed in shrines, where rituals are performed. Genealogies are also recorded in genealogy books. This practice is rooted in the belief that respect for one's family is a foundation for a healthy society. Establishing identity Royal families, both historically and in modern times, keep records of their genealogies in order to establish their right to rule and determine who will be the next sovereign. For centuries in various cultures, one's genealogy has been a source of political and social status. Some countries and indigenous tribes allow individuals to obtain citizenship based on their genealogy. In Ireland and in Greece, for example, an individual can become a citizen if one of their grandparents was born in that country, regardless of their own or their parents' birthplace. In societies such as Australia or the United States, by the 20th century, there was growing pride in the pioneers and nation-builders. 
Establishing descent from these was, and is, important to lineage societies, such as the Daughters of the American Revolution and The General Society of Mayflower Descendants. Modern family history explores new sources of status, such as celebrating the resilience of families that survived generations of poverty or slavery, or the success of families in integrating across racial or national boundaries. Some family histories even emphasize links to celebrity criminals, such as the bushranger Ned Kelly in Australia. Legal and forensic research Lawyers involved in probate cases do genealogy to locate heirs of property. Detectives may perform genealogical research using DNA evidence to identify victims of homicides or perpetrators of crimes. Scholarly research Historians and geneticists may carry out genealogical research to gain a greater understanding of specific topics in their respective fields, and some may employ professional genealogists in connection with specific aspects of their research. They also publish their research in peer-reviewed journals. The introduction of postgraduate courses in genealogy in recent years has given genealogy more of an academic focus, with the emergence of peer-reviewed journals in this area. Scholarly genealogy is beginning to emerge as a discipline in its own right, with an increasing number of individuals who have obtained genealogical qualifications carrying out research on a diverse range of topics related to genealogy, both within academic institutions and independently. History Historically, in Western societies, the focus of genealogy was on the kinship and descent of rulers and nobles, often arguing or demonstrating the legitimacy of claims to wealth and power. The term often overlapped with heraldry, in which the ancestry of royalty was reflected in their coats of arms. 
Modern scholars consider many claimed noble ancestries to be fabrications, such as the Anglo-Saxon Chronicle that traced the ancestry of several English kings to the god Woden. Some family trees have been maintained for considerable periods. The family tree of Confucius has been maintained for over 2,500 years and is listed in the Guinness Book of World Records as the largest extant family tree. The fifth edition of the Confucius Genealogy was printed in 2009 by the Confucius Genealogy Compilation Committee (CGCC). Modern times In modern times, genealogy has become more widespread, with commoners as well as nobility researching and maintaining their family trees. Genealogy received a boost in the late 1970s with the television broadcast of Roots: The Saga of an American Family by Alex Haley. His account of his family's descent from the African tribesman Kunta Kinte inspired many others to study their own lines. With the advent of the Internet, the number of resources readily accessible to genealogists has vastly increased, resulting in an explosion of interest in the topic. Genealogy is one of the most popular topics on the Internet. The Internet has become a major source not only of data for genealogists but also of education and communication. India Some notable places where traditional genealogy records are kept include Hindu genealogy registers at Haridwar (Uttarakhand), Varanasi and Allahabad (Uttar Pradesh), Kurukshetra (Haryana), Trimbakeshwar (Maharashtra), and Chintpurni (Himachal Pradesh). United States Genealogical research in the United States was first systematized in the early 19th century, especially by John Farmer (1789–1838). Before Farmer's efforts, tracing one's genealogy was seen as an attempt by the American colonists to secure a measure of social standing, an aim that was counter to the new republic's egalitarian, future-oriented ideals (as outlined in the Constitution). 
As Fourth of July celebrations commemorating the Founding Fathers and the heroes of the Revolutionary War became increasingly popular, however, the pursuit of "antiquarianism," which focused on local history, became acceptable as a way to honor the achievements of early Americans. Farmer capitalized on the acceptability of antiquarianism to frame genealogy within the early republic's ideological framework of pride in one's American ancestors. He corresponded with other antiquarians in New England, where antiquarianism and genealogy were well established, and became a coordinator, booster, and contributor to the growing movement. In the 1820s, he and fellow antiquarians began to produce genealogical and antiquarian tracts in earnest, slowly gaining a devoted audience among the American people. Though Farmer died in 1838, his efforts led to the creation of the New England Historic Genealogical Society (NEHGS), one of New England's oldest and most prominent organizations dedicated to the preservation of public records. NEHGS publishes the New England Historical and Genealogical Register. The Genealogical Society of Utah, founded in 1894, later became the Family History Department of The Church of Jesus Christ of Latter-day Saints. The department's research facility, the Family History Library, which Utah.com states is "the largest genealogical library in the world," was established to assist in tracing family lineages for special religious ceremonies which Latter-day Saints believe will seal family units together for eternity. Latter-day Saints believe that this fulfilled a biblical prophecy stating that the prophet Elijah would return to "turn the heart of the fathers to the children, and the heart of the children to their fathers."
There is a network of church-operated Family History Centers all over the country and around the world, where volunteers assist the public with tracing their ancestors. ("Family History Centers," The Church of Jesus Christ of Latter-day Saints: Newsroom, accessed 2 Jul 2019.) Brigham Young University offers bachelor's degree, minor, and concentration programs in Family History and is the only school in North America to offer this. The American Society of Genealogists is the scholarly honorary society of the U.S. genealogical field. Founded by John Insley Coddington, Arthur Adams, and Meredith B. Colket, Jr., in December 1940, its membership is limited to 50 living fellows. ASG has semi-annually published The Genealogist, a scholarly journal of genealogical research, since 1980. Fellows of the American Society of Genealogists, who bear the post-nominal acronym FASG, have written some of the most notable genealogical materials of the last half-century. Some of the most notable scholarly American genealogical journals are The American Genealogist, National Genealogical Society Quarterly, The New England Historical and Genealogical Register, The New York Genealogical and Biographical Record, and The Genealogist. (David L. Greene, "Scholarly Genealogical Journals in America," The American Genealogist 61 (1985-86): 116-20.) Research process Genealogical research is a complex process that uses historical records and sometimes genetic analysis to demonstrate kinship. Reliable conclusions are based on the quality of sources (ideally, original records), the information within those sources (ideally, primary or firsthand information), and the evidence that can be drawn (directly or indirectly) from that information. In many instances, genealogists must skillfully assemble indirect or circumstantial evidence to build a case for identity and kinship.
All evidence and conclusions, together with the documentation that supports them, are then assembled to create a cohesive genealogy or family history. Genealogists begin their research by collecting family documents and stories. This creates a foundation for documentary research, which involves examining and evaluating historical records for evidence about ancestors and other relatives, their kinship ties, and the events that occurred in their lives. As a rule, genealogists begin with the present and work backwards in time. Historical, social, and family context is essential to achieving correct identification of individuals and relationships. Source citation is also important when conducting genealogical research. To keep track of collected material, family group sheets and pedigree charts are used. Formerly handwritten, these can now be generated by genealogical software. Genetic analysis Because a person's DNA contains information that has been passed down relatively unchanged from early ancestors, analysis of DNA is sometimes used for genealogical research. Three DNA types are of particular interest. Mitochondrial DNA (mtDNA) is contained in the mitochondria of the egg cell and is passed down from a mother to all of her children, both male and female; however, only females pass it on to their children. Y-DNA is present only in males and is passed down from a father to his sons (direct male line) with only minor mutations occurring over time. Autosomal DNA (atDNA) is found in the 22 non-sex chromosomes (autosomes) and is inherited from both parents; thus, it can uncover relatives from any branch of the family. A genealogical DNA test allows two individuals to find the probability that they are, or are not, related within an estimated number of generations. Individual genetic test results are collected in databases to match people descended from a relatively recent common ancestor. See, for example, the Molecular Genealogy Research Project.
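The inheritance arithmetic behind such probability estimates can be sketched in a few lines. The function below is illustrative only (its name and parameters are not taken from any particular testing service), and real tests work with measured shared-segment lengths rather than these idealized expectations:

```python
# Expected fraction of autosomal DNA shared between two relatives.
# `shared_ancestors` is the number of most recent common ancestors
# joining the two people (1 or 2); `meioses` is the total number of
# parent-child links along the connecting path. Each meiosis halves
# the expected shared fraction, so the expectation is a * (1/2)^m.
def expected_autosomal_share(shared_ancestors: int, meioses: int) -> float:
    """Idealized expected shared autosomal fraction."""
    return shared_ancestors * (0.5 ** meioses)

# Parent and child: one connecting ancestor, one meiosis -> 0.5
print(expected_autosomal_share(1, 1))   # 0.5
# Full siblings: two shared parents, two meioses -> 0.5
print(expected_autosomal_share(2, 2))   # 0.5
# First cousins: two shared grandparents, four meioses -> 0.125
print(expected_autosomal_share(2, 4))   # 0.125
```

This is why autosomal matching loses power beyond a handful of generations: the expected shared fraction shrinks geometrically, and distant cousins often share no detectable segments at all.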
Some tests are limited to either the patrilineal or the matrilineal line. Collaboration Most genealogy software programs can export information about persons and their relationships in a standardized format called a GEDCOM. In that format, it can be shared with other genealogists, added to databases, or converted into family web sites. Social networking service (SNS) websites allow genealogists to share data and build their family trees online. Members can upload their family trees and contact other family historians to fill in gaps in their research. In addition to the (SNS) websites, there are other resources that encourage genealogists to connect and share information, such as rootsweb.ancestry.com and rsl.rootsweb.ancestry.com. Volunteerism Volunteer efforts figure prominently in genealogy. These range from the extremely informal to the highly organized. On the informal side are the many popular and useful message boards such as Rootschat and mailing lists on particular surnames, regions, and other topics. These forums can be used to try to find relatives, request record lookups, obtain research advice, and much more. Many genealogists participate in loosely organized projects, both online and off. These collaborations take numerous forms. Some projects prepare name indexes for records, such as probate cases, and publish the indexes, either online or off. These indexes can be used as finding aids to locate original records. Other projects transcribe or abstract records. Offering record lookups for particular geographic areas is another common service. Volunteers do record lookups or take photos in their home areas for researchers who are unable to travel. Those looking for a structured volunteer environment can join one of thousands of genealogical societies worldwide. Most societies have a unique area of focus, such as a particular surname, ethnicity, geographic area, or descendancy from participants in a given historical event. 
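The GEDCOM format mentioned above is a plain-text, line-based format: each line carries a level number, an optional cross-reference identifier, a tag, and a value. A minimal sketch of reading that structure (the sample record is invented, though the tags shown, such as INDI, FAM, and NAME, are standard GEDCOM 5.5.1 tags):

```python
# A tiny hand-written GEDCOM fragment: a header, one individual,
# one family record, and a trailer.
SAMPLE = """\
0 HEAD
1 GEDC
2 VERS 5.5.1
0 @I1@ INDI
1 NAME John /Smith/
1 BIRT
2 DATE 1 JAN 1850
0 @F1@ FAM
1 HUSB @I1@
0 TRLR
"""

def parse_gedcom(text):
    """Split each GEDCOM line into (level, xref_or_None, tag, value)."""
    records = []
    for line in text.splitlines():
        parts = line.split(" ", 2)
        level = int(parts[0])
        if len(parts) > 1 and parts[1].startswith("@"):
            # Record line: "0 @I1@ INDI" -- the xref comes before the tag.
            xref = parts[1]
            rest = parts[2] if len(parts) > 2 else ""
            tag, _, value = rest.partition(" ")
        else:
            xref = None
            tag = parts[1] if len(parts) > 1 else ""
            value = parts[2] if len(parts) > 2 else ""
        records.append((level, xref, tag, value))
    return records

for rec in parse_gedcom(SAMPLE):
    print(rec)
```

Because the grammar is this simple, GEDCOM files produced by one program can be reloaded by another, which is what makes the format useful for sharing trees and building the web sites and databases described above.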
Genealogical societies are almost exclusively staffed by volunteers and may offer a broad range of services, including maintaining libraries for members' use, publishing newsletters, providing research assistance to the public, offering classes or seminars, and organizing record preservation or transcription projects. Software Genealogy software is used to collect, store, sort, and display genealogical data. At a minimum, genealogy software accommodates basic information about individuals, including births, marriages, and deaths. Many programs allow for additional biographical information, including occupation, residence, and notes, and most also offer a method for keeping track of the sources for each piece of evidence. Most programs can generate basic kinship charts and reports, allow for the import of digital photographs and the export of data in the GEDCOM format (short for GEnealogical Data COMmunication) so that data can be shared with those using other genealogy software. More advanced features include the ability to restrict the information that is shared, usually by removing information about living people out of privacy concerns; the import of sound files; the generation of family history books, web pages and other publications; the ability to handle same-sex marriages and children born out of wedlock; searching the Internet for data; and the provision of research guidance. Programs may be geared toward a specific religion, with fields relevant to that religion, or to specific nationalities or ethnic groups, with source types relevant for those groups. Online resources involve complex programming and large data bases, such as censuses. Records and documentation Genealogists use a wide variety of records in their research. 
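Kinship charts like those generated by genealogy software are commonly numbered with the ahnentafel convention, a standard genealogical scheme in which the root person is 1, the father of person n is 2n, and the mother is 2n + 1. A minimal sketch using hypothetical sample data:

```python
# Hypothetical sample data: ahnentafel number -> name.
# 1 is the root person, 2n is the father of n, 2n + 1 is the mother.
ancestors = {
    1: "Ada Example",
    2: "Bert Example",   # father of 1
    3: "Cora Sample",    # mother of 1
    4: "Dan Example",    # father of 2
    5: "Eve Model",      # mother of 2
}

def pedigree_lines(ancestors, n=1, depth=0):
    """Yield one indented line per known ancestor, depth-first."""
    if n not in ancestors:
        return
    yield "  " * depth + f"{n}. {ancestors[n]}"
    yield from pedigree_lines(ancestors, 2 * n, depth + 1)      # father
    yield from pedigree_lines(ancestors, 2 * n + 1, depth + 1)  # mother

print("\n".join(pedigree_lines(ancestors)))
```

The appeal of the numbering is that a person's sex and generation are recoverable from the number alone (even numbers are fathers, odd numbers above 1 are mothers), so gaps in the record simply leave numbers unassigned.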
To effectively conduct genealogical research, it is important to understand how the records were created, what information is included in them, and how and where to access them. (David Hey, The Oxford Companion to Family and Local History, 2nd ed. 2008.)

List of record types

Records that are used in genealogy research include:
- Vital records
  - Birth records
  - Death records
  - Marriage and divorce records
  - Adoption records
- Biographies and biographical profiles (e.g. Who's Who)
- Cemetery lists
- Census records
- Church and religious records
  - Baptism or christening
  - Brit milah or baby naming certificates
  - Confirmation
  - Bar or bat mitzvah
  - Marriage
  - Funeral or death
  - Membership
- City directories and telephone directories
- Coroner's reports
- Court records
  - Criminal records
  - Civil records
- Diaries, personal letters and family Bibles
- DNA tests
- Emigration, immigration and naturalization records
- Hereditary & lineage organization records, e.g. Daughters of the American Revolution records
- Land and property records, deeds
- Medical records
- Military and conscription records
- Newspaper articles
- Obituaries
- Occupational records
- Oral histories
- Passports
- Photographs
- Poorhouse, workhouse, almshouse, and asylum records
- School and alumni association records
- Ship passenger lists
- Social Security (within the US) and pension records
- Tax records
- Tombstones, cemetery records, and funeral home records
- Voter registration records
- Wills and probate records
for a more transparent electoral process and reformed many governmental institutions. Abundant petroleum and foreign private investment have helped make Gabon one of the most prosperous countries in Sub-Saharan Africa, with the fifth highest HDI in the region (after Mauritius, Seychelles, Botswana and South Africa) and the fifth highest GDP per capita (PPP) in all of Africa (after Seychelles, Mauritius, Equatorial Guinea and Botswana). Its GDP grew by more than 6% per year from 2010 to 2012. However, because of inequality in income distribution, a significant proportion of the population remains poor. Gabon is rich in folklore and mythology. "Raconteurs" keep traditions alive such as the mvett among the Fangs and the ingwala among the Nzebis. Gabon is also known for its masks, such as the n'goltang (Fang) and the reliquary figures of the Kota. Etymology Gabon's name originates from gabão, Portuguese for "cloak", which is roughly the shape of the estuary of the Komo River by Libreville. History Pre-Colonial Era (pre-1885) The earliest inhabitants of the area were Pygmy peoples. They were largely replaced and absorbed by Bantu tribes as they migrated. In the 15th century, the first Europeans arrived. By the 18th century, a Myeni-speaking kingdom known as Orungu formed in Gabon. Through its control of the slave trade in the 18th and 19th centuries, it was able to become the most powerful of the trading centers that developed in Gabon during that period. On 10 February 1722, Bartholomew Roberts, Barti Ddu, a Welsh pirate known in English as Black Bart, died at sea off Cape Lopez. He raided ships off the Americas and West Africa from 1719 to 1722. Colonial Era (1885–1960) French explorer Pierre Savorgnan de Brazza led his first mission to the Gabon-Congo area in 1875. He founded the town of Franceville, and was later colonial governor. Several Bantu groups lived in the area that is now Gabon when France officially occupied it in 1885. 
In 1910, Gabon became one of the four territories of French Equatorial Africa, a federation that survived until 1958. In World War II, the Allies invaded Gabon in order to overthrow the pro-Vichy France colonial administration. On 28 November 1958, Gabon became an autonomous republic within the French Community, and on 17 August 1960, it became fully independent. Post-Independence (1960–present) The first president of Gabon, elected in 1961, was Léon M'ba, with Omar Bongo Ondimba as his vice president. After M'ba's accession to power, the press was suppressed, political demonstrations banned, freedom of expression curtailed, other political parties gradually excluded from power, and the Constitution changed along French lines to vest power in the Presidency, a post that M'ba assumed himself. However, when M'ba dissolved the National Assembly in January 1964 to institute one-party rule, an army coup sought to oust him from power and restore parliamentary democracy. French paratroopers flew in within 24 hours to restore M'ba to power. After a few days of fighting, the coup ended and the opposition was imprisoned, despite widespread protests and riots. French soldiers still remain in the Camp de Gaulle on the outskirts of Gabon's capital to this day. When M'Ba died in 1967, Bongo replaced him as president. In March 1968, Bongo declared Gabon a one-party state by dissolving the BDG and establishing a new party—the Parti Democratique Gabonais (PDG). He invited all Gabonese, regardless of previous political affiliation, to participate. Bongo sought to forge a single national movement in support of the government's development policies, using the PDG as a tool to submerge the regional and tribal rivalries that had divided Gabonese politics in the past. Bongo was elected president in February 1975; in April 1975, the position of vice president was abolished and replaced by the position of prime minister, who had no right to automatic succession. 
Bongo was re-elected President in both December 1979 and November 1986 to 7-year terms. In early 1990 economic discontent and a desire for political liberalization provoked violent demonstrations and strikes by students and workers. In response to grievances by workers, Bongo negotiated with them on a sector-by-sector basis, making significant wage concessions. In addition, he promised to open up the PDG and to organize a national political conference in March–April 1990 to discuss Gabon's future political system. The PDG and 74 political organizations attended the conference. Participants essentially divided into two loose coalitions, the ruling PDG and its allies, and the United Front of Opposition Associations and Parties, consisting of the breakaway Morena Fundamental and the Gabonese Progress Party. The April 1990 conference approved sweeping political reforms, including creation of a national Senate, decentralization of the budgetary process, freedom of assembly and press, and cancellation of an exit visa requirement. In an attempt to guide the political system's transformation to multiparty democracy, Bongo resigned as PDG chairman and created a transitional government headed by a new Prime Minister, Casimir Oye-Mba. The Gabonese Social Democratic Grouping (RSDG), as the resulting government was called, was smaller than the previous government and included representatives from several opposition parties in its cabinet. The RSDG drafted a provisional constitution in May 1990 that provided a basic bill of rights and an independent judiciary but retained strong executive powers for the president. After further review by a constitutional committee and the National Assembly, this document came into force in March 1991. Opposition to the PDG continued after the April 1990 conference, however, and in September 1990, two coup d'état attempts were uncovered and aborted. 
Despite anti-government demonstrations after the untimely death of an opposition leader, the first multiparty National Assembly elections in almost 30 years took place in September–October 1990, with the PDG garnering a large majority. Following President Omar Bongo's re-election in December 1993 with 51% of the vote, opposition candidates refused to validate the election results. Serious civil disturbances and violent repression led to an agreement between the government and opposition factions to work toward a political settlement. These talks led to the Paris Accords in November 1994, under which several opposition figures were included in a government of national unity. This arrangement soon broke down, however, and the 1996 and 1997 legislative and municipal elections provided the background for renewed partisan politics. The PDG won a landslide victory in the legislative election, but several major cities, including Libreville, elected opposition mayors during the 1997 local election. Facing a divided opposition, President Omar Bongo coasted to easy re-election in December 1998, with large majorities of the vote. While Bongo's major opponents rejected the outcome as fraudulent, some international observers characterized the results as representative despite many perceived irregularities, and there were none of the civil disturbances that followed the 1993 election. Peaceful though flawed legislative elections held in 2001–2002, which were boycotted by a number of smaller opposition parties and were widely criticized for their administrative weaknesses, produced a National Assembly almost completely dominated by the PDG and allied independents. In November 2005 President Omar Bongo was elected for his sixth term. He won re-election easily, but opponents claim that the balloting process was marred by irregularities. There were some instances of violence following the announcement of his win, but Gabon generally remained peaceful. 
National Assembly elections were held again in December 2006. Several seats contested because of voting irregularities were overturned by the Constitutional Court, but the subsequent run-off elections in early 2007 again yielded a PDG-controlled National Assembly. On 8 June 2009, President Omar Bongo died of cardiac arrest at a Spanish hospital in Barcelona, ushering in a new era in Gabonese politics. In accordance with the amended constitution, Rose Francine Rogombé, the President of the Senate, became Interim President on 10 June 2009. The first contested elections in Gabon's history that did not include Omar Bongo as a candidate were held on 30 August 2009, with 18 candidates for president. The lead-up to the elections saw some isolated protests, but no significant disturbances. Omar Bongo's son, ruling party leader Ali Bongo Ondimba, was formally declared the winner after a 3-week review by the Constitutional Court; his inauguration took place on 16 October 2009. The court's review had been prompted by claims of fraud by the many opposition candidates, with the initial announcement of election results sparking unprecedented violent protests in Port-Gentil, the country's second-largest city and a long-time bastion of opposition to PDG rule. The citizens of Port-Gentil took to the streets, and numerous shops and residences were burned, including the French Consulate and a local prison. Officially, only four deaths occurred during the riots, but opposition and local leaders claim many more. Gendarmes and the military were deployed to Port-Gentil to support the beleaguered police, and a curfew was in effect for more than three months. A partial legislative by-election was held in June 2010. A newly created coalition of parties, the Union Nationale (UN), participated for the first time. The UN is composed largely of PDG defectors who left the party after Omar Bongo's death. 
Of the five hotly contested seats, the PDG won three and the UN won two; both sides claimed victory. In January 2019, there was an attempted coup d'état led by soldiers against President Ali Bongo; the coup ultimately failed. Government and politics Gabon is a republic with a presidential form of government under the 1961 constitution (revised in 1975, rewritten in 1991, and revised in 2003). The president is elected by universal suffrage for a seven-year term; a 2003 constitutional amendment removed presidential term limits and facilitated a presidency for life. The president can appoint and dismiss the prime minister, the cabinet, and judges of the independent Supreme Court. The president also has other strong powers, such as authority to dissolve the National Assembly, declare a state of siege, delay legislation, and conduct referenda. Gabon has a bicameral legislature with a National Assembly and Senate. The National Assembly has 120 deputies who are popularly elected for a 5-year term. The Senate is composed of 102 members who are elected by municipal councils and regional assemblies and serve
dolomite and limestone rocks. Some of the caves include Grotte du Lastoursville, Grotte du Lebamba, Grotte du Bongolo, and Grotte du Kessipougou. Many caves have not been explored yet. A National Geographic Expedition visited the caves in the summer of 2008 to document them. Gabon is also noted for efforts to preserve the natural environment. In 2002, President Omar Bongo Ondimba designated roughly 10% of the nation's territory to be part of its national park system (with 13 parks in total), one of the largest proportions of nature parkland in the world. The National Agency for National Parks manages Gabon's national park system. Gabon had a 2018 Forest Landscape Integrity Index mean score of 9.07/10, ranking it 9th globally out of 172 countries. Natural resources include petroleum, magnesium, iron, gold, uranium, and forests. Wildlife Economy Gabon's economy is dominated by oil. Oil revenues constitute roughly 46% of the government's budget, 43% of the gross domestic product (GDP), and 81% of exports. Oil production is currently declining rapidly from its high point of 370,000 barrels per day in 1997. Some estimates suggest that Gabonese oil will be expended by 2025. In spite of the decreasing oil revenues, planning is only now beginning for an after-oil scenario. The Grondin Oil Field was discovered offshore in 1971 and produces from the Batanga sandstones of Maastrichtian age, forming an anticline salt structural trap. Gabonese public expenditures from the years of significant oil revenues were not spent efficiently. Overspending on the Trans-Gabon Railway, the CFA franc devaluation of 1994, and periods of low oil prices caused serious debt problems that still plague the country. Gabon earned a poor reputation with the Paris Club and the International Monetary Fund (IMF) over the management of its debt and revenues.
Successive IMF missions have criticized the government for overspending on off-budget items (in good years and bad), over-borrowing from the Central Bank, and slipping on the schedule for privatization and administrative reform. However, in September 2005 Gabon successfully concluded a 15-month Stand-By Arrangement with the IMF. Another 3-year Stand-By Arrangement with the IMF was approved in May 2007. Because of the financial crisis and social developments surrounding the death of President Omar Bongo and the elections, Gabon was unable to meet its economic goals under the Stand-By Arrangement in 2009. Negotiations with the IMF were ongoing. Gabon's oil revenues have given it a per capita GDP of $8,600, unusually high for the region. However, a skewed income distribution and poor social indicators are evident. The richest 20% of the population earn over 90% of the income while about a third of the Gabonese population lives in poverty. The economy is highly dependent on extraction, but primary materials are abundant. Before the discovery of oil, logging was the pillar of the Gabonese economy. Today, logging and manganese mining are the next-most-important income generators. Recent explorations suggest the presence of the world's largest unexploited iron ore deposit. For many who live in rural areas without access to employment opportunity in extractive industries, remittances from family members in urban areas or subsistence activities provide income. Foreign and local observers have lamented the lack of diversity in the Gabonese economy. Various factors have so far limited the development of new industries:
- the market is small, about a million
- dependent on imports from France
- unable to capitalize on regional markets
- entrepreneurial zeal not always present among the Gabonese
- a fairly regular stream of oil "rent", even if it is diminishing
Further investment in the agricultural or tourism sectors is complicated by poor infrastructure.
The small processing and service sectors that do exist are largely dominated by a few prominent local investors. At World Bank and IMF insistence, the government embarked in the 1990s on a program of privatization of its state-owned companies and administrative reform, including reducing public sector employment and salary growth, but progress has been slow. The new government has voiced a commitment to work toward an economic transformation of the country but faces significant challenges to realize this goal. Demographics Gabon has a population of approximately million. Historical and environmental factors caused Gabon's population to decline between 1900 and 1940. Gabon has one of the lowest population densities of any country in Africa, and the fourth highest Human Development Index in Sub-Saharan Africa. Ethnic groups Almost all Gabonese are of Bantu origin. Gabon has at least forty ethnic groups with differing languages and cultures, including the Fang, Myènè, Punu-Échira, Nzebi-Adouma, Teke-Mbete, Mèmbè, Kota, and Akélé. There are also various indigenous Pygmy peoples: the Bongo and the Baka. The latter speak the only non-Bantu language in Gabon. More than 10,000 native French live in Gabon, including an estimated 2,000 dual nationals. Ethnic boundaries are less sharply drawn in Gabon than elsewhere in Africa. Most ethnicities are spread throughout Gabon, leading to constant contact and interaction among the groups, and there is no ethnic tension. One important reason for this is that intermarriage is extremely common and every Gabonese person is connected by blood to many different tribes. Indeed, intermarriage is often required because among many tribes, marriage within the same tribe is prohibited because it is regarded as incest. This is because those tribes consist of the descendants of a specific ancestor, and therefore all members of the tribe are regarded as close kin to each other. French, the language of its former colonial ruler, is a unifying force.
The Democratic Party of Gabon (PDG)'s historical dominance also has served to unite various ethnicities and local interests into a larger whole. Languages French is the country's sole official language. It is estimated that 80% of Gabon's population can speak French, and that 30% of Libreville residents are native speakers of the language. Nationally, Gabonese people speak their various mother tongues according to their ethnic group. The 2013 census found that only 63.7% of Gabon's population could speak a Gabonese language, broken down by 86.3% in rural areas and 60.5% in urban areas speaking at least one national language. In October 2012, just before the 14th summit of the Organisation internationale de la Francophonie, the country declared an intention to add English as a second official language, reportedly in response to an investigation by France into corruption in the country, though a government spokesman insisted it was for practical reasons only. It was later clarified that the country intended to introduce English as a first foreign language in schools, while keeping French as the general medium of instruction and the sole official language. Religion Major religions practiced in Gabon include Christianity (Roman Catholicism and Protestantism), Bwiti, Islam, and indigenous animistic religion. Many persons practice elements of both Christianity and traditional indigenous religious beliefs. Approximately 73 percent of residents practice at least some elements of Christianity, including the syncretistic Bwiti; 12 percent practice Islam; 10 percent practice traditional indigenous religious beliefs exclusively; and 5 percent practice no religion or are atheists. A vivid description of taboos and magic is provided by Schweitzer. Health Most of the health services of Gabon are public, but there are some private institutions, of which the best known is the hospital established in 1913 in Lambaréné by Albert Schweitzer.
Gabon's medical infrastructure is considered one of the best in West Africa. By 1985 there were 28 hospitals, 87 medical centers, and 312 infirmaries and dispensaries. There were an estimated 29 physicians per 100,000 people, and approximately 90% of the population had access to health care services. In 2000, 70% of the population had access to safe drinking water and 21% had adequate sanitation. A comprehensive government health program treats such diseases as leprosy, sleeping sickness, malaria, filariasis, intestinal worms, and tuberculosis. Rates for immunization of children under the age of one were 97% for tuberculosis and 65% for polio. Immunization rates for DPT and measles were 37% and 56% respectively. Gabon has a domestic supply of pharmaceuticals from a factory in Libreville. The total fertility rate has decreased from 5.8 in 1960 to 4.2 children per mother during childbearing years in 2000. Ten percent of all births were low birth weight. The maternal mortality rate was 520 per 100,000 live births as of 1998. In 2005, the infant mortality rate was 55.35 per 1,000 live births and life expectancy was 55.02 years. As of 2002, the overall mortality rate was estimated at 17.6 per 1,000 inhabitants. The HIV/AIDS prevalence is estimated to be 5.2% of the adult population (ages 15–49). Approximately 46,000 people were living with HIV/AIDS. There were an estimated 2,400 deaths from AIDS in 2009 – down from 3,000 deaths in 2003. Education Gabon's education system is regulated by two ministries: the Ministry of Education, in charge of pre-kindergarten through the last high school grade, and the Ministry of Higher Education and Innovative Technologies, in charge of universities, higher education, and professional schools. Education is compulsory for children ages 6 to 16 under the Education Act. Most children in Gabon start their school lives by attending nurseries or "Crèche", then kindergarten known as "Jardins d'Enfants".
At age six, they are enrolled in primary school, "École Primaire", which is made up of six grades. The next level is "École Secondaire", which is made up of seven grades. The planned graduation age is 19 years old. Those who graduate can apply for admission at institutions of higher learning, including engineering schools or business schools. As of 2012, the literacy rate of Gabon's population ages 15 and above was 82%. The government has used oil revenue for school construction, paying teachers' salaries, and promoting education, including in rural areas. However, maintenance of school structures, as well as teachers' salaries, has been declining. In 2002 the gross primary enrollment rate was 132 percent, and in 2000 the net primary enrollment rate was 78 percent. Gross and net enrollment ratios are based on the number of students formally registered in primary school and therefore do not necessarily reflect actual school attendance. As of 2001, 69 percent of children who started primary school were likely to reach grade 5. Problems in the education system include poor management and planning, lack of oversight, poorly qualified teachers, and overcrowded classrooms. Culture A country with a primarily oral tradition up until the spread of literacy in the 21st century, Gabon is rich in folklore and mythology. "Raconteurs" are currently working to keep alive traditions such as the mvett among the Fangs and the ingwala among the Nzebis. Gabon also features internationally celebrated masks, such as the n'goltang (Fang) and the reliquary figures of the Kota. Each group has its own set of masks used for various reasons. They are mostly used in traditional ceremonies such as marriage, birth and funerals. Traditionalists mainly work with rare local woods and other precious materials. Music Gabonese music is little known in comparison with regional giants like the Democratic Republic of the Congo and Cameroon.
The country boasts an array of folk styles, as well as pop stars like Patience Dabany and Annie-Flore Batchiellilys, a Gabonese singer and renowned live performer. Also known are guitarists like Georges Oyendze, La Rose Mbadou and Sylvain Avara, and the singer Oliver N'Goma. Imported rock and hip hop from the US and UK are popular in Gabon, as are rumba, makossa and soukous. Gabonese folk instruments include the obala, the ngombi, the balafon and traditional drums. Media Radio-Diffusion Télévision Gabonaise (RTG), which is owned and operated by the government, broadcasts in French and indigenous languages. Color television broadcasts have been introduced in major cities. In 1981, a commercial radio station, Africa No. 1, began operations. The most powerful radio station on the continent, it has participation from the French and Gabonese governments and private European media. In 2004, the government operated two radio stations and another seven were privately owned. There were also two government television stations and four privately owned. In 2003, there were an estimated 488 radios and 308 television sets for every 1,000 people. About 11.5 of every 1,000 people were cable subscribers. Also in 2003, there were 22.4 personal computers for every 1,000 people and 26 of every 1,000 people had access to the Internet. The national press service is the Gabonese Press Agency, which publishes a daily paper, Gabon-Matin (circulation 18,000 as of 2002). L'Union in Libreville, the government-controlled daily newspaper, had an average daily circulation of 40,000 in 2002. The weekly Gabon d'Aujourd'hui is published by the Ministry of Communications. There are about nine privately owned periodicals which are either independent or affiliated with political parties. These publish in small numbers and are often delayed by financial constraints. The constitution of Gabon provides for free speech and a free press, and the government supports these rights.
Several periodicals actively criticize the government and foreign publications are widely available. Cuisine Gabonese cuisine is influenced by French cuisine, but staple foods are also available. Sports The Gabon national football team has represented the nation since 1962. The Under-23 football team won the 2011 CAF U-23 Championship and qualified for the 2012 London Olympics. Gabon were joint hosts, along with Equatorial Guinea, of the 2012 Africa Cup of Nations, and the sole hosts of the competition's 2017 tournament. The Barcelona striker Pierre-Emerick Aubameyang plays for the Gabon national team. The Gabon national basketball team, nicknamed Les Panthères,