{{short description|Artificial neural network}}<br /> {{Other uses|CNN (disambiguation)}}<br /> {{More citations needed|date=June 2019}}<br /> {{Machine learning|Artificial neural network}}<br /> A '''convolutional neural network''' ('''CNN''') is a [[regularization (mathematics)|regularized]] type of [[feed-forward neural network]] that learns to perform [[feature engineering]] by itself via [[filter (signal processing)|filter]] (or kernel) optimization. Vanishing gradients and exploding gradients, seen during [[backpropagation]] in earlier neural networks, are prevented by using regularized weights over fewer connections.&lt;ref name=&quot;auto3&quot;&gt;{{cite book |last1=Venkatesan |first1=Ragav |url=https://books.google.com/books?id=bAM7DwAAQBAJ&amp;q=vanishing+gradient |title=Convolutional Neural Networks in Visual Computing: A Concise Guide |last2=Li |first2=Baoxin |date=2017-10-23 |publisher=CRC Press |isbn=978-1-351-65032-8 |language=en |access-date=2020-12-13 |archive-date=2023-10-16 |archive-url=https://web.archive.org/web/20231016190415/https://books.google.com/books?id=bAM7DwAAQBAJ&amp;q=vanishing+gradient#v=snippet&amp;q=vanishing%20gradient&amp;f=false |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;auto2&quot;&gt;{{cite book |last1=Balas |first1=Valentina E. |url=https://books.google.com/books?id=XRS_DwAAQBAJ&amp;q=exploding+gradient |title=Recent Trends and Advances in Artificial Intelligence and Internet of Things |last2=Kumar |first2=Raghvendra |last3=Srivastava |first3=Rajshree |date=2019-11-19 |publisher=Springer Nature |isbn=978-3-030-32644-9 |language=en |access-date=2020-12-13 |archive-date=2023-10-16 |archive-url=https://web.archive.org/web/20231016190414/https://books.google.com/books?id=XRS_DwAAQBAJ&amp;q=exploding+gradient#v=snippet&amp;q=exploding%20gradient&amp;f=false |url-status=live }}&lt;/ref&gt; For example, ''each'' neuron in a fully connected layer would require 10,000 weights to process an image of size 100 × 100 pixels (see the short calculation below).
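This comparison of parameter counts can be checked with a short, illustrative calculation (plain Python; the layer and kernel sizes are simply the ones quoted in this paragraph, not taken from any particular implementation):

&lt;syntaxhighlight lang=&quot;python&quot;&gt;
# Weights needed by a single fully connected neuron that sees an entire
# 100 x 100 grayscale image: one weight per input pixel.
image_height, image_width = 100, 100
print(image_height * image_width)   # 10000 weights for each such neuron

# Weights in one 5 x 5 convolution kernel that is shared across every
# position of the same image (biases ignored in both counts).
kernel_size = 5
print(kernel_size * kernel_size)    # 25 shared weights
&lt;/syntaxhighlight&gt;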
However, applying cascaded ''convolution'' (or cross-correlation) kernels,&lt;ref&gt;{{Cite journal|last1=Zhang|first1=Yingjie|last2=Soon|first2=Hong Geok|last3=Ye|first3=Dongsen|last4=Fuh|first4=Jerry Ying Hsi|last5=Zhu|first5=Kunpeng|date=September 2020|title=Powder-Bed Fusion Process Monitoring by Machine Vision With Hybrid Convolutional Neural Networks|url=https://ieeexplore.ieee.org/document/8913613|journal=IEEE Transactions on Industrial Informatics|volume=16|issue=9|pages=5769–5779|doi=10.1109/TII.2019.2956078|s2cid=213010088|issn=1941-0050|access-date=2023-08-12|archive-date=2023-07-31|archive-url=https://web.archive.org/web/20230731120013/https://ieeexplore.ieee.org/document/8913613/|url-status=live}}&lt;/ref&gt;&lt;ref&gt;{{Cite journal|last1=Chervyakov|first1=N.I.|last2=Lyakhov|first2=P.A.|last3=Deryabin|first3=M.A.|last4=Nagornov|first4=N.N.|last5=Valueva|first5=M.V.|last6=Valuev|first6=G.V.|date=September 2020|title=Residue Number System-Based Solution for Reducing the Hardware Cost of a Convolutional Neural Network|url=https://linkinghub.elsevier.com/retrieve/pii/S092523122030583X|journal=Neurocomputing|language=en|volume=407|pages=439–453|doi=10.1016/j.neucom.2020.04.018|s2cid=219470398|quote=Convolutional neural networks represent deep learning architectures that are currently used in a wide range of applications, including computer vision, speech recognition, time series analysis in finance, and many others.|access-date=2023-08-12|archive-date=2023-06-29|archive-url=https://web.archive.org/web/20230629155646/https://linkinghub.elsevier.com/retrieve/pii/S092523122030583X|url-status=live}}&lt;/ref&gt; only 25 neurons are required to process 5×5-sized tiles.&lt;ref name=&quot;auto1&quot;&gt;{{cite book |title=Guide to convolutional neural networks : a practical application to traffic-sign detection and classification |last=Habibi |first=Aghdam, Hamed |others=Heravi, Elnaz Jahani |isbn=9783319575490 |location=Cham, Switzerland |oclc=987790957 |date=2017-05-30}}&lt;/ref&gt;&lt;ref&gt;{{Cite journal|last=Atlas, Homma, and Marks|title=An Artificial Neural Network for Spatio-Temporal Bipolar Patterns: Application to Phoneme Classification|url=https://papers.nips.cc/paper/1987/file/98f13708210194c475687be6106a3b84-Paper.pdf |archive-url=https://web.archive.org/web/20210414091306/https://papers.nips.cc/paper/1987/file/98f13708210194c475687be6106a3b84-Paper.pdf |archive-date=2021-04-14 |url-status=live|journal=Neural Information Processing Systems (NIPS 1987)|volume=1}}&lt;/ref&gt; Higher-layer features are extracted from wider context windows, compared to lower-layer features.<br /> <br /> They have applications in: <br /> * [[computer vision|image and video recognition]],&lt;ref name=&quot;Valueva Nagornov Lyakhov Valuev 2020 pp. 232–243&quot;&gt;{{cite journal |last1=Valueva |first1=M.V. |last2=Nagornov |first2=N.N. |last3=Lyakhov |first3=P.A. |last4=Valuev |first4=G.V. |last5=Chervyakov |first5=N.I.
|title=Application of the residue number system to reduce hardware costs of the convolutional neural network implementation |journal=Mathematics and Computers in Simulation |publisher=Elsevier BV |volume=177 |year=2020 |issn=0378-4754 |doi=10.1016/j.matcom.2020.04.031 |pages=232–243 |s2cid=218955622 |quote=Convolutional neural networks are a promising tool for solving the problem of pattern recognition.}}&lt;/ref&gt;<br /> * [[recommender system]]s,&lt;ref&gt;{{cite book |url=https://proceedings.neurips.cc/paper/2013/file/b3ba8f1bee1238a2f37603d90b58898d-Paper.pdf |title=Deep content-based music recommendation |last1=van den Oord |first1=Aaron |last2=Dieleman |first2=Sander |last3=Schrauwen |first3=Benjamin |date=2013-01-01 |publisher=Curran Associates, Inc. |editor-last=Burges |editor-first=C. J. C. |pages=2643–2651 |editor-last2=Bottou |editor-first2=L. |editor-last3=Welling |editor-first3=M. |editor-last4=Ghahramani |editor-first4=Z. |editor-last5=Weinberger |editor-first5=K. Q. |access-date=2022-03-31 |archive-date=2022-03-07 |archive-url=https://web.archive.org/web/20220307172303/https://proceedings.neurips.cc/paper/2013/file/b3ba8f1bee1238a2f37603d90b58898d-Paper.pdf |url-status=live }}&lt;/ref&gt; <br /> * [[image classification]],<br /> * [[image segmentation]], <br /> * [[medical image computing|medical image analysis]], <br /> * [[natural language processing]],&lt;ref&gt;{{cite book |last1=Collobert |first1=Ronan |last2=Weston |first2=Jason |title=Proceedings of the 25th international conference on Machine learning - ICML '08 |chapter=A unified architecture for natural language processing |date=2008-01-01 |location=New York, NY, USA |publisher=ACM |pages=160–167 |doi=10.1145/1390156.1390177 |isbn=978-1-60558-205-4 |s2cid=2617020}}&lt;/ref&gt;<br /> * [[brain–computer interface]]s,&lt;ref&gt;{{cite book |last1=Avilov |first1=Oleksii |last2=Rimbert |first2=Sebastien |last3=Popov |first3=Anton |last4=Bougrain |first4=Laurent |title=2020 42nd Annual International Conference of the IEEE Engineering in Medicine &amp; Biology Society (EMBC) |chapter=Deep Learning Techniques to Improve Intraoperative Awareness Detection from Electroencephalographic Signals |date=July 2020 |chapter-url=https://ieeexplore.ieee.org/document/9176228 |volume=2020 |location=Montreal, QC, Canada |publisher=IEEE |pages=142–145 |doi=10.1109/EMBC44109.2020.9176228 |pmid=33017950 |isbn=978-1-7281-1990-8 |s2cid=221386616 |url=https://hal.inria.fr/hal-02920320/file/Avilov_EMBC2020.pdf |access-date=2023-07-21 |archive-date=2022-05-19 |archive-url=https://web.archive.org/web/20220519135428/https://hal.inria.fr/hal-02920320/file/Avilov_EMBC2020.pdf |url-status=live }}&lt;/ref&gt; and <br /> * financial [[time series]].&lt;ref name=&quot;Tsantekidis 7–12&quot;&gt;{{cite book |last1=Tsantekidis |first1=Avraam |last2=Passalis |first2=Nikolaos |last3=Tefas |first3=Anastasios |last4=Kanniainen |first4=Juho |last5=Gabbouj |first5=Moncef |last6=Iosifidis |first6=Alexandros |title=2017 IEEE 19th Conference on Business Informatics (CBI) |chapter=Forecasting Stock Prices from the Limit Order Book Using Convolutional Neural Networks |date=July 2017 |location=Thessaloniki, Greece |publisher=IEEE |pages=7–12 |doi=10.1109/CBI.2017.23 |isbn=978-1-5386-3035-8 |s2cid=4950757}}&lt;/ref&gt;<br /> <br /> CNNs are also known as '''Shift Invariant''' or '''Space Invariant Artificial Neural Networks''' ('''SIANN'''), based on the shared-weight architecture of the [[convolution]] kernels or
filters that slide along input features and provide translation-[[equivariant map|equivariant]] responses known as feature maps.&lt;ref name=&quot;:0&quot;&gt;{{cite journal |last=Zhang |first=Wei |date=1988 |title=Shift-invariant pattern recognition neural network and its optical architecture |url=https://drive.google.com/file/d/1nN_5odSG_QVae54EsQN_qSz-0ZsX6wA0/view?usp=sharing |journal=Proceedings of Annual Conference of the Japan Society of Applied Physics |access-date=2020-06-22 |archive-date=2020-06-23 |archive-url=https://web.archive.org/web/20200623051222/https://drive.google.com/file/d/1nN_5odSG_QVae54EsQN_qSz-0ZsX6wA0/view?usp=sharing |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;:1&quot;&gt;{{cite journal |last=Zhang |first=Wei |date=1990 |title=Parallel distributed processing model with local space-invariant interconnections and its optical architecture |url=https://drive.google.com/file/d/0B65v6Wo67Tk5ODRzZmhSR29VeDg/view?usp=sharing |journal=Applied Optics |volume=29 |issue=32 |pages=4790–7 |doi=10.1364/AO.29.004790 |pmid=20577468 |bibcode=1990ApOpt..29.4790Z |access-date=2016-09-22 |archive-date=2017-02-06 |archive-url=https://web.archive.org/web/20170206111407/https://drive.google.com/file/d/0B65v6Wo67Tk5ODRzZmhSR29VeDg/view?usp=sharing |url-status=live }}&lt;/ref&gt; Counter-intuitively, most convolutional neural networks are not [[translation invariant|invariant to translation]], due to the downsampling operation they apply to the input.&lt;ref name=&quot;:6&quot;&gt;{{cite book |last1=Mouton |first1=Coenraad |last2=Myburgh |first2=Johannes C. |last3=Davel |first3=Marelie H. |title=Artificial Intelligence Research |chapter=Stride and Translation Invariance in CNNS |date=2020 |editor-last=Gerber |editor-first=Aurona |chapter-url=https://link.springer.com/chapter/10.1007%2F978-3-030-66151-9_17 |series=Communications in Computer and Information Science |volume=1342 |language=en |location=Cham |publisher=Springer International Publishing |pages=267–281 |doi=10.1007/978-3-030-66151-9_17 |arxiv=2103.10097 |isbn=978-3-030-66151-9 |s2cid=232269854 |access-date=2021-03-26 |archive-date=2021-06-27 |archive-url=https://web.archive.org/web/20210627074505/https://link.springer.com/chapter/10.1007%2F978-3-030-66151-9_17 |url-status=live }}&lt;/ref&gt;<br /> <br /> [[Feed-forward neural network]]s are usually fully connected networks, that is, each neuron in one [[Layer (deep learning)|layer]] is connected to all neurons in the next [[layer (deep learning)|layer]]. The &quot;full connectivity&quot; of these networks makes them prone to [[overfitting]] data. Typical ways of regularization, or preventing overfitting, include penalizing parameters during training (such as weight decay) or trimming connectivity (skipped connections, dropout, etc.).
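A minimal sketch of two of these regularizers, assuming PyTorch (the layer sizes and hyperparameter values below are illustrative only, not taken from any particular model):

&lt;syntaxhighlight lang=&quot;python&quot;&gt;
from torch import nn, optim

# Dropout trims connectivity at training time by randomly zeroing activations.
model = nn.Sequential(
    nn.Linear(100, 50),
    nn.ReLU(),
    nn.Dropout(p=0.5),   # illustrative dropout probability
    nn.Linear(50, 10),
)

# Weight decay penalizes large parameters during training (L2 regularization).
optimizer = optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
&lt;/syntaxhighlight&gt;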
Robust datasets also increase the probability that CNNs will learn the generalized principles that characterize a given dataset rather than the biases of a poorly populated set.&lt;ref&gt;{{Cite journal |last=Kurtzman |first=Thomas |date=August 20, 2019 |title=Hidden bias in the DUD-E dataset leads to misleading performance of deep learning in structure-based virtual screening |journal=PLOS ONE|volume=14 |issue=8 |pages=e0220113 |doi=10.1371/journal.pone.0220113 |pmid=31430292 |pmc=6701836 |bibcode=2019PLoSO..1420113C |doi-access=free }}&lt;/ref&gt; <br /> <br /> Convolutional networks were [[mathematical biology|inspired]] by [[biological]] processes&lt;ref name=fukuneoscholar/&gt;&lt;ref name=&quot;hubelwiesel1968&quot;/&gt;&lt;ref name=&quot;intro&quot;/&gt;&lt;ref name=&quot;robust face detection&quot;&gt;{{cite journal |last=Matusugu |first=Masakazu |year=2003 |title=Subject independent facial expression recognition with robust face detection using a convolutional neural network |url=http://www.iro.umontreal.ca/~pift6080/H09/documents/papers/sparse/matsugo_etal_face_expression_conv_nnet.pdf |journal=Neural Networks |volume=16 |issue=5 |pages=555–559 |doi=10.1016/S0893-6080(03)00115-1 |pmid=12850007 |author2=Katsuhiko Mori |author3=Yusuke Mitari |author4=Yuji Kaneda |access-date=17 November 2013 |archive-date=13 December 2013 |archive-url=https://web.archive.org/web/20131213022740/http://www.iro.umontreal.ca/~pift6080/H09/documents/papers/sparse/matsugo_etal_face_expression_conv_nnet.pdf |url-status=live }}&lt;/ref&gt; in that the connectivity pattern between [[artificial neuron|neurons]] resembles the organization of the animal [[visual cortex]]. Individual [[cortical neuron]]s respond to stimuli only in a restricted region of the [[visual field]] known as the [[receptive field]]. The receptive fields of different neurons partially overlap such that they cover the entire visual field.<br /> <br /> CNNs use relatively little pre-processing compared to other [[image classification|image classification algorithms]]. This means that the network learns to optimize the [[filter (signal processing)|filters]] (or kernels) through automated learning, whereas in traditional algorithms these filters are [[feature engineering|hand-engineered]]. This independence from prior knowledge and human intervention in feature extraction is a major advantage.{{To whom?&lt;!--e.g. to the programmers? the users? the CNN? --&gt;|date=April 2023}}<br /> <br /> {{TOC limit|3}}<br /> <br /> == Architecture ==<br /> [[File:Comparison image neural networks.svg|thumb|480px|Comparison of the LeNet and AlexNet convolution, pooling and dense layers&lt;br&gt;(AlexNet image size should be 227×227×3, instead of 224×224×3, so the math will come out right. The original paper said different numbers, but Andrej Karpathy, the head of computer vision at Tesla, said it should be 227×227×3 (he said Alex didn't describe why he put 224×224×3). The next convolution should be 11×11 with stride 4: 55×55×96 (instead of 54×54×96). It would be calculated, for example, as: [(input width 227 - kernel width 11) / stride 4] + 1 = [(227 - 11) / 4] + 1 = 55. Since the kernel output is the same length as width, its area is 55×55.)]]<br /> {{Main|Layer (deep learning)}}<br /> A convolutional neural network consists of an input layer, [[Artificial neural network#Organization|hidden layers]] and an output layer. In a convolutional neural network, the hidden layers include one or more layers that perform convolutions.
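A minimal sketch of such a layer stack, assuming PyTorch (the channel counts, kernel sizes and the 28 × 28 single-channel input are arbitrary illustrations, not a reference architecture):

&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

# Input, hidden convolutional and pooling layers, and a fully connected output layer.
cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # convolutional hidden layer
    nn.ReLU(),
    nn.MaxPool2d(2),                             # pooling layer
    nn.Conv2d(8, 16, kernel_size=3, padding=1),  # second convolutional layer
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 10),                   # output layer with 10 classes
)

scores = cnn(torch.randn(1, 1, 28, 28))          # output shape: (1, 10)
&lt;/syntaxhighlight&gt;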
Typically this includes a layer that performs a [[dot product]] of the convolution kernel with the layer's input matrix. This product is usually the [[Frobenius inner product]], and its activation function is commonly [[rectifier (neural networks)|ReLU]]. As the convolution kernel slides along the input matrix for the layer, the convolution operation generates a feature map, which in turn contributes to the input of the next layer. This is followed by other layers such as pooling layers, fully connected layers, and normalization layers.<br /> <br /> === Convolutional layers ===<br /> In a CNN, the input is a [[Tensor (machine learning)|tensor]] with shape: (number of inputs) × (input height) × (input width) × (input [[channel (digital image)|channels]]). After passing through a convolutional layer, the image becomes abstracted to a feature map, also called an activation map, with shape: (number of inputs) × (feature map height) × (feature map width) × (feature map [[channel (digital image)|channels]]).<br /> <br /> Convolutional layers convolve the input and pass its result to the next layer. This is similar to the response of a neuron in the visual cortex to a specific stimulus.&lt;ref name=&quot;deeplearning&quot;&gt;{{cite web |title=Convolutional Neural Networks (LeNet) – DeepLearning 0.1 documentation |url=http://deeplearning.net/tutorial/lenet.html |work=DeepLearning 0.1 |publisher=LISA Lab |access-date=31 August 2013 |archive-date=28 December 2017 |archive-url=https://web.archive.org/web/20171228091645/http://deeplearning.net/tutorial/lenet.html |url-status=dead }}&lt;/ref&gt; Each convolutional neuron processes data only for its [[receptive field]]. Although [[multilayer perceptron|fully connected feedforward neural networks]] can be used to learn features and classify data, this architecture is generally impractical for larger inputs (e.g., high-resolution images), which would require massive numbers of neurons because each pixel is a relevant input feature. A fully connected layer for an image of size 100 × 100 has 10,000 weights for ''each'' neuron in the second layer. Convolution reduces the number of free parameters, allowing the network to be deeper.&lt;ref name=&quot;auto1&quot;/&gt; For example, using a 5 × 5 tiling region, each with the same shared weights, requires only 25 neurons. Using regularized weights over fewer parameters avoids the vanishing gradients and exploding gradients problems seen during [[backpropagation]] in earlier neural networks.&lt;ref name=&quot;auto3&quot;/&gt;&lt;ref name=&quot;auto2&quot;/&gt; <br /> <br /> To speed processing, standard convolutional layers can be replaced by depthwise separable convolutional layers,&lt;ref&gt;{{Cite arXiv |last=Chollet |first=François |date=2017-04-04 |title=Xception: Deep Learning with Depthwise Separable Convolutions |class=cs.CV |eprint=1610.02357 }}&lt;/ref&gt; which are based on a depthwise convolution followed by a pointwise convolution. The ''depthwise convolution'' is a spatial convolution applied independently over each channel of the input tensor, while the ''pointwise convolution'' is a standard convolution restricted to the use of &lt;math&gt;1\times1&lt;/math&gt; kernels.<br /> <br /> === Pooling layers ===<br /> Convolutional networks may include local and/or global pooling layers along with traditional convolutional layers. Pooling layers reduce the dimensions of data by combining the outputs of neuron clusters at one layer into a single neuron in the next layer. 
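As a concrete illustration of this combining step, a minimal NumPy sketch of non-overlapping 2 × 2 pooling (covering both of the variants described next) might look like this:

&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import numpy as np

def pool2x2(feature_map, mode='max'):
    # Combine each non-overlapping 2 x 2 cluster of neuron outputs into a
    # single value, halving the height and width of the feature map.
    h, w = feature_map.shape
    tiles = feature_map.reshape(h // 2, 2, w // 2, 2)
    if mode == 'max':
        return tiles.max(axis=(1, 3))
    return tiles.mean(axis=(1, 3))

fm = np.arange(16.0).reshape(4, 4)
print(pool2x2(fm, 'max'))    # maximum of each 2 x 2 cluster
print(pool2x2(fm, 'mean'))   # average of each 2 x 2 cluster
&lt;/syntaxhighlight&gt;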
Local pooling combines small clusters; tiling sizes such as 2 × 2 are commonly used. Global pooling acts on all the neurons of the feature map.&lt;ref name=&quot;flexible&quot;/&gt;&lt;ref&gt;{{cite web |last=[[Alex Krizhevsky |Krizhevsky]] |first=Alex |title=ImageNet Classification with Deep Convolutional Neural Networks |url=https://image-net.org/static_files/files/supervision.pdf |access-date=17 November 2013 |archive-date=25 April 2021 |archive-url=https://web.archive.org/web/20210425025127/http://www.image-net.org/static_files/files/supervision.pdf |url-status=live }}&lt;/ref&gt; There are two types of pooling in common use: max and average. ''Max pooling'' uses the maximum value of each local cluster of neurons in the feature map,&lt;ref name=Yamaguchi111990&gt;{{cite conference |title=A Neural Network for Speaker-Independent Isolated Word Recognition |last1=Yamaguchi |first1=Kouichi |last2=Sakamoto |first2=Kenji |last3=Akabane |first3=Toshio |last4=Fujimoto |first4=Yoshiji |date=November 1990 |location=Kobe, Japan |conference=First International Conference on Spoken Language Processing (ICSLP 90) |url=https://www.isca-speech.org/archive/icslp_1990/i90_1077.html |access-date=2019-09-04 |archive-date=2021-03-07 |archive-url=https://web.archive.org/web/20210307233750/https://www.isca-speech.org/archive/icslp_1990/i90_1077.html |url-status=dead }}&lt;/ref&gt;&lt;ref name=&quot;mcdns&quot;&gt;{{cite book |last1=Ciresan |first1=Dan |first2=Ueli |last2=Meier |first3=Jürgen |last3=Schmidhuber |title=2012 IEEE Conference on Computer Vision and Pattern Recognition |chapter=Multi-column deep neural networks for image classification |date=June 2012 |pages=3642–3649 |doi=10.1109/CVPR.2012.6248110 |arxiv=1202.2745 |isbn=978-1-4673-1226-4 |oclc=812295155 |publisher=[[Institute of Electrical and Electronics Engineers]] (IEEE) |location=New York, NY |citeseerx=10.1.1.300.3283 |s2cid=2161592}}&lt;/ref&gt; while ''average pooling'' takes the average value.<br /> <br /> === Fully connected layers ===<br /> <br /> Fully connected layers connect every neuron in one layer to every neuron in another layer. This is the same as in a traditional [[multilayer perceptron]] neural network (MLP). The flattened matrix goes through a fully connected layer to classify the images.<br /> <br /> === Receptive field ===<br /> In neural networks, each neuron receives input from some number of locations in the previous layer. In a convolutional layer, each neuron receives input from only a restricted area of the previous layer called the neuron's ''receptive field''. Typically the area is a square (e.g. 5 by 5 neurons). In a fully connected layer, by contrast, the receptive field is the ''entire previous layer''. Thus, in each convolutional layer, each neuron takes input from a larger area in the input than in previous layers. This is due to applying the convolution over and over, which takes into account the value of a pixel as well as its surrounding pixels. When using dilated layers, the number of pixels in the receptive field remains constant, but the field is more sparsely populated as its dimensions grow when combining the effect of several layers.<br /> <br /> To manipulate the receptive field size as desired, there are some alternatives to the standard convolutional layer.
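Before turning to those alternatives, how the receptive field of a stack of stride-1 convolutions grows with depth and dilation can be worked out with a small illustrative calculation (the helper function below is a hypothetical sketch, not part of any library):

&lt;syntaxhighlight lang=&quot;python&quot;&gt;
def receptive_field(kernel_sizes, dilations=None):
    # Receptive field of a stack of stride-1 convolutions: each layer adds
    # (effective_kernel - 1) input positions, where the effective kernel of
    # a dilated convolution with dilation d and kernel k is d * (k - 1) + 1.
    dilations = dilations or [1] * len(kernel_sizes)
    rf = 1
    for k, d in zip(kernel_sizes, dilations):
        rf += d * (k - 1)
    return rf

print(receptive_field([3, 3, 3]))             # 7: three plain 3x3 layers
print(receptive_field([3, 3, 3], [1, 2, 4]))  # 15: same layers, dilated
&lt;/syntaxhighlight&gt;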
For example, atrous or dilated convolution&lt;ref&gt;{{Cite arXiv|last1=Yu |first1=Fisher |last2=Koltun |first2=Vladlen |date=2016-04-30 |title=Multi-Scale Context Aggregation by Dilated Convolutions |class=cs.CV |eprint=1511.07122 }}&lt;/ref&gt;&lt;ref&gt;{{Cite arXiv|last1=Chen |first1=Liang-Chieh |last2=Papandreou |first2=George |last3=Schroff |first3=Florian |last4=Adam |first4=Hartwig |date=2017-12-05 |title=Rethinking Atrous Convolution for Semantic Image Segmentation |class=cs.CV |eprint=1706.05587 }}&lt;/ref&gt; expands the receptive field size without increasing the number of parameters by interleaving visible and blind regions. Moreover, a single dilated convolutional layer can comprise filters with multiple dilation ratios,&lt;ref&gt;{{Cite arXiv|last1=Duta |first1=Ionut Cosmin |last2=Georgescu |first2=Mariana Iuliana |last3=Ionescu |first3=Radu Tudor |date=2021-08-16 |title=Contextual Convolutional Neural Networks |class=cs.CV |eprint=2108.07387 }}&lt;/ref&gt; thus having a variable receptive field size.<br /> <br /> === Weights ===<br /> Each neuron in a neural network computes an output value by applying a specific function to the input values received from the receptive field in the previous layer. The function that is applied to the input values is determined by a vector of weights and a bias (typically real numbers). Learning consists of iteratively adjusting these biases and weights.<br /> <br /> The vectors of weights and biases are called ''filters'' and represent particular [[feature (machine learning)|feature]]s of the input (e.g., a particular shape). A distinguishing feature of CNNs is that many neurons can share the same filter. This reduces the [[memory footprint]] because a single bias and a single vector of weights are used across all receptive fields that share that filter, as opposed to each receptive field having its own bias and vector weighting.&lt;ref name=&quot;LeCun&quot;&gt;{{cite web |url=http://yann.lecun.com/exdb/lenet/ |title=LeNet-5, convolutional neural networks |last=LeCun |first=Yann |access-date=16 November 2013 |archive-date=24 February 2021 |archive-url=https://web.archive.org/web/20210224225707/http://yann.lecun.com/exdb/lenet/ |url-status=live }}&lt;/ref&gt;<br /> <br /> == History ==<br /> <br /> CNN are often compared to the way the brain achieves vision processing in living [[organisms]].&lt;ref name=&quot;auto&quot;&gt;{{cite news |url=https://becominghuman.ai/from-human-vision-to-computer-vision-convolutional-neural-network-part3-4-24b55ffa7045 |title=From Human Vision to Computer Vision — Convolutional Neural Network(Part3/4) |first=Puttatida |last=Mahapattanakul |date=November 11, 2019 |website=Medium |access-date=May 25, 2021 |archive-date=May 25, 2021 |archive-url=https://web.archive.org/web/20210525073017/https://becominghuman.ai/from-human-vision-to-computer-vision-convolutional-neural-network-part3-4-24b55ffa7045 |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{Cite journal |last1=van Dyck |first1=Leonard Elia |last2=Kwitt |first2=Roland |last3=Denzler |first3=Sebastian Jochen |last4=Gruber |first4=Walter Roland |date=2021 |title=Comparing Object Recognition in Humans and Deep Convolutional Neural Networks—An Eye Tracking Study |journal=Frontiers in Neuroscience |volume=15 |page=750639 |doi=10.3389/fnins.2021.750639 |pmid=34690686 |pmc=8526843 |issn=1662-453X |doi-access=free }}&lt;/ref&gt;<br /> <br /> === Receptive fields in the visual cortex ===<br /> Work by [[David H. 
Hubel|Hubel]] and [[Torsten Wiesel|Wiesel]] in the 1950s and 1960s showed that cat [[visual cortex|visual cortices]] contain neurons that individually respond to small regions of the [[visual field]]. Provided the eyes are not moving, the region of visual space within which visual stimuli affect the firing of a single neuron is known as its [[receptive field]].&lt;ref name=&quot;:4&quot;/&gt; Neighboring cells have similar and overlapping receptive fields. &lt;ref name=&quot;auto&quot;/&gt; Receptive field size and location varies systematically across the cortex to form a complete map of visual space. &lt;ref name=&quot;auto&quot;/&gt;{{citation needed|date=October 2017}} The cortex in each hemisphere represents the contralateral [[visual field]].{{citation needed|date=October 2017}}<br /> <br /> Their 1968 paper identified two basic visual cell types in the brain:&lt;ref name=&quot;hubelwiesel1968&quot;&gt;{{cite journal |title=Receptive fields and functional architecture of monkey striate cortex |journal=The Journal of Physiology |date=1968-03-01 |issn=0022-3751 |pmc=1557912 |pmid=4966457 |pages=215–243 |volume=195 |issue=1 |first1=D. H. |last1=Hubel |first2=T. N. |last2=Wiesel |doi=10.1113/jphysiol.1968.sp008455}}&lt;/ref&gt;<br /> <br /> *[[simple cell]]s, whose output is maximized by straight edges having particular orientations within their receptive field<br /> *[[complex cell]]s, which have larger [[receptive field]]s, whose output is insensitive to the exact position of the edges in the field.<br /> <br /> Hubel and Wiesel also proposed a cascading model of these two types of cells for use in pattern recognition tasks.&lt;ref&gt;{{cite book<br /> |title=Brain and visual perception: the story of a 25-year collaboration<br /> |author=David H. Hubel and Torsten N. Wiesel<br /> |publisher=Oxford University Press US<br /> |year=2005<br /> |isbn=978-0-19-517618-6<br /> |page=106<br /> |url=https://books.google.com/books?id=8YrxWojxUA4C&amp;pg=PA106<br /> |access-date=2019-01-18<br /> |archive-date=2023-10-16<br /> |archive-url=https://web.archive.org/web/20231016190414/https://books.google.com/books?id=8YrxWojxUA4C&amp;pg=PA106#v=onepage&amp;q&amp;f=false<br /> |url-status=live<br /> }}&lt;/ref&gt;&lt;ref name=&quot;:4&quot;&gt;{{cite journal |pmc=1363130 |pmid=14403679 |volume=148 |issue=3 |title=Receptive fields of single neurones in the cat's striate cortex |date=October 1959 |journal=J. Physiol. |pages=574–91 |last1=Hubel |first1=DH |last2=Wiesel |first2=TN |doi=10.1113/jphysiol.1959.sp006308}}&lt;/ref&gt;<br /> <br /> === Neocognitron, origin of the CNN architecture ===<br /> <br /> The &quot;[[neocognitron]]&quot;&lt;ref name=fukuneoscholar&gt;{{cite journal |last1=Fukushima |first1=K. 
|year=2007 |title=Neocognitron |journal=Scholarpedia |volume=2 |issue=1 |page=1717 |doi=10.4249/scholarpedia.1717 |bibcode=2007SchpJ...2.1717F |doi-access=free}}&lt;/ref&gt; was introduced by [[Kunihiko Fukushima]] in 1980.&lt;ref name=&quot;intro&quot;&gt;{{cite journal |last=Fukushima |first=Kunihiko |title=Neocognitron: A Self-organizing Neural Network Model for a Mechanism of Pattern Recognition Unaffected by Shift in Position |journal=Biological Cybernetics |year=1980 |volume=36 |issue=4 |pages=193–202 |url=https://www.cs.princeton.edu/courses/archive/spr08/cos598B/Readings/Fukushima1980.pdf |access-date=16 November 2013 |doi=10.1007/BF00344251 |pmid=7370364 |s2cid=206775608 |archive-date=3 June 2014 |archive-url=https://web.archive.org/web/20140603013137/http://www.cs.princeton.edu/courses/archive/spr08/cos598B/Readings/Fukushima1980.pdf |url-status=live }}&lt;/ref&gt;&lt;ref name=mcdns/&gt;&lt;ref&gt;{{cite journal |first1=Yann |last1=LeCun |first2=Yoshua |last2=Bengio |first3=Geoffrey |last3=Hinton |title=Deep learning |journal=Nature |volume=521 |issue=7553 |year=2015 |pages=436–444 |doi=10.1038/nature14539 |pmid=26017442 |bibcode=2015Natur.521..436L |s2cid=3074096}}&lt;/ref&gt;<br /> It was inspired by the above-mentioned work of Hubel and Wiesel. The neocognitron introduced the two basic types of layers in CNNs: convolutional layers, and downsampling layers. A convolutional layer contains units whose receptive fields cover a patch of the previous layer. The weight vector (the set of adaptive parameters) of such a unit is often called a filter. Units can share filters. Downsampling layers contain units whose receptive fields cover patches of previous convolutional layers. Such a unit typically computes the average of the activations of the units in its patch. This downsampling helps to correctly classify objects in visual scenes even when the objects are shifted.<br /> <br /> In 1969, [[Kunihiko Fukushima]] also introduced the [[rectifier (neural networks)|ReLU]] (rectified linear unit) [[activation function]].&lt;ref name=&quot;Fukushima1969&quot;&gt;{{cite journal |first1=K. |last1=Fukushima |title=Visual feature extraction by a multilayered network of analog threshold elements |journal=IEEE Transactions on Systems Science and Cybernetics |volume=5 |issue=4 |date=1969 |pages=322–333 |doi=10.1109/TSSC.1969.300225}}&lt;/ref&gt;&lt;ref name=DLhistory&gt;{{cite arXiv|last=Schmidhuber|first=Juergen|author-link=Juergen Schmidhuber|date=2022|title=Annotated History of Modern AI and Deep Learning |class=cs.NE|eprint=2212.11279}}&lt;/ref&gt; The rectifier has become the most popular activation function for CNNs and [[deep learning|deep neural networks]] in general.&lt;ref&gt;{{cite arXiv |last1=Ramachandran |first1=Prajit |last2=Barret |first2=Zoph |last3=Quoc |first3=V. Le |date=October 16, 2017 |title=Searching for Activation Functions |eprint=1710.05941 |class=cs.NE}}&lt;/ref&gt;<br /> <br /> In a variant of the neocognitron called the cresceptron, instead of using Fukushima's spatial averaging, J. Weng et al. in 1993 introduced a method called max-pooling where a downsampling unit computes the maximum of the activations of the units in its patch.&lt;ref name=&quot;weng1993&quot;&gt;{{cite book |first1=J |last1=Weng |first2=N |last2=Ahuja |first3=TS |last3=Huang |title=1993 (4th) International Conference on Computer Vision |chapter=Learning recognition and segmentation of 3-D objects from 2-D images |s2cid=8619176 |journal=Proc. 4th International Conf. 
Computer Vision |year=1993 |pages=121–128 |doi=10.1109/ICCV.1993.378228 |isbn=0-8186-3870-2}}&lt;/ref&gt;{{clarify|date=April 2023}}&lt;!--a lower paragraph states that it was introduced in 1990--&gt; Max-pooling is often used in modern CNNs.&lt;ref name=&quot;schdeepscholar&quot;/&gt;<br /> <br /> Several supervised and unsupervised learning algorithms have been proposed over the decades to train the weights of a neocognitron.&lt;ref name=fukuneoscholar/&gt; Today, however, the CNN architecture is usually trained through [[backpropagation]].<br /> <br /> The [[neocognitron]] is the first CNN which requires units located at multiple network positions to have shared weights.<br /> <br /> Convolutional neural networks were presented at the Neural Information Processing Workshop in 1987, automatically analyzing time-varying signals by replacing learned multiplication with convolution in time, and demonstrated for speech recognition.&lt;ref&gt;{{cite journal |last=Homma |first=Toshiteru |author2=Les Atlas |author3=Robert Marks II |year=1988 |title=An Artificial Neural Network for Spatio-Temporal Bipolar Patters: Application to Phoneme Classification |url=https://proceedings.neurips.cc/paper/1987/file/98f13708210194c475687be6106a3b84-Paper.pdf |journal=Advances in Neural Information Processing Systems |volume=1 |pages=31–40 |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331211142/https://proceedings.neurips.cc/paper/1987/file/98f13708210194c475687be6106a3b84-Paper.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> === Time delay neural networks ===<br /> The [[time delay neural network]] (TDNN) was introduced in 1987 by [[Alex Waibel]] et al. and was one of the first convolutional networks, as it achieved shift invariance.&lt;ref name=Waibel1987&gt;{{cite conference |title=Phoneme Recognition Using Time-Delay Neural Networks |last1=Waibel |first1=Alex |date=December 1987 |location=Tokyo, Japan |conference=Meeting of the Institute of Electrical, Information and Communication Engineers (IEICE)}}&lt;/ref&gt; It did so by utilizing weight sharing in combination with [[backpropagation]] training.&lt;ref name=&quot;speechsignal&quot;&gt;[[Alex Waibel|Alexander Waibel]] et al., ''[http://www.inf.ufrgs.br/~engel/data/media/file/cmp121/waibel89_TDNN.pdf Phoneme Recognition Using Time-Delay Neural Networks] {{Webarchive|url=https://web.archive.org/web/20210225163001/http://www.inf.ufrgs.br/~engel/data/media/file/cmp121/waibel89_TDNN.pdf |date=2021-02-25 }}'' IEEE Transactions on Acoustics, Speech, and Signal Processing, Volume 37, No. 3, pp. 328. - 339 March 1989.&lt;/ref&gt; Thus, while also using a pyramidal structure as in the neocognitron, it performed a global optimization of the weights instead of a local one.&lt;ref name=Waibel1987/&gt;<br /> <br /> TDNNs are convolutional networks that share weights along the temporal dimension.&lt;ref&gt;{{cite encyclopedia |last1=LeCun |first1=Yann |last2=Bengio |first2=Yoshua |editor-last=Arbib |editor-first=Michael A. 
|title=Convolutional networks for images, speech, and time series |encyclopedia=The handbook of brain theory and neural networks |edition=Second |year=1995 |publisher=The MIT press |pages=276–278 |url=https://www.researchgate.net/publication/2453996 |access-date=2019-12-03 |archive-date=2020-07-28 |archive-url=https://web.archive.org/web/20200728164116/https://www.researchgate.net/publication/2453996_Convolutional_Networks_for_Images_Speech_and_Time-Series |url-status=live }}&lt;/ref&gt; They allow speech signals to be processed time-invariantly. In 1990 Hampshire and Waibel introduced a variant which performs a two dimensional convolution.&lt;ref name=&quot;Hampshire1990&quot;&gt;John B. Hampshire and Alexander Waibel, ''[https://proceedings.neurips.cc/paper/1989/file/979d472a84804b9f647bc185a877a8b5-Paper.pdf Connectionist Architectures for Multi-Speaker Phoneme Recognition] {{Webarchive|url=https://web.archive.org/web/20220331225059/https://proceedings.neurips.cc/paper/1989/file/979d472a84804b9f647bc185a877a8b5-Paper.pdf |date=2022-03-31 }}'', Advances in Neural Information Processing Systems, 1990, Morgan Kaufmann.&lt;/ref&gt; Since these TDNNs operated on spectrograms, the resulting phoneme recognition system was invariant to both shifts in time and in frequency. This inspired [[translation invariance]] in image processing with CNNs.&lt;ref name=&quot;speechsignal&quot;/&gt; The tiling of neuron outputs can cover timed stages.&lt;ref name=&quot;video quality&quot;/&gt;<br /> <br /> TDNNs now {{When|date=August 2022}} achieve the best performance in far distance speech recognition.&lt;ref name=Ko2017&gt;{{cite conference |title=A Study on Data Augmentation of Reverberant Speech for Robust Speech Recognition |last1=Ko |first1=Tom |last2=Peddinti |first2=Vijayaditya |last3=Povey |first3=Daniel |last4=Seltzer |first4=Michael L. |last5=Khudanpur |first5=Sanjeev |date=March 2018 |location=New Orleans, LA, USA |conference=The 42nd IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2017) |url=https://www.danielpovey.com/files/2017_icassp_reverberation.pdf |access-date=2019-09-04 |archive-date=2018-07-08 |archive-url=https://web.archive.org/web/20180708072725/http://danielpovey.com/files/2017_icassp_reverberation.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> ==== Max pooling ====<br /> In 1990 Yamaguchi et al. introduced the concept of max pooling, which is a fixed filtering operation that calculates and propagates the maximum value of a given region. They did so by combining TDNNs with max pooling in order to realize a speaker independent isolated word recognition system.&lt;ref name=&quot;Yamaguchi111990&quot;/&gt; In their system they used several TDNNs per word, one for each [[syllable]]. The results of each TDNN over the input signal were combined using max pooling and the outputs of the pooling layers were then passed on to networks performing the actual word classification.<br /> <br /> === Image recognition with CNNs trained by gradient descent ===<br /> A system to recognize hand-written [[ZIP Code]] numbers&lt;ref&gt;Denker, J S, Gardner, W R, Graf, H. 
P, Henderson, D, Howard, R E, Hubbard, W, Jackel, L D, BaIrd, H S, and Guyon (1989) [http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.852.5499&amp;rep=rep1&amp;type=pdf Neural network recognizer for hand-written zip code digits] {{Webarchive|url=https://web.archive.org/web/20180804013916/http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.852.5499&amp;rep=rep1&amp;type=pdf |date=2018-08-04 }}, AT&amp;T Bell Laboratories&lt;/ref&gt; involved convolutions in which the kernel coefficients had been laboriously hand designed.&lt;ref name=&quot;:2&quot;&gt;Y. LeCun, B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, L. D. Jackel, [http://yann.lecun.com/exdb/publis/pdf/lecun-89e.pdf Backpropagation Applied to Handwritten Zip Code Recognition] {{Webarchive|url=https://web.archive.org/web/20200110090230/http://yann.lecun.com/exdb/publis/pdf/lecun-89e.pdf |date=2020-01-10 }}; AT&amp;T Bell Laboratories&lt;/ref&gt;<br /> <br /> [[Yann LeCun]] et al. (1989)&lt;ref name=&quot;:2&quot;/&gt; used back-propagation to learn the convolution kernel coefficients directly from images of hand-written numbers. Learning was thus fully automatic, performed better than manual coefficient design, and was suited to a broader range of image recognition problems and image types.<br /> <br /> Wei Zhang et al. (1988)&lt;ref name=&quot;:0&quot;/&gt;&lt;ref name=&quot;:1&quot;/&gt; used back-propagation to train the convolution kernels of a CNN for alphabet recognition. The model was called Shift-Invariant Artificial Neural Network (SIANN) before the name CNN was coined in the early 1990s. Wei Zhang et al. also applied the same CNN without the last fully connected layer to medical image object segmentation (1991)&lt;ref name=&quot;:wz1991&quot;/&gt; and breast cancer detection in mammograms (1994).&lt;ref name=&quot;:wz1994&quot;/&gt; <br /> <br /> This approach became a foundation of modern [[computer vision]].<br /> <br /> ==== LeNet-5 ====<br /> {{Main|LeNet}}<br /> LeNet-5, a pioneering 7-level convolutional network by [[Yann LeCun|LeCun]] et al. in 1995,&lt;ref name=&quot;lecun95&quot;&gt;http://yann.lecun.com/exdb/publis/pdf/lecun-95a.pdf {{Webarchive|url=https://web.archive.org/web/20230502220356/http://yann.lecun.com/exdb/publis/pdf/lecun-95a.pdf |date=2023-05-02 }} {{bare URL PDF|date=May 2023}}&lt;/ref&gt; which classifies digits, was applied by several banks to recognize hand-written numbers on checks ({{Lang-en-GB|cheques}}) digitized in 32×32 pixel images. The ability to process higher-resolution images requires more and larger layers of convolutional neural networks, so this technique is constrained by the availability of computing resources.<br /> <br /> ===Shift-invariant neural network===<br /> <br /> A shift-invariant neural network was proposed by Wei Zhang et al. for image character recognition in 1988.&lt;ref name=&quot;:0&quot;/&gt;&lt;ref name=&quot;:1&quot;/&gt; It is a modified Neocognitron that keeps only the convolutional interconnections between the image feature layers and the last fully connected layer. The model was trained with back-propagation.
The training algorithm was further improved in 1991&lt;ref&gt;{{cite journal |last=Zhang |first=Wei |date=1991 |title=Error Back Propagation with Minimum-Entropy Weights: A Technique for Better Generalization of 2-D Shift-Invariant NNs |url=https://drive.google.com/file/d/0B65v6Wo67Tk5dkJTcEMtU2c5Znc/view?usp=sharing |journal=Proceedings of the International Joint Conference on Neural Networks |access-date=2016-09-22 |archive-date=2017-02-06 |archive-url=https://web.archive.org/web/20170206155801/https://drive.google.com/file/d/0B65v6Wo67Tk5dkJTcEMtU2c5Znc/view?usp=sharing |url-status=live }}&lt;/ref&gt; to increase its generalization ability. The model architecture was modified by removing the last fully connected layer and applied to medical image segmentation (1991)&lt;ref name=&quot;:wz1991&quot;&gt;{{cite journal |last=Zhang |first=Wei |date=1991 |title=Image processing of human corneal endothelium based on a learning network |url=https://drive.google.com/file/d/0B65v6Wo67Tk5cm5DTlNGd0NPUmM/view?usp=sharing |journal=Applied Optics |volume=30 |issue=29 |pages=4211–7 |doi=10.1364/AO.30.004211 |pmid=20706526 |bibcode=1991ApOpt..30.4211Z |access-date=2016-09-22 |archive-date=2017-02-06 |archive-url=https://web.archive.org/web/20170206122612/https://drive.google.com/file/d/0B65v6Wo67Tk5cm5DTlNGd0NPUmM/view?usp=sharing |url-status=live }}&lt;/ref&gt; and automatic detection of breast cancer in [[mammography|mammograms (1994)]].&lt;ref name=&quot;:wz1994&quot;&gt;{{cite journal |last=Zhang |first=Wei |date=1994 |title=Computerized detection of clustered microcalcifications in digital mammograms using a shift-invariant artificial neural network |url=https://drive.google.com/file/d/0B65v6Wo67Tk5Ml9qeW5nQ3poVTQ/view?usp=sharing |journal=Medical Physics |volume=21 |issue=4 |pages=517–24 |doi=10.1118/1.597177 |pmid=8058017 |bibcode=1994MedPh..21..517Z |access-date=2016-09-22 |archive-date=2017-02-06 |archive-url=https://web.archive.org/web/20170206030321/https://drive.google.com/file/d/0B65v6Wo67Tk5Ml9qeW5nQ3poVTQ/view?usp=sharing |url-status=live }}&lt;/ref&gt;<br /> <br /> A different convolution-based design was proposed in 1988&lt;ref&gt;Daniel Graupe, Ruey Wen Liu, George S Moschytz.&quot;[https://www.researchgate.net/profile/Daniel_Graupe2/publication/241130197_Applications_of_signal_and_image_processing_to_medicine/links/575eef7e08aec91374b42bd2.pdf Applications of neural networks to medical signal processing] {{Webarchive|url=https://web.archive.org/web/20200728164114/https://www.researchgate.net/profile/Daniel_Graupe2/publication/241130197_Applications_of_signal_and_image_processing_to_medicine/links/575eef7e08aec91374b42bd2.pdf |date=2020-07-28 }}&quot;. In Proc. 27th IEEE Decision and Control Conf., pp. 343–347, 1988.&lt;/ref&gt; for application to decomposition of one-dimensional [[electromyography]] convolved signals via de-convolution. This design was modified in 1989 to other de-convolution-based designs.&lt;ref&gt;Daniel Graupe, Boris Vern, G. Gruener, Aaron Field, and Qiu Huang. &quot;[https://ieeexplore.ieee.org/abstract/document/100522/ Decomposition of surface EMG signals into single fiber action potentials by means of neural network] {{Webarchive|url=https://web.archive.org/web/20190904161656/https://ieeexplore.ieee.org/abstract/document/100522/ |date=2019-09-04 }}&quot;. Proc. IEEE International Symp. on Circuits and Systems, pp.
1008–1011, 1989.&lt;/ref&gt;&lt;ref&gt;Qiu Huang, Daniel Graupe, Yi Fang Huang, Ruey Wen Liu.&quot;[http://www.academia.edu/download/42092095/graupe_huang_q_huang_yf_liu_rw_1989.pdf Identification of firing patterns of neuronal signals]{{dead link|date=July 2022|bot=medic}}{{cbignore|bot=medic}}.&quot; In Proc. 28th IEEE Decision and Control Conf., pp. 266–271, 1989. https://ieeexplore.ieee.org/document/70115 {{Webarchive|url=https://web.archive.org/web/20220331211138/https://ieeexplore.ieee.org/document/70115 |date=2022-03-31 }}&lt;/ref&gt;<br /> <br /> === Neural abstraction pyramid ===<br /> [[File:Neural Abstraction Pyramid.jpg|alt=Neural Abstraction Pyramid|thumb|Neural abstraction pyramid]]<br /> The feed-forward architecture of convolutional neural networks was extended in the neural abstraction pyramid&lt;ref&gt;{{cite book<br /> |last1=Behnke<br /> |first1=Sven<br /> |year=2003<br /> |title=Hierarchical Neural Networks for Image Interpretation<br /> |url=https://www.ais.uni-bonn.de/books/LNCS2766.pdf<br /> |series=Lecture Notes in Computer Science<br /> |volume=2766<br /> |publisher=Springer<br /> |doi=10.1007/b11963<br /> |isbn=978-3-540-40722-5<br /> |s2cid=1304548<br /> |access-date=2016-12-28<br /> |archive-date=2017-08-10<br /> |archive-url=https://web.archive.org/web/20170810020001/http://www.ais.uni-bonn.de/books/LNCS2766.pdf<br /> |url-status=live<br /> }}&lt;/ref&gt; by lateral and feedback connections. The resulting recurrent convolutional network allows for the flexible incorporation of contextual information to iteratively resolve local ambiguities. In contrast to previous models, image-like outputs at the highest resolution were generated, e.g., for semantic segmentation, image reconstruction, and object localization tasks.<br /> <br /> === GPU implementations ===<br /> Although CNNs were invented in the 1980s, their breakthrough in the 2000s required fast implementations on [[graphics processing unit]]s (GPUs).<br /> <br /> In 2004, it was shown by K. S. Oh and K. Jung that standard neural networks can be greatly accelerated on GPUs. Their implementation was 20 times faster than an equivalent implementation on [[CPU]].&lt;ref&gt;{{cite journal |last1=Oh |first1=KS |last2=Jung |first2=K |title=GPU implementation of neural networks. 
|journal=Pattern Recognition |date=2004 |volume=37 |issue=6 |pages=1311–1314 |doi=10.1016/j.patcog.2004.01.013 |bibcode=2004PatRe..37.1311O}}&lt;/ref&gt;&lt;ref name=&quot;schdeepscholar&quot;&gt;{{cite journal |last1=Schmidhuber |first1=Jürgen |title=Deep Learning |journal=Scholarpedia |url=http://www.scholarpedia.org/article/Deep_Learning |date=2015 |volume=10 |issue=11 |pages=1527–54 |pmid=16764513 |doi=10.1162/neco.2006.18.7.1527 |citeseerx=10.1.1.76.1541 |s2cid=2309950 |access-date=2019-01-20 |archive-date=2016-04-19 |archive-url=https://web.archive.org/web/20160419024349/http://www.scholarpedia.org/article/Deep_Learning |url-status=live }}&lt;/ref&gt; In 2005, another paper also emphasised the value of [[GPGPU]] for [[machine learning]].&lt;ref&gt;{{cite conference |author1=Dave Steinkraus |author2=Patrice Simard |author3=Ian Buck |title=12th International Conference on Document Analysis and Recognition (ICDAR 2005) |date=2005 |pages=1115–1119 |chapter-url=https://www.computer.org/csdl/proceedings-article/icdar/2005/24201115/12OmNylKAVX |chapter=Using GPUs for Machine Learning Algorithms |doi=10.1109/ICDAR.2005.251 |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331211138/https://www.computer.org/csdl/proceedings-article/icdar/2005/24201115/12OmNylKAVX |url-status=live }}&lt;/ref&gt;<br /> <br /> The first GPU-implementation of a CNN was described in 2006 by K. Chellapilla et al. Their implementation was 4 times faster than an equivalent implementation on CPU.&lt;ref&gt;{{cite book |author1=Kumar Chellapilla |author2=Sid Puri |author3=Patrice Simard |editor1-last=Lorette |editor1-first=Guy |title=Tenth International Workshop on Frontiers in Handwriting Recognition |date=2006 |publisher=Suvisoft |chapter-url=https://hal.inria.fr/inria-00112631/document |chapter=High Performance Convolutional Neural Networks for Document Processing |access-date=2016-03-14 |archive-date=2020-05-18 |archive-url=https://web.archive.org/web/20200518193413/https://hal.inria.fr/inria-00112631/document |url-status=live }}&lt;/ref&gt; Subsequent work also used GPUs, initially for other types of neural networks (different from CNNs), especially unsupervised neural networks.&lt;ref&gt;{{cite journal |last1=Hinton |first1=GE |last2=Osindero |first2=S |last3=Teh |first3=YW |title=A fast learning algorithm for deep belief nets. 
|journal=Neural Computation |date=Jul 2006 |volume=18 |issue=7 |pages=1527–54 |pmid=16764513 |doi=10.1162/neco.2006.18.7.1527 |citeseerx=10.1.1.76.1541 |s2cid=2309950}}&lt;/ref&gt;&lt;ref&gt;{{cite journal |last1=Bengio |first1=Yoshua |last2=Lamblin |first2=Pascal |last3=Popovici |first3=Dan |last4=Larochelle |first4=Hugo |title=Greedy Layer-Wise Training of Deep Networks |journal=Advances in Neural Information Processing Systems |date=2007 |pages=153–160 |url=https://proceedings.neurips.cc/paper/2006/file/5da713a690c067105aeb2fae32403405-Paper.pdf |access-date=2022-03-31 |archive-date=2022-06-02 |archive-url=https://web.archive.org/web/20220602144141/https://proceedings.neurips.cc/paper/2006/file/5da713a690c067105aeb2fae32403405-Paper.pdf |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{cite journal |last1=Ranzato |first1=MarcAurelio |last2=Poultney |first2=Christopher |last3=Chopra |first3=Sumit |last4=LeCun |first4=Yann |title=Efficient Learning of Sparse Representations with an Energy-Based Model |journal=Advances in Neural Information Processing Systems |date=2007 |url=http://yann.lecun.com/exdb/publis/pdf/ranzato-06.pdf |access-date=2014-06-26 |archive-date=2016-03-22 |archive-url=https://web.archive.org/web/20160322112400/http://yann.lecun.com/exdb/publis/pdf/ranzato-06.pdf |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{cite book |last1=Raina |first1=R |last2=Madhavan |first2=A |last3=Ng |first3=Andrew |title=Proceedings of the 26th Annual International Conference on Machine Learning |chapter=Large-scale deep unsupervised learning using graphics processors |journal=ICML |date=2009 |pages=873–880 |doi=10.1145/1553374.1553486 |isbn=9781605585161 |s2cid=392458 |chapter-url=http://robotics.stanford.edu/~ang/papers/icml09-LargeScaleUnsupervisedDeepLearningGPU.pdf |access-date=2019-09-04 |archive-date=2020-12-08 |archive-url=https://web.archive.org/web/20201208104513/http://robotics.stanford.edu/~ang/papers/icml09-LargeScaleUnsupervisedDeepLearningGPU.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> In 2010, Dan Ciresan et al. at [[IDSIA]] showed that even deep standard neural networks with many layers can be quickly trained on GPU by supervised learning through the old method known as [[backpropagation]]. Their network outperformed previous machine learning methods on the [[MNIST]] handwritten digits benchmark.&lt;ref&gt;{{cite journal |last1=Ciresan |first1=Dan |last2=Meier |first2=Ueli |last3=Gambardella |first3=Luca |last4=Schmidhuber |first4=Jürgen |title=Deep big simple neural nets for handwritten digit recognition. |journal=Neural Computation |date=2010 |volume=22 |issue=12 |pages=3207–3220 |doi=10.1162/NECO_a_00052 |pmid=20858131 |arxiv=1003.0358 |s2cid=1918673}}&lt;/ref&gt; In 2011, they extended this GPU approach to CNNs, achieving an acceleration factor of 60, with impressive results.&lt;ref name=&quot;flexible&quot;&gt;{{cite journal |last=Ciresan |first=Dan |author2=Ueli Meier |author3=Jonathan Masci |author4=Luca M. 
Gambardella |author5=Jurgen Schmidhuber |title=Flexible, High Performance Convolutional Neural Networks for Image Classification |journal=Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence-Volume Volume Two |year=2011 |volume=2 |pages=1237–1242 |url=https://people.idsia.ch/~juergen/ijcai2011.pdf |access-date=17 November 2013 |archive-date=5 April 2022 |archive-url=https://web.archive.org/web/20220405190128/https://people.idsia.ch/~juergen/ijcai2011.pdf |url-status=live }}&lt;/ref&gt; In 2011, they used such CNNs on GPU to win an image recognition contest where they achieved superhuman performance for the first time.&lt;ref&gt;{{cite web |url=https://benchmark.ini.rub.de/gtsrb_results.html |title=IJCNN 2011 Competition result table |website=OFFICIAL IJCNN2011 COMPETITION |language=en-US |access-date=2019-01-14 |date=2010 |archive-date=2021-01-17 |archive-url=https://web.archive.org/web/20210117024729/https://benchmark.ini.rub.de/gtsrb_results.html |url-status=live }}&lt;/ref&gt; Between May 15, 2011 and September 30, 2012, their CNNs won no less than four image competitions.&lt;ref&gt;{{cite web |url=https://people.idsia.ch/~juergen/computer-vision-contests-won-by-gpu-cnns.html |last1=Schmidhuber |first1=Jürgen |title=History of computer vision contests won by deep CNNs on GPU |language=en-US |access-date=14 January 2019 |date=17 March 2017 |archive-date=19 December 2018 |archive-url=https://web.archive.org/web/20181219224934/http://people.idsia.ch/~juergen/computer-vision-contests-won-by-gpu-cnns.html |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;schdeepscholar&quot;/&gt; In 2012, they also significantly improved on the best performance in the literature for multiple image [[database]]s, including the [[MNIST database]], the NORB database, the HWDB1.0 dataset (Chinese characters) and the [[CIFAR-10|CIFAR10 dataset]] (dataset of 60000 32x32 labeled [[RGB images]]).&lt;ref name=&quot;mcdns&quot;/&gt;<br /> <br /> Subsequently, a similar GPU-based CNN by Alex Krizhevsky et al. 
won the [[ImageNet Large Scale Visual Recognition Challenge]] 2012.&lt;ref name=&quot;:02&quot;/&gt; A very deep CNN with over 100 layers by Microsoft won the ImageNet 2015 contest.&lt;ref&gt;{{cite book |last1=He |first1=Kaiming |last2=Zhang |first2=Xiangyu |last3=Ren |first3=Shaoqing |last4=Sun |first4=Jian |title=2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) |chapter=Deep Residual Learning for Image Recognition |pages=770–778 |date=2016 |chapter-url=https://openaccess.thecvf.com/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf |doi=10.1109/CVPR.2016.90 |arxiv=1512.03385 |isbn=978-1-4673-8851-1 |s2cid=206594692 |access-date=2022-03-31 |archive-date=2022-04-05 |archive-url=https://web.archive.org/web/20220405165303/https://openaccess.thecvf.com/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> === Intel Xeon Phi implementations ===<br /> Compared to the training of CNNs using [[GPU]]s, not much attention was given to the [[Intel Xeon Phi]] [[coprocessor]].&lt;ref&gt;{{cite conference<br /> |last1=Viebke<br /> |first1=Andre<br /> |last2=Pllana<br /> |first2=Sabri<br /> |title=2015 IEEE 17th International Conference on High Performance Computing and Communications, 2015 IEEE 7th International Symposium on Cyberspace Safety and Security, and 2015 IEEE 12th International Conference on Embedded Software and Systems<br /> |chapter=The Potential of the Intel (R) Xeon Phi for Supervised Deep Learning<br /> |pages=758–765<br /> |website=IEEE Xplore<br /> |publisher=IEEE 2015<br /> |doi=10.1109/HPCC-CSS-ICESS.2015.45<br /> |isbn=978-1-4799-8937-9<br /> |year=2015<br /> |s2cid=15411954<br /> |chapter-url=http://lnu.diva-portal.org/smash/record.jsf?pid=diva2%3A877421&amp;dswid=4277<br /> |access-date=2022-03-31<br /> |archive-date=2023-03-06<br /> |archive-url=https://web.archive.org/web/20230306003530/http://lnu.diva-portal.org/smash/record.jsf?pid=diva2:877421&amp;dswid=4277<br /> |url-status=live<br /> }}&lt;/ref&gt;<br /> A notable development is a parallelization method for training convolutional neural networks on the Intel Xeon Phi, named Controlled Hogwild with Arbitrary Order of Synchronization (CHAOS).&lt;ref&gt;<br /> {{cite journal<br /> |last1=Viebke<br /> |first1=Andre<br /> |last2=Memeti<br /> |first2=Suejb<br /> |last3=Pllana<br /> |first3=Sabri<br /> |last4=Abraham<br /> |first4=Ajith<br /> |title=CHAOS: a parallelization scheme for training convolutional neural networks on Intel Xeon Phi<br /> |journal=The Journal of Supercomputing<br /> |date=2019<br /> |volume=75<br /> |issue=1<br /> |pages=197–227<br /> |doi=10.1007/s11227-017-1994-x<br /> |arxiv=1702.07908<br /> |s2cid=14135321<br /> }}<br /> &lt;/ref&gt;<br /> CHAOS exploits both the thread- and [[SIMD]]-level parallelism that is available on the Intel Xeon Phi.<br /> <br /> == Distinguishing features ==<br /> In the past, traditional [[multilayer perceptron]] (MLP) models were used for image recognition.{{Example needed|date=October 2017}} However, the full connectivity between nodes caused the [[curse of dimensionality]], and was computationally intractable with higher-resolution images. 
A 1000×1000-pixel image with [[RGB color model|RGB color]] channels has 3 million weights per fully-connected neuron, far too many to process efficiently at scale.<br /> [[File:Conv layers.png|left|thumb|237x237px|CNN layers arranged in 3 dimensions]]<br /> For example, in [[CIFAR-10]], images are only of size 32×32×3 (32 wide, 32 high, 3 color channels), so a single fully connected neuron in the first hidden layer of a regular neural network would have 32*32*3 = 3,072 weights. A 200×200 image, however, would lead to neurons that have 200*200*3 = 120,000 weights.<br /> <br /> Also, such a network architecture does not take into account the spatial structure of data, treating input pixels which are far apart in the same way as pixels that are close together. This ignores [[locality of reference]] in data with a grid-topology (such as images), both computationally and semantically. Thus, full connectivity of neurons is wasteful for purposes such as image recognition that are dominated by [[spatial locality|spatially local]] input patterns.<br /> <br /> Convolutional neural networks are variants of multilayer perceptrons, designed to emulate the behavior of a [[visual cortex]]. These models mitigate the challenges posed by the MLP architecture by exploiting the strong spatially local correlation present in natural images. As opposed to MLPs, CNNs have the following distinguishing features:<br /> * 3D volumes of neurons. The layers of a CNN have neurons arranged in [[three-dimensional space|3 dimensions]]: width, height and depth.&lt;ref&gt;{{cite journal |last=Hinton |first=Geoffrey |date=2012 |title=ImageNet Classification with Deep Convolutional Neural Networks |url=https://dl.acm.org/doi/10.5555/2999134.2999257 |journal=NIPS'12: Proceedings of the 25th International Conference on Neural Information Processing Systems - Volume 1 |volume=1 |pages=1097–1105 |via=ACM |access-date=2021-03-26 |archive-date=2019-12-20 |archive-url=https://web.archive.org/web/20191220014019/https://dl.acm.org/citation.cfm?id=2999134.2999257 |url-status=live }}&lt;/ref&gt; Each neuron inside a convolutional layer is connected to only a small region of the layer before it, called a receptive field. Distinct types of layers, both locally and completely connected, are stacked to form a CNN architecture.<br /> * Local connectivity: following the concept of receptive fields, CNNs exploit spatial locality by enforcing a local connectivity pattern between neurons of adjacent layers. The architecture thus ensures that the learned &quot;[[filter (signal processing)|filters]]&quot; produce the strongest response to a spatially local input pattern. Stacking many such layers leads to [[nonlinear filter]]s that become increasingly global (i.e. responsive to a larger region of pixel space) so that the network first creates representations of small parts of the input, then from them assembles representations of larger areas.<br /> * Shared weights: In CNNs, each filter is replicated across the entire visual field. These replicated units share the same parameterization (weight vector and bias) and form a feature map. This means that all the neurons in a given convolutional layer respond to the same feature within their specific receptive field. Replicating units in this way allows for the resulting activation map to be [[equivariant map|equivariant]] under shifts of the locations of input features in the visual field, i.e.
they grant translational [[equivariant map|equivariance]] - given that the layer has a stride of one.&lt;ref name=&quot;:5&quot;/&gt;<br /> * Pooling: In a CNN's pooling layers, feature maps are divided into rectangular sub-regions, and the features in each rectangle are independently down-sampled to a single value, commonly by taking their average or maximum value. In addition to reducing the sizes of feature maps, the pooling operation grants a degree of local [[translational symmetry|translational invariance]] to the features contained therein, allowing the CNN to be more robust to variations in their positions.&lt;ref name=&quot;:6&quot;/&gt;<br /> <br /> Together, these properties allow CNNs to achieve better generalization on [[computer vision|vision problems]]. Weight sharing dramatically reduces the number of [[free parameter]]s learned, thus lowering the memory requirements for running the network and allowing the training of larger, more powerful networks.<br /> <br /> == Building blocks ==<br /> {{More citations needed section|date=June 2017}}<br /> <br /> A CNN architecture is formed by a stack of distinct layers that transform the input volume into an output volume (e.g. holding the class scores) through a differentiable function. A few distinct types of layers are commonly used. These are further discussed below.[[File:Conv layer.png|left|thumb|Neurons of a convolutional layer (blue), connected to their receptive field (red)|229x229px]]<br /> <br /> === Convolutional layer ===<br /> The convolutional layer is the core building block of a CNN. The layer's parameters consist of a set of learnable [[filter (signal processing)|filters]] (or [[kernel (image processing)|kernels]]), which have a small receptive field, but extend through the full depth of the input volume. During the forward pass, each filter is [[convolution|convolved]] across the width and height of the input volume, computing the [[dot product]] between the filter entries and the input, producing a 2-dimensional [[activation function|activation map]] of that filter. As a result, the network learns filters that activate when it detects some specific type of [[feature (machine learning)|feature]] at some spatial position in the input.&lt;ref name=&quot;Géron Hands-on ML 2019&quot;&gt;{{cite book<br /> |last1=Géron<br /> |first1=Aurélien<br /> |title=Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow<br /> |date=2019<br /> |publisher=O'Reilly Media<br /> |location=Sebastopol, CA<br /> |isbn=978-1-492-03264-9<br /> }}, pp. 448&lt;/ref&gt;&lt;ref group=&quot;nb&quot;&gt;When applied to other types of data than image data, such as sound data, &quot;spatial position&quot; may variously correspond to different points in the [[time domain]], [[frequency domain]], or other [[space (mathematics)|mathematical spaces]].&lt;/ref&gt;<br /> <br /> Stacking the activation maps for all filters along the depth dimension forms the full output volume of the convolution layer. Every entry in the output volume can thus also be interpreted as an output of a neuron that looks at a small region in the input. 
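<br /> <br /> The following minimal [[NumPy]] sketch illustrates this forward pass for a single filter, computed as a cross-correlation with stride and zero padding; the function and variable names are illustrative and do not correspond to any particular library.<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;<br /> 
import numpy as np<br /> 
<br /> 
def conv2d_single_filter(image, kernel, stride=1, padding=0):<br /> 
    # Cross-correlate one square filter over a single-channel 2-D input.<br /> 
    if padding:<br /> 
        image = np.pad(image, padding)     # zero padding on every border<br /> 
    H, W = image.shape<br /> 
    K = kernel.shape[0]                    # assumes a square K-by-K kernel<br /> 
    out_h = (H - K) // stride + 1          # output size (W - K + 2P)/S + 1; padding already included in H, W<br /> 
    out_w = (W - K) // stride + 1<br /> 
    activation_map = np.zeros((out_h, out_w))<br /> 
    for i in range(out_h):<br /> 
        for j in range(out_w):<br /> 
            patch = image[i*stride:i*stride + K, j*stride:j*stride + K]<br /> 
            activation_map[i, j] = np.sum(patch * kernel)   # dot product with the shared filter weights<br /> 
    return activation_map<br /> 
&lt;/syntaxhighlight&gt;<br /> 
Applying one such filter per output channel and stacking the resulting activation maps along the depth dimension yields the full output volume described above.<br /> 
<br /> 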
Each entry in an activation map uses the same set of parameters that define the filter.<br /> <br /> [[Self-supervised learning]] has been adapted for use in convolutional layers by using sparse patches with a high-mask ratio and a global response normalization layer.&lt;ref&gt;{{Cite web |last=Raschka |first=Sebastian |title=Ahead of AI #5: RevAIval of Ideas |url=https://magazine.sebastianraschka.com/p/ahead-of-ai-5-revaival-of-ideas |access-date=2023-02-07 |website=magazine.sebastianraschka.com |language=en |archive-date=2023-02-07 |archive-url=https://web.archive.org/web/20230207003859/https://magazine.sebastianraschka.com/p/ahead-of-ai-5-revaival-of-ideas |url-status=live }}&lt;/ref&gt;<br /> <br /> ==== Local connectivity ====<br /> [[File:Typical cnn.png|thumb|395x395px|Typical CNN architecture]]<br /> <br /> When dealing with high-dimensional inputs such as images, it is impractical to connect neurons to all neurons in the previous volume because such a network architecture does not take the spatial structure of the data into account. Convolutional networks exploit spatially local correlation by enforcing a [[sparse network|sparse local connectivity]] pattern between neurons of adjacent layers: each neuron is connected to only a small region of the input volume.<br /> <br /> The extent of this connectivity is a [[hyperparameter optimization|hyperparameter]] called the [[receptive field]] of the neuron. The connections are [[spatial locality|local in space]] (along width and height), but always extend along the entire depth of the input volume. Such an architecture ensures that the learned ({{Lang-en-GB|learnt}}) filters produce the strongest response to a spatially local input pattern.<br /> <br /> ==== Spatial arrangement ====<br /> <br /> Three [[hyperparameter (machine learning)|hyperparameters]] control the size of the output volume of the convolutional layer: the depth, [[stride of an array|stride]], and padding size:<br /> * The ''&lt;u&gt;depth&lt;/u&gt;'' of the output volume controls the number of neurons in a layer that connect to the same region of the input volume. These neurons learn to activate for different features in the input. For example, if the first convolutional layer takes the raw image as input, then different neurons along the depth dimension may activate in the presence of various oriented edges, or blobs of color.<br /> *&lt;u&gt;''Stride''&lt;/u&gt; controls how depth columns along the width and height are allocated. If the stride is 1, then we move the filters one pixel at a time. This leads to heavily [[intersection (set theory)|overlapping]] receptive fields between the columns, and to large output volumes. For any integer &lt;math display=&quot;inline&quot;&gt;S &gt; 0,&lt;/math&gt; a stride ''S'' means that the filter is translated ''S'' units at a time per output. In practice, &lt;math display=&quot;inline&quot;&gt;S \geq 3&lt;/math&gt; is rare.
A greater stride means smaller overlap of receptive fields and smaller spatial dimensions of the output volume.&lt;ref&gt;{{cite web |url=https://cs231n.github.io/convolutional-networks/ |title=CS231n Convolutional Neural Networks for Visual Recognition |website=cs231n.github.io |access-date=2017-04-25 |archive-date=2019-10-23 |archive-url=https://web.archive.org/web/20191023031945/https://cs231n.github.io/convolutional-networks/ |url-status=live }}&lt;/ref&gt;<br /> * Sometimes, it is convenient to pad the input with zeros (or other values, such as the average of the region) on the border of the input volume. The size of this padding is a third hyperparameter. Padding provides control of the output volume's spatial size. In particular, sometimes it is desirable to exactly preserve the spatial size of the input volume; this is commonly referred to as &quot;same&quot; padding.<br /> <br /> The spatial size of the output volume is a function of the input volume size &lt;math&gt;W&lt;/math&gt;, the kernel field size &lt;math&gt;K&lt;/math&gt; of the convolutional layer neurons, the stride &lt;math&gt;S&lt;/math&gt;, and the amount of zero padding &lt;math&gt;P&lt;/math&gt; on the border. The number of neurons that &quot;fit&quot; in a given volume is then:<br /> :&lt;math display=&quot;block&quot;&gt;\frac{W-K+2P}{S} + 1.&lt;/math&gt;<br /> <br /> If this number is not an [[integer]], then the strides are incorrect and the neurons cannot be tiled to fit across the input volume in a [[symmetry|symmetric]] way. In general, setting zero padding to be &lt;math display=&quot;inline&quot;&gt;P = (K-1)/2&lt;/math&gt; when the stride is &lt;math&gt;S=1&lt;/math&gt; ensures that the input volume and output volume will have the same size spatially. However, it is not always completely necessary to use all of the neurons of the previous layer. For example, a neural network designer may decide to use just a portion of padding.<br /> <br /> ==== Parameter sharing ====<br /> A parameter sharing scheme is used in convolutional layers to control the number of free parameters. It relies on the assumption that if a patch feature is useful to compute at some spatial position, then it should also be useful to compute at other positions. Denoting a single 2-dimensional slice of depth as a ''depth slice'', the neurons in each depth slice are constrained to use the same weights and bias.<br /> <br /> Since all neurons in a single depth slice share the same parameters, the forward pass in each depth slice of the convolutional layer can be computed as a [[convolution]] of the neuron's weights with the input volume.&lt;ref group=&quot;nb&quot;&gt;hence the name &quot;convolutional layer&quot;&lt;/ref&gt; Therefore, it is common to refer to the sets of weights as a filter (or a [[kernel (image processing)|kernel]]), which is convolved with the input. The result of this convolution is an [[activation function|activation map]], and the activation maps for each different filter are stacked together along the depth dimension to produce the output volume. Parameter sharing contributes to the [[translational symmetry|translation invariance]] of the CNN architecture.&lt;ref name=&quot;:6&quot;/&gt;<br /> <br /> Sometimes, the parameter sharing assumption may not make sense. This is especially the case when the input images to a CNN have some specific centered structure, for which we expect completely different features to be learned at different spatial locations.
One practical example is when the inputs are faces that have been centered in the image: we might expect different eye-specific or hair-specific features to be learned in different parts of the image. In that case it is common to relax the parameter sharing scheme, and instead simply call the layer a &quot;locally connected layer&quot;.<br /> <br /> === Pooling layer ===<br /> [[File:Max pooling.png|thumb|314x314px|Max pooling with a 2x2 filter and stride = 2]]<br /> Another important concept of CNNs is pooling, which is a form of non-linear [[downsampling (signal processing)|down-sampling]]. There are several non-linear functions to implement pooling, where ''max pooling'' is the most common. It [[partition of a set|partitions]] the input image into a set of rectangles and, for each such sub-region, outputs the maximum.<br /> <br /> Intuitively, the exact location of a feature is less important than its rough location relative to other features. This is the idea behind the use of pooling in convolutional neural networks. The pooling layer serves to progressively reduce the spatial size of the representation, to reduce the number of parameters, [[memory footprint]] and amount of computation in the network, and hence to also control [[overfitting]]. This is known as down-sampling. It is common to periodically insert a pooling layer between successive convolutional layers (each one typically followed by an activation function, such as a [[#ReLU layer|ReLU layer]]) in a CNN architecture.&lt;ref name=&quot;Géron Hands-on ML 2019&quot;/&gt;{{rp|460–461}} While pooling layers contribute to local translation invariance, they do not provide global translation invariance in a CNN, unless a form of global pooling is used.&lt;ref name=&quot;:6&quot;/&gt;&lt;ref name=&quot;:5&quot;&gt;{{cite journal |last1=Azulay |first1=Aharon |last2=Weiss |first2=Yair |date=2019 |title=Why do deep convolutional networks generalize so poorly to small image transformations? |url=https://jmlr.org/papers/v20/19-519.html |journal=Journal of Machine Learning Research |volume=20 |issue=184 |pages=1–25 |issn=1533-7928 |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331211138/https://jmlr.org/papers/v20/19-519.html |url-status=live }}&lt;/ref&gt; The pooling layer commonly operates independently on every depth, or slice, of the input and resizes it spatially. A very common form of max pooling is a layer with filters of size 2×2, applied with a stride of 2, which subsamples every depth slice in the input by 2 along both width and height, discarding 75% of the activations:&lt;math display=&quot;block&quot;&gt;f_{X,Y}(S)=\max_{a,b=0}^1S_{2X+a,2Y+b}.&lt;/math&gt;<br /> In this case, every [[maximum|max operation]] is over 4 numbers. The depth dimension remains unchanged (this is true for other forms of pooling as well).<br /> <br /> In addition to max pooling, pooling units can use other functions, such as [[average]] pooling or [[Euclidean norm|ℓ&lt;sub&gt;2&lt;/sub&gt;-norm]] pooling. 
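<br /> <br /> For illustration, a minimal NumPy sketch of the 2×2, stride-2 max pooling described above, applied to one depth slice (the function and variable names are illustrative):<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;<br /> 
import numpy as np<br /> 
<br /> 
def max_pool_2x2(feature_map):<br /> 
    # Downsample one depth slice by taking the maximum over non-overlapping 2-by-2 windows.<br /> 
    H, W = feature_map.shape<br /> 
    H2, W2 = H // 2, W // 2                                  # output spatial size (assumes even H and W)<br /> 
    windows = feature_map[:H2 * 2, :W2 * 2].reshape(H2, 2, W2, 2)<br /> 
    return windows.max(axis=(1, 3))                          # each output entry is the maximum of 4 inputs<br /> 
&lt;/syntaxhighlight&gt;<br /> 
<br /> 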
Average pooling was often used historically but has recently fallen out of favor compared to max pooling, which generally performs better in practice.&lt;ref name=&quot;Scherer-ICANN-2010&quot;&gt;{{cite conference<br /> |url=http://ais.uni-bonn.de/papers/icann2010_maxpool.pdf<br /> |title=Evaluation of Pooling Operations in Convolutional Architectures for Object Recognition<br /> |last1=Scherer<br /> |first1=Dominik<br /> |last2=Müller<br /> |first2=Andreas C.<br /> |last3=Behnke<br /> |first3=Sven<br /> |year=2010<br /> |publisher=Springer<br /> |book-title=Artificial Neural Networks (ICANN), 20th International Conference on<br /> |pages=92–101<br /> |location=Thessaloniki, Greece<br /> |access-date=2016-12-28<br /> |archive-date=2018-04-03<br /> |archive-url=https://web.archive.org/web/20180403185041/http://ais.uni-bonn.de/papers/icann2010_maxpool.pdf<br /> |url-status=live<br /> }}&lt;/ref&gt;<br /> <br /> Due to the rapid spatial reduction of the size of the representation,{{Which|date=December 2018}} there is a recent trend towards using smaller filters&lt;ref&gt;{{cite arXiv |title=Fractional Max-Pooling |eprint=1412.6071 |date=2014-12-18 |first=Benjamin |last=Graham |class=cs.CV}}&lt;/ref&gt; or discarding pooling layers altogether.&lt;ref&gt;{{cite arXiv |title=Striving for Simplicity: The All Convolutional Net |eprint=1412.6806 |date=2014-12-21 |first1=Jost Tobias |last1=Springenberg |first2=Alexey |last2=Dosovitskiy |first3=Thomas |last3=Brox |first4=Martin |last4=Riedmiller |class=cs.LG}}&lt;/ref&gt;<br /> <br /> [[File:RoI pooling animated.gif|thumb|400x300px|RoI pooling to size 2x2. In this example, the region proposal (an input parameter) has size 7x5.]]<br /> &quot;[[Region of interest|Region of Interest]]&quot; pooling (also known as RoI pooling) is a variant of max pooling, in which the output size is fixed and the input rectangle is a parameter.&lt;ref&gt;{{cite web<br /> |last=Grel<br /> |first=Tomasz<br /> |title=Region of interest pooling explained<br /> |website=deepsense.io<br /> |date=2017-02-28<br /> |url=https://deepsense.io/region-of-interest-pooling-explained/<br /> |access-date=5 April 2017<br /> |language=en<br /> |archive-date=2017-06-02<br /> |archive-url=https://web.archive.org/web/20170602070519/https://deepsense.io/region-of-interest-pooling-explained/<br /> |url-status=dead<br /> }}&lt;/ref&gt;<br /> <br /> Pooling is a downsampling method and an important component of convolutional neural networks for [[object detection]] based on the Fast R-CNN&lt;ref name=&quot;rcnn&quot;&gt;{{cite arXiv<br /> |title=Fast R-CNN<br /> |eprint=1504.08083<br /> |date=2015-09-27<br /> |first=Ross<br /> |last=Girshick<br /> |class=cs.CV}}&lt;/ref&gt; architecture. <br /> === Channel Max Pooling ===<br /> A channel max pooling (CMP) layer applies the max pooling (MP) operation along the channel dimension, over corresponding positions of consecutive feature maps, in order to eliminate redundant information. CMP gathers the significant features into fewer channels, which is important for fine-grained image classification tasks that require highly discriminative features. Another advantage of the CMP operation is that it reduces the number of feature-map channels before the first fully connected (FC) layer.
Similar to the MP operation, we denote the input feature maps and output feature maps of a CMP layer as F ∈ R(C×M×N) and C ∈ R(c×M×N), respectively, where C and c are the channel numbers of the input and output feature maps, M and N are the widths and the height of the feature maps, respectively. Note that the CMP operation only changes the channel number of the feature maps. The width and the height of the feature maps are not changed, which is different from the MP operation.&lt;ref name=&quot;Ma Chang Xie Ding 2019 pp. 3224–3233&quot;&gt;{{cite journal |last1=Ma |first1=Zhanyu |last2=Chang |first2=Dongliang |last3=Xie |first3=Jiyang |last4=Ding |first4=Yifeng |last5=Wen |first5=Shaoguo |last6=Li |first6=Xiaoxu |last7=Si |first7=Zhongwei |last8=Guo |first8=Jun |title=Fine-Grained Vehicle Classification With Channel Max Pooling Modified CNNs |journal=IEEE Transactions on Vehicular Technology |publisher=Institute of Electrical and Electronics Engineers (IEEE) |volume=68 |issue=4 |year=2019 |issn=0018-9545 |doi=10.1109/tvt.2019.2899972 |pages=3224–3233|s2cid=86674074 }}&lt;/ref&gt;<br /> <br /> === ReLU layer ===<br /> ReLU is the abbreviation of [[rectifier (neural networks)|rectified linear unit]] introduced by [[Kunihiko Fukushima]] in 1969.&lt;ref name=&quot;Fukushima1969&quot;/&gt;&lt;ref name=DLhistory/&gt; ReLU applies the non-saturating [[activation function]] &lt;math alt=&quot;function of x equals maximum between zero and x&quot; display=&quot;inline&quot;&gt;f(x)=\max(0,x)&lt;/math&gt;.&lt;ref name=&quot;:02&quot;&gt;{{cite journal |last1=Krizhevsky |first1=Alex |last2=Sutskever |first2=Ilya |last3=Hinton |first3=Geoffrey E. |date=2017-05-24 |title=ImageNet classification with deep convolutional neural networks |url=https://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf |journal=Communications of the ACM |volume=60 |issue=6 |pages=84–90 |doi=10.1145/3065386 |s2cid=195908774 |issn=0001-0782 |access-date=2018-12-04 |archive-date=2017-05-16 |archive-url=https://web.archive.org/web/20170516174757/http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf |url-status=live }}&lt;/ref&gt; It effectively removes negative values from an activation map by setting them to zero.&lt;ref name=&quot;Romanuke4&quot;&gt;{{cite journal |last1=Romanuke |first1=Vadim |title=Appropriate number and allocation of ReLUs in convolutional neural networks |journal=Research Bulletin of NTUU &quot;Kyiv Polytechnic Institute&quot; |date=2017 |volume=1 |issue=1 |pages=69–78 |doi=10.20535/1810-0546.2017.1.88156 |doi-access=free}}&lt;/ref&gt; It introduces [[Nonlinearity_(disambiguation)|nonlinearity]] to the [[decision boundary|decision function]] and in the overall network without affecting the receptive fields of the convolution layers.<br /> In 2011, Xavier Glorot, Antoine Bordes and [[Yoshua Bengio]] found that ReLU enables better training of deeper networks,&lt;ref name=&quot;glorot2011&quot;&gt;{{cite conference |author1=Xavier Glorot |author2=Antoine Bordes |author3=[[Yoshua Bengio]] |year=2011 |title=Deep sparse rectifier neural networks |url=http://jmlr.org/proceedings/papers/v15/glorot11a/glorot11a.pdf |conference=AISTATS |quote=Rectifier and softplus activation functions. The second one is a smooth version of the first. 
|access-date=2023-04-10 |archive-date=2016-12-13 |archive-url=https://web.archive.org/web/20161213022121/http://jmlr.org/proceedings/papers/v15/glorot11a/glorot11a.pdf |url-status=dead }}&lt;/ref&gt; compared to widely used activation functions prior to 2011.<br /> <br /> Other functions can also be used to increase nonlinearity, for example the saturating [[hyperbolic tangent]] &lt;math alt=&quot;function of x equals hyperbolic tangent of x&quot;&gt;f(x)=\tanh(x)&lt;/math&gt;, &lt;math alt=&quot;function of x equals absolute value of the hyperbolic tangent of x&quot;&gt;f(x)=|\tanh(x)|&lt;/math&gt;, and the [[sigmoid function]] &lt;math alt=&quot;function of x equals the inverse of one plus e to the power of minus x&quot; display=&quot;inline&quot;&gt;\sigma(x)=(1+e^{-x} )^{-1}&lt;/math&gt;. ReLU is often preferred to other functions because it trains the neural network several times faster without a significant penalty to [[generalization (learning)|generalization]] accuracy.&lt;ref&gt;{{cite journal |last=Krizhevsky |first=A. |author2=Sutskever, I. |author3=Hinton, G. E. |title=Imagenet classification with deep convolutional neural networks |journal=Advances in Neural Information Processing Systems |volume=1 |year=2012 |pages=1097–1105 |url=https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331224736/https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> === Fully connected layer ===<br /> After several convolutional and max pooling layers, the final classification is done via fully connected layers. Neurons in a fully connected layer have connections to all activations in the previous layer, as seen in regular (non-convolutional) [[artificial neural network]]s. Their activations can thus be computed as an [[affine transformation]], with [[matrix multiplication]] followed by a bias offset ([[vector addition]] of a learned or fixed bias term).<br /> <br /> === Loss layer ===<br /> {{Main|Loss function|Loss functions for classification}}<br /> The &quot;loss layer&quot;, or &quot;[[loss function]]&quot;, specifies how [[training]] penalizes the deviation between the predicted output of the network, and the [[ground truth|true]] data labels (during supervised learning). Various [[loss function]]s can be used, depending on the specific task.<br /> <br /> The [[Softmax function|Softmax]] loss function is used for predicting a single class of ''K'' mutually exclusive classes.&lt;ref group=&quot;nb&quot;&gt;So-called [[categorical data]].&lt;/ref&gt; [[Sigmoid function|Sigmoid]] [[cross entropy|cross-entropy]] loss is used for predicting ''K'' independent probability values in &lt;math&gt;[0,1]&lt;/math&gt;. [[Euclidean distance|Euclidean]] loss is used for [[regression (machine learning)|regressing]] to [[real number|real-valued]] labels &lt;math&gt;(-\infty,\infty)&lt;/math&gt;.<br /> <br /> == Hyperparameters ==<br /> {{More citations needed section|date=June 2017}}<br /> Hyperparameters are various settings that are used to control the learning process. CNNs use more [[hyperparameter (machine learning)|hyperparameters]] than a standard multilayer perceptron (MLP).<br /> <br /> === Kernel size ===<br /> The kernel is the number of pixels processed together. 
It is typically expressed as the kernel's dimensions, e.g., 2x2, or 3x3.<br /> <br /> === Padding ===<br /> Padding is the addition of (typically) 0-valued pixels on the borders of an image. This is done so that the border pixels are not undervalued (lost) from the output because they would ordinarily participate in only a single receptive field instance. The padding applied is typically one less than the corresponding kernel dimension. For example, a convolutional layer using 3x3 kernels would receive a 2-pixel pad, that is 1 pixel on each side of the image.&lt;ref&gt;{{cite web |title=6.3. Padding and Stride — Dive into Deep Learning 0.17.0 documentation |url=https://d2l.ai/chapter_convolutional-neural-networks/padding-and-strides.html |access-date=2021-08-12 |website=d2l.ai |archive-date=2021-08-12 |archive-url=https://web.archive.org/web/20210812202649/https://d2l.ai/chapter_convolutional-neural-networks/padding-and-strides.html |url-status=live }}&lt;/ref&gt;<br /> <br /> === Stride ===<br /> The stride is the number of pixels that the analysis window moves on each iteration. A stride of 2 means that each kernel is offset by 2 pixels from its predecessor.<br /> <br /> === Number of filters ===<br /> Since feature map size decreases with depth, layers near the input layer tend to have fewer filters while higher layers can have more. To equalize computation at each layer, the product of feature values ''v&lt;sub&gt;a&lt;/sub&gt;'' with pixel position is kept roughly constant across layers. Preserving more information about the input would require keeping the total number of activations (number of feature maps times number of pixel positions) non-decreasing from one layer to the next.<br /> <br /> The number of feature maps directly controls the capacity and depends on the number of available examples and task complexity.<br /> <br /> === Filter size ===<br /> Common filter sizes found in the literature vary greatly, and are usually chosen based on the data set.<br /> <br /> The challenge is to find the right level of granularity so as to create abstractions at the proper scale, given a particular data set, and without [[overfitting]].<br /> <br /> === Pooling type and size ===<br /> <br /> [[Max pooling]] is typically used, often with a 2x2 dimension. This implies that the input is drastically [[downsampling (signal processing)|downsampled]], reducing processing cost.<br /> <br /> Large input volumes may warrant 4×4 pooling in the lower layers.&lt;ref&gt;{{cite web |url=https://adeshpande3.github.io/adeshpande3.github.io/The-9-Deep-Learning-Papers-You-Need-To-Know-About.html |title=The 9 Deep Learning Papers You Need To Know About (Understanding CNNs Part 3) |last=Deshpande |first=Adit |website=adeshpande3.github.io |access-date=2018-12-04 |archive-date=2018-11-21 |archive-url=https://web.archive.org/web/20181121185730/https://adeshpande3.github.io/adeshpande3.github.io/The-9-Deep-Learning-Papers-You-Need-To-Know-About.html |url-status=live }}&lt;/ref&gt; Greater pooling [[dimensionality reduction|reduces the dimension]] of the signal, and may result in unacceptable [[data loss|information loss]]. Often, non-overlapping pooling windows perform best.&lt;ref name=&quot;Scherer-ICANN-2010&quot;/&gt;<br /> <br /> === Dilation ===<br /> Dilation involves ignoring pixels within a kernel. This reduces processing/memory potentially without significant signal loss. A dilation of 2 on a 3x3 kernel expands the kernel to 5x5, while still processing 9 (evenly spaced) pixels. 
Accordingly, a dilation of 4 expands the kernel to 9x9.&lt;ref&gt;{{Cite web |last=Pröve |first=Paul-Louis |date=2018-02-07 |title=An Introduction to different Types of Convolutions in Deep Learning |url=https://towardsdatascience.com/types-of-convolutions-in-deep-learning-717013397f4d |access-date=2022-07-27 |website=Medium |language=en |archive-date=2022-07-27 |archive-url=https://web.archive.org/web/20220727225642/https://towardsdatascience.com/types-of-convolutions-in-deep-learning-717013397f4d |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{cite news |last=Seo |first=Jae Duk |date=2018-03-12 |title=Understanding 2D Dilated Convolution Operation with Examples in Numpy and Tensorflow with… |url=https://towardsdatascience.com/understanding-2d-dilated-convolution-operation-with-examples-in-numpy-and-tensorflow-with-d376b3972b25 |access-date=2021-08-12 |website=Medium |language=en |archive-date=2021-11-06 |archive-url=https://web.archive.org/web/20211106134140/https://towardsdatascience.com/understanding-2d-dilated-convolution-operation-with-examples-in-numpy-and-tensorflow-with-d376b3972b25 |url-status=live }}&lt;/ref&gt;<br /> <br /> == Translation equivariance and aliasing ==<br /> It is commonly assumed that CNNs are invariant to shifts of the input. Convolution or pooling layers within a CNN that do not have a stride greater than one are indeed [[equivariant map|equivariant]] to translations of the input.&lt;ref name=&quot;:5&quot;/&gt; However, layers with a stride greater than one ignore the [[Nyquist–Shannon sampling theorem|Nyquist-Shannon sampling theorem]] and might lead to [[aliasing]] of the input signal.&lt;ref name=&quot;:5&quot;/&gt; While, in principle, CNNs are capable of implementing anti-aliasing filters, it has been observed that this does not happen in practice,&lt;ref&gt;{{cite book |last=Ribeiro,Schon |first=Antonio,Thomas |title=ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) |chapter=How Convolutional Neural Networks Deal with Aliasing |date=2021 |pages=2755–2759 |arxiv=2102.07757 |doi=10.1109/ICASSP39728.2021.9414627|isbn=978-1-7281-7605-5 |s2cid=231925012 }}&lt;/ref&gt; yielding models that are not equivariant to translations.<br /> Furthermore, if a CNN makes use of fully connected layers, translation equivariance does not imply translation invariance, as the fully connected layers are not invariant to shifts of the input.&lt;ref&gt;{{cite book |last1=Myburgh |first1=Johannes C. |last2=Mouton |first2=Coenraad |last3=Davel |first3=Marelie H.
|title=Artificial Intelligence Research |chapter=Tracking Translation Invariance in CNNS |date=2020 |editor-last=Gerber |editor-first=Aurona |chapter-url=https://link.springer.com/chapter/10.1007%2F978-3-030-66151-9_18 |series=Communications in Computer and Information Science |volume=1342 |language=en |location=Cham |publisher=Springer International Publishing |pages=282–295 |doi=10.1007/978-3-030-66151-9_18 |arxiv=2104.05997 |isbn=978-3-030-66151-9 |s2cid=233219976 |access-date=2021-03-26 |archive-date=2022-01-22 |archive-url=https://web.archive.org/web/20220122015258/http://link.springer.com/chapter/10.1007/978-3-030-66151-9_18 |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;:6&quot;/&gt; One solution for complete translation invariance is avoiding any down-sampling throughout the network and applying global average pooling at the last layer.&lt;ref name=&quot;:5&quot;/&gt; Additionally, several other partial solutions have been proposed, such as [[anti-aliasing filter|anti-aliasing]] before downsampling operations,&lt;ref&gt;{{cite book |last=Richard |first=Zhang |url=https://www.worldcat.org/oclc/1106340711 |title=Making Convolutional Networks Shift-Invariant Again |date=2019-04-25 |oclc=1106340711}}&lt;/ref&gt; spatial transformer networks,&lt;ref&gt;{{cite journal |last=Jadeberg, Simonyan, Zisserman, Kavukcuoglu |first=Max, Karen, Andrew, Koray |date=2015 |title=Spatial Transformer Networks |url=https://proceedings.neurips.cc/paper/2015/file/33ceb07bf4eeb3da587e268d663aba1a-Paper.pdf |journal=Advances in Neural Information Processing Systems |volume=28 |via=NIPS |access-date=2021-03-26 |archive-date=2021-07-25 |archive-url=https://web.archive.org/web/20210725115312/https://proceedings.neurips.cc/paper/2015/file/33ceb07bf4eeb3da587e268d663aba1a-Paper.pdf |url-status=live }}&lt;/ref&gt; [[data augmentation]], subsampling combined with pooling,&lt;ref name=&quot;:6&quot;/&gt; and [[capsule neural network]]s.&lt;ref&gt;{{cite book |last=E |first=Sabour, Sara Frosst, Nicholas Hinton, Geoffrey |url=https://worldcat.org/oclc/1106278545 |title=Dynamic Routing Between Capsules |date=2017-10-26 |oclc=1106278545}}&lt;/ref&gt;<br /> <br /> == Evaluation ==<br /> The accuracy of the final model is based on a sub-part of the dataset set apart at the start, often called a test-set. Other times methods such as [[cross-validation (statistics)|''k''-fold cross-validation]] are applied. Other strategies include using [[conformal prediction]].&lt;ref&gt;{{cite journal |date=2019-06-01 |title=Inductive conformal predictor for convolutional neural networks: Applications to active learning for image classification |url=https://www.sciencedirect.com/science/article/abs/pii/S003132031930055X |journal=Pattern Recognition |language=en |volume=90 |pages=172–182 |doi=10.1016/j.patcog.2019.01.035 |issn=0031-3203 |last1=Matiz |first1=Sergio |last2=Barner |first2=Kenneth E. |bibcode=2019PatRe..90..172M |s2cid=127253432 |access-date=2021-09-29 |archive-date=2021-09-29 |archive-url=https://web.archive.org/web/20210929092610/https://www.sciencedirect.com/science/article/abs/pii/S003132031930055X |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{cite journal |last1=Wieslander |first1=Håkan |last2=Harrison |first2=Philip J. 
|last3=Skogberg |first3=Gabriel |last4=Jackson |first4=Sonya |last5=Fridén |first5=Markus |last6=Karlsson |first6=Johan |last7=Spjuth |first7=Ola |last8=Wählby |first8=Carolina |date=February 2021 |title=Deep Learning With Conformal Prediction for Hierarchical Analysis of Large-Scale Whole-Slide Tissue Images |url=https://ieeexplore.ieee.org/document/9103229 |journal=IEEE Journal of Biomedical and Health Informatics |volume=25 |issue=2 |pages=371–380 |doi=10.1109/JBHI.2020.2996300 |pmid=32750907 |s2cid=219885788 |issn=2168-2208 |access-date=2022-01-29 |archive-date=2022-01-20 |archive-url=https://web.archive.org/web/20220120141410/https://ieeexplore.ieee.org/document/9103229/ |url-status=live }}&lt;/ref&gt;<br /> <br /> == Regularization methods ==<br /> {{Main|Regularization (mathematics)}}<br /> {{More citations needed section|date=June 2017}}<br /> [[Regularization (mathematics)|Regularization]] is a process of introducing additional information to solve an [[ill-posed problem]] or to prevent [[overfitting]]. CNNs use various types of regularization.<br /> <br /> === Empirical ===<br /> <br /> ==== Dropout ====<br /> Because a fully connected layer occupies most of the parameters, it is prone to overfitting. One method to reduce overfitting is [[dropout (neural networks)|dropout]], introduced in 2014.&lt;ref&gt;{{cite journal |last=Srivastava |first=Nitish |author2=C. Geoffrey Hinton |author3=Alex Krizhevsky |author4=Ilya Sutskever |author5=Ruslan Salakhutdinov |title=Dropout: A Simple Way to Prevent Neural Networks from overfitting |journal=Journal of Machine Learning Research |year=2014 |volume=15 |issue=1 |pages=1929–1958 |url=http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf |access-date=2015-01-03 |archive-date=2016-01-19 |archive-url=https://web.archive.org/web/20160119155849/http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;DLPATTERNS&quot;&gt;{{cite web |title=A Pattern Language for Deep Learning |author=Carlos E. Perez |url=http://www.deeplearningpatterns.com/ |access-date=2016-06-15 |archive-date=2017-06-03 |archive-url=https://web.archive.org/web/20170603205959/http://deeplearningpatterns.com/ |url-status=live }}&lt;/ref&gt; At each training stage, individual nodes are either &quot;dropped out&quot; of the net (ignored) with probability &lt;math&gt;1-p&lt;/math&gt; or kept with probability &lt;math&gt;p&lt;/math&gt;, so that a reduced network is left; incoming and outgoing edges to a dropped-out node are also removed. Only the reduced network is trained on the data in that stage. The removed nodes are then reinserted into the network with their original weights.<br /> <br /> In the training stages, &lt;math&gt;p&lt;/math&gt; is usually 0.5; for input nodes, it is typically much higher because information is directly lost when input nodes are ignored.<br /> <br /> At testing time after training has finished, we would ideally like to find a sample average of all possible &lt;math&gt;2^n&lt;/math&gt; dropped-out networks; unfortunately this is unfeasible for large values of &lt;math&gt;n&lt;/math&gt;. However, we can find an approximation by using the full network with each node's output weighted by a factor of &lt;math&gt;p&lt;/math&gt;, so the [[expected value]] of the output of any node is the same as in the training stages. 
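<br /> <br /> A minimal NumPy sketch of this scheme for a single fully connected layer, dropping units with probability &lt;math&gt;1-p&lt;/math&gt; during training and scaling outputs by &lt;math&gt;p&lt;/math&gt; at test time (names are illustrative; the activation function is omitted for brevity):<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;<br /> 
import numpy as np<br /> 
<br /> 
def dense_with_dropout(x, weights, bias, p=0.5, training=True):<br /> 
    # Fully connected layer whose output units are dropped out during training.<br /> 
    out = x @ weights + bias<br /> 
    if training:<br /> 
        keep_mask = np.random.rand(*out.shape) &lt; p   # keep each unit with probability p<br /> 
        return out * keep_mask                       # dropped units output zero<br /> 
    return out * p                                   # test time: scale so the expected output matches training<br /> 
&lt;/syntaxhighlight&gt;<br /> 
<br /> 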
This test-time approximation is a key contribution of the dropout method: although it effectively generates &lt;math&gt;2^n&lt;/math&gt; neural nets, and as such allows for model combination, at test time only a single network needs to be evaluated.<br /> <br /> By avoiding training all nodes on all training data, dropout decreases overfitting. The method also significantly improves training speed. This makes the model combination practical, even for [[deep neural network]]s. The technique seems to reduce node interactions, leading them to learn more robust features{{Clarify|reason=|date=December 2018}} that better generalize to new data.<br /> <br /> ==== DropConnect ====<br /> <br /> DropConnect is a generalization of dropout in which each connection, rather than each output unit, can be dropped with probability &lt;math&gt;1-p&lt;/math&gt;. Each unit thus receives input from a random subset of units in the previous layer.&lt;ref&gt;{{cite journal |title=Regularization of Neural Networks using DropConnect {{!}} ICML 2013 {{!}} JMLR W&amp;CP |pages=1058–1066 |url=http://proceedings.mlr.press/v28/wan13.html |website=jmlr.org |access-date=2015-12-17 |date=2013-02-13 |archive-date=2017-08-12 |archive-url=https://web.archive.org/web/20170812080411/http://proceedings.mlr.press/v28/wan13.html |url-status=live }}&lt;/ref&gt;<br /> <br /> DropConnect is similar to dropout as it introduces dynamic sparsity within the model, but differs in that the sparsity is on the weights, rather than the output vectors of a layer. In other words, the fully connected layer with DropConnect becomes a sparsely connected layer in which the connections are chosen at random during the training stage.<br /> <br /> ==== Stochastic pooling ====<br /> A major drawback of dropout is that it does not have the same benefits for convolutional layers, where the neurons are not fully connected.<br /> <br /> In stochastic pooling, proposed in 2013,&lt;ref&gt;{{cite arXiv |title=Stochastic Pooling for Regularization of Deep Convolutional Neural Networks |eprint=1301.3557 |date=2013-01-15 |first1=Matthew D. |last1=Zeiler |first2=Rob |last2=Fergus |class=cs.LG}}&lt;/ref&gt; the conventional [[deterministic algorithm|deterministic]] pooling operations are replaced with a stochastic procedure, where the activation within each pooling region is picked randomly according to a [[multinomial distribution]] given by the activities within the pooling region. This approach is free of hyperparameters and can be combined with other regularization approaches, such as dropout and [[data augmentation]].<br /> <br /> An alternate view of stochastic pooling is that it is equivalent to standard max pooling but with many copies of an input image, each having small local [[deformation theory|deformations]]. This is similar to explicit [[elastic deformation]]s of the input images,&lt;ref name=&quot;:3&quot;/&gt; which deliver excellent performance on the [[MNIST database|MNIST data set]].&lt;ref name=&quot;:3&quot;&gt;{{cite journal |title=Best Practices for Convolutional Neural Networks Applied to Visual Document Analysis – Microsoft Research |url=https://www.microsoft.com/en-us/research/publication/best-practices-for-convolutional-neural-networks-applied-to-visual-document-analysis/?from=http%3A%2F%2Fresearch.microsoft.com%2Fapps%2Fpubs%2F%3Fid%3D68920 |journal=Microsoft Research |access-date=2015-12-17 |date=August 2003 |last1=Platt |first1=John |last2=Steinkraus |first2=Dave |last3=Simard |first3=Patrice Y.
|archive-date=2017-11-07 |archive-url=https://web.archive.org/web/20171107112839/https://www.microsoft.com/en-us/research/publication/best-practices-for-convolutional-neural-networks-applied-to-visual-document-analysis/?from=http%3A%2F%2Fresearch.microsoft.com%2Fapps%2Fpubs%2F%3Fid%3D68920 |url-status=live }}&lt;/ref&gt; Using stochastic pooling in a multilayer model gives an exponential number of deformations since the selections in higher layers are independent of those below.<br /> <br /> ==== Artificial data ====<br /> {{Main|Data augmentation}}<br /> Because the degree of model overfitting is determined by both its power and the amount of training it receives, providing a convolutional network with more training examples can reduce overfitting. Because there is often not enough available data to train, especially considering that some part should be spared for later testing, two approaches are to either generate new data from scratch (if possible) or perturb the existing data to create new examples. The latter approach has been used since the mid-1990s.&lt;ref name=&quot;lecun95&quot; /&gt; For example, input images can be cropped, rotated, or rescaled to create new examples with the same labels as the original training set.&lt;ref&gt;{{cite arXiv |title=Improving neural networks by preventing co-adaptation of feature detectors |eprint=1207.0580 |last1=Hinton |first1=Geoffrey E. |last2=Srivastava |first2=Nitish |last3=Krizhevsky |first3=Alex |last4=Sutskever |first4=Ilya |last5=Salakhutdinov |first5=Ruslan R. |class=cs.NE |year=2012}}&lt;/ref&gt;<br /> <br /> === Explicit ===<br /> <br /> ==== Early stopping ====<br /> {{Main|Early stopping}}<br /> One of the simplest methods to prevent overfitting of a network is to simply stop the training before overfitting has had a chance to occur. It comes with the disadvantage that the learning process is halted.<br /> <br /> ==== Number of parameters ====<br /> Another simple way to prevent overfitting is to limit the number of parameters, typically by limiting the number of hidden units in each layer or limiting network depth. For convolutional networks, the filter size also affects the number of parameters. Limiting the number of parameters restricts the predictive power of the network directly, reducing the complexity of the function that it can perform on the data, and thus limits the amount of overfitting. This is equivalent to a &quot;[[zero norm]]&quot;.<br /> <br /> ==== Weight decay ====<br /> A simple form of added regularizer is weight decay, which simply adds an additional error, proportional to the sum of weights ([[L1-norm|L1 norm]]) or squared magnitude ([[L2 norm]]) of the weight vector, to the error at each node. The level of acceptable model complexity can be reduced by increasing the proportionality constant (the 'alpha' hyperparameter), thus increasing the penalty for large weight vectors.<br /> <br /> L2 regularization is the most common form of regularization. It can be implemented by penalizing the squared magnitude of all parameters directly in the objective. L2 regularization has the intuitive interpretation of heavily penalizing peaky weight vectors and preferring diffuse weight vectors. Due to multiplicative interactions between weights and inputs, this has the useful property of encouraging the network to use all of its inputs a little rather than some of its inputs a lot.<br /> <br /> L1 regularization is also common. It makes the weight vectors sparse during optimization.
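<br /> <br /> A minimal NumPy sketch of adding such a penalty term to the loss, with alpha as the proportionality constant (the function and variable names are illustrative):<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;<br /> 
import numpy as np<br /> 
<br /> 
def penalized_loss(data_loss, weights, alpha=1e-4, norm=2):<br /> 
    # Add a weight-decay penalty to the data loss.<br /> 
    if norm == 1:<br /> 
        penalty = np.sum(np.abs(weights))   # L1: sum of absolute weights, encourages sparse weight vectors<br /> 
    else:<br /> 
        penalty = np.sum(weights ** 2)      # L2: squared magnitude, prefers diffuse weight vectors<br /> 
    return data_loss + alpha * penalty<br /> 
&lt;/syntaxhighlight&gt;<br /> 
<br /> 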
Neurons trained with L1 regularization end up using only a sparse subset of their most important inputs and become nearly invariant to noisy inputs. L1 regularization can be combined with L2 regularization; this is called [[elastic net regularization]].<br /> <br /> ==== Max norm constraints ====<br /> Another form of regularization is to enforce an absolute upper bound on the magnitude of the weight vector for every neuron and use [[sparse approximation#Projected Gradient Descent|projected gradient descent]] to enforce the constraint. In practice, this corresponds to performing the parameter update as normal, and then enforcing the constraint by clamping the weight vector &lt;math&gt;\vec{w}&lt;/math&gt; of every neuron to satisfy &lt;math&gt;\|\vec{w}\|_{2}&lt;c&lt;/math&gt;. Typical values of &lt;math&gt;c&lt;/math&gt; are on the order of 3–4. Some papers report improvements&lt;ref&gt;{{cite web |title=Dropout: A Simple Way to Prevent Neural Networks from Overfitting |url=https://jmlr.org/papers/v15/srivastava14a.html |website=jmlr.org |access-date=2015-12-17 |archive-date=2016-03-05 |archive-url=https://web.archive.org/web/20160305010425/http://jmlr.org/papers/v15/srivastava14a.html |url-status=live }}&lt;/ref&gt; when using this form of regularization.<br /> <br /> == Hierarchical coordinate frames ==<br /> Pooling loses the precise spatial relationships between high-level parts (such as nose and mouth in a face image). These relationships are needed for identity recognition. Overlapping the pools so that each feature occurs in multiple pools helps retain the information. Translation alone cannot extrapolate the understanding of geometric relationships to a radically new viewpoint, such as a different orientation or scale. On the other hand, people are very good at extrapolating; after seeing a new shape once they can recognize it from a different viewpoint.&lt;ref&gt;{{cite journal |last1=Hinton |first1=Geoffrey |year=1979 |title=Some demonstrations of the effects of structural descriptions in mental imagery |journal=Cognitive Science |volume=3 |issue=3 |pages=231–250 |doi=10.1016/s0364-0213(79)80008-7}}&lt;/ref&gt;<br /> <br /> An earlier common way to deal with this problem is to train the network on transformed data in different orientations, scales, lighting, etc. so that the network can cope with these variations. This is computationally intensive for large data-sets. The alternative is to use a hierarchy of coordinate frames and use a group of neurons to represent a conjunction of the shape of the feature and its pose relative to the [[retina]]. The pose relative to the retina is the relationship between the coordinate frame of the retina and the intrinsic features' coordinate frame.&lt;ref&gt;Rock, Irvin. &quot;The frame of reference.&quot; The legacy of Solomon Asch: Essays in cognition and social psychology (1990): 243–268.&lt;/ref&gt;<br /> <br /> Thus, one way to represent something is to embed the coordinate frame within it. This allows large features to be recognized by using the consistency of the poses of their parts (e.g. nose and mouth poses make a consistent prediction of the pose of the whole face). This approach ensures that the higher-level entity (e.g. face) is present when the lower-level entities (e.g. nose and mouth) agree in their predictions of its pose.
The vectors of neuronal activity that represent pose (&quot;pose vectors&quot;) allow spatial transformations modeled as linear operations that make it easier for the network to learn the hierarchy of visual entities and generalize across viewpoints. This is similar to the way the human [[visual system]] imposes coordinate frames in order to represent shapes.&lt;ref&gt;J. Hinton, Coursera lectures on Neural Networks, 2012, Url: https://www.coursera.org/learn/neural-networks {{Webarchive|url=https://web.archive.org/web/20161231174321/https://www.coursera.org/learn/neural-networks |date=2016-12-31}}&lt;/ref&gt;<br /> <br /> == Applications ==<br /> <br /> === Image recognition ===<br /> CNNs are often used in [[image recognition]] systems. In 2012, an [[per-comparison error rate|error rate]] of 0.23% on the [[MNIST database]] was reported.&lt;ref name=&quot;mcdns&quot;/&gt; Another paper on using CNN for image classification reported that the learning process was &quot;surprisingly fast&quot;; in the same paper, the best published results as of 2011 were achieved in the MNIST database and the NORB database.&lt;ref name=&quot;flexible&quot;/&gt; Subsequently, a similar CNN called<br /> [[AlexNet]]&lt;ref name=quartz&gt;{{cite web<br /> |website=[[Quartz (website)|Quartz]]<br /> |author=Dave Gershgorn<br /> |title=The inside story of how AI got good enough to dominate Silicon Valley<br /> |url=https://qz.com/1307091/the-inside-story-of-how-ai-got-good-enough-to-dominate-silicon-valley/<br /> |date=18 June 2018<br /> |access-date=5 October 2018<br /> |archive-date=12 December 2019<br /> |archive-url=https://web.archive.org/web/20191212224842/https://qz.com/1307091/the-inside-story-of-how-ai-got-good-enough-to-dominate-silicon-valley/<br /> |url-status=live<br /> }}&lt;/ref&gt; won the [[ImageNet Large Scale Visual Recognition Challenge]] 2012.<br /> <br /> When applied to [[facial recognition system|facial recognition]], CNNs achieved a large decrease in error rate.&lt;ref&gt;{{cite journal |last=Lawrence |first=Steve |author2=C. Lee Giles |author3=Ah Chung Tsoi |author4=Andrew D. 
Back |title=Face Recognition: A Convolutional Neural Network Approach |journal=IEEE Transactions on Neural Networks |year=1997 |volume=8 |issue=1 |pages=98–113 |citeseerx=10.1.1.92.5813 |doi=10.1109/72.554195 |pmid=18255614|s2cid=2883848 }}&lt;/ref&gt; Another paper reported a 97.6% recognition rate on &quot;5,600 still images of more than 10 subjects&quot;.&lt;ref name=&quot;robust face detection&quot;/&gt; CNNs were used to assess [[video quality]] in an objective way after manual training; the resulting system had a very low [[root mean square error]].&lt;ref name=&quot;video quality&quot;&gt;{{cite journal |last=Le Callet |first=Patrick |author2=Christian Viard-Gaudin |author3=Dominique Barba |year=2006 |title=A Convolutional Neural Network Approach for Objective Video Quality Assessment |url=https://hal.archives-ouvertes.fr/file/index/docid/287426/filename/A_convolutional_neural_network_approach_for_objective_video_quality_assessment_completefinal_manuscript.pdf |journal=IEEE Transactions on Neural Networks |volume=17 |issue=5 |pages=1316–1327 |doi=10.1109/TNN.2006.879766 |pmid=17001990 |s2cid=221185563 |access-date=17 November 2013 |archive-date=24 February 2021 |archive-url=https://web.archive.org/web/20210224123804/https://hal.archives-ouvertes.fr/file/index/docid/287426/filename/A_convolutional_neural_network_approach_for_objective_video_quality_assessment_completefinal_manuscript.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> The [[ImageNet Large Scale Visual Recognition Challenge]] is a benchmark in object classification and detection, with millions of images and hundreds of object classes. In the ILSVRC 2014,&lt;ref name=&quot;ILSVRC2014&quot;&gt;{{cite web |url=https://image-net.org/challenges/LSVRC/2014/results |title=ImageNet Large Scale Visual Recognition Competition 2014 (ILSVRC2014) |access-date=30 January 2016 |archive-date=5 February 2016 |archive-url=https://web.archive.org/web/20160205153105/http://www.image-net.org/challenges/LSVRC/2014/results |url-status=live }}&lt;/ref&gt; a large-scale visual recognition challenge, almost every highly ranked team used CNN as their basic framework. The winner [[GoogLeNet]]&lt;ref name=googlenet&gt;{{cite conference<br /> | last1 = Szegedy | first1 = Christian<br /> | last2 = Liu | first2 = Wei<br /> | last3 = Jia | first3 = Yangqing<br /> | last4 = Sermanet | first4 = Pierre<br /> | last5 = Reed | first5 = Scott E.<br /> | last6 = Anguelov | first6 = Dragomir<br /> | last7 = Erhan | first7 = Dumitru<br /> | last8 = Vanhoucke | first8 = Vincent<br /> | last9 = Rabinovich | first9 = Andrew<br /> | arxiv = 1409.4842<br /> | contribution = Going deeper with convolutions<br /> | doi = 10.1109/CVPR.2015.7298594<br /> | pages = 1–9<br /> | publisher = IEEE Computer Society<br /> | title = IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2015, Boston, MA, USA, June 7–12, 2015<br /> | year = 2015}}&lt;/ref&gt; (the foundation of [[DeepDream]]) increased the mean average [[precision and recall|precision]] of object detection to 0.439329, and reduced classification error to 0.06656, the best result to date. Its network applied more than 30 layers. 
That performance of convolutional neural networks on the ImageNet tests was close to that of humans.&lt;ref&gt;{{cite arXiv |eprint=1409.0575 |last1=Russakovsky |first1=Olga |title=Image ''Net'' Large Scale Visual Recognition Challenge |last2=Deng |first2=Jia |last3=Su |first3=Hao |last4=Krause |first4=Jonathan |last5=Satheesh |first5=Sanjeev |last6=Ma |first6=Sean |last7=Huang |first7=Zhiheng |last8=Karpathy |first8=Andrej |author-link8=Andrej Karpathy |last9=Khosla |first9=Aditya |last10=Bernstein |first10=Michael |last11=Berg |first11=Alexander C. |last12=Fei-Fei |first12=Li |class=cs.CV |year=2014 |author1-link=Olga Russakovsky}}&lt;/ref&gt; The best algorithms still struggle with objects that are small or thin, such as a small ant on a stem of a flower or a person holding a quill in their hand. They also have trouble with images that have been distorted with filters, an increasingly common phenomenon with modern digital cameras. By contrast, those kinds of images rarely trouble humans. Humans, however, tend to have trouble with other issues. For example, they are not good at classifying objects into fine-grained categories such as the particular breed of dog or species of bird, whereas convolutional neural networks handle this.{{citation needed|date=June 2019}}<br /> <br /> In 2015, a many-layered CNN demonstrated the ability to spot faces from a wide range of angles, including upside down, even when partially occluded, with competitive performance. The network was trained on a database of 200,000 images that included faces at various angles and orientations and a further 20 million images without faces. They used batches of 128 images over 50,000 iterations.&lt;ref&gt;{{cite news |url=https://www.technologyreview.com/2015/02/16/169357/the-face-detection-algorithm-set-to-revolutionize-image-search/ |title=The Face Detection Algorithm Set To Revolutionize Image Search |date=February 16, 2015 |work=Technology Review |access-date=27 October 2017 |archive-date=20 September 2020 |archive-url=https://web.archive.org/web/20200920130711/https://www.technologyreview.com/2015/02/16/169357/the-face-detection-algorithm-set-to-revolutionize-image-search/ |url-status=live }}&lt;/ref&gt;<br /> <br /> === Video analysis ===<br /> Compared to image data domains, there is relatively little work on applying CNNs to video classification. Video is more complex than images since it has another (temporal) dimension. However, some extensions of CNNs into the video domain have been explored. 
One approach is to treat space and time as equivalent dimensions of the input and perform convolutions in both time and space.&lt;ref&gt;{{cite book |publisher=Springer Berlin Heidelberg |date=2011-11-16 |isbn=978-3-642-25445-1 |pages=29–39 |series=Lecture Notes in Computer Science |first1=Moez |last1=Baccouche |first2=Franck |last2=Mamalet |first3=Christian |last3=Wolf |first4=Christophe |last4=Garcia |first5=Atilla |last5=Baskurt |editor-first=Albert Ali |editor-last=Salah |editor-first2=Bruno |editor-last2=Lepri |doi=10.1007/978-3-642-25446-8_4 |chapter=Sequential Deep Learning for Human Action Recognition |title=Human Behavior Unterstanding |volume=7065 |citeseerx=10.1.1.385.4740}}&lt;/ref&gt;&lt;ref&gt;{{cite journal |title=3D Convolutional Neural Networks for Human Action Recognition |journal=IEEE Transactions on Pattern Analysis and Machine Intelligence |date=2013-01-01 |issn=0162-8828 |pages=221–231 |volume=35 |issue=1 |doi=10.1109/TPAMI.2012.59 |pmid=22392705 |first1=Shuiwang |last1=Ji |first2=Wei |last2=Xu |first3=Ming |last3=Yang |first4=Kai |last4=Yu |citeseerx=10.1.1.169.4046 |s2cid=1923924}}&lt;/ref&gt; Another way is to fuse the features of two convolutional neural networks, one for the spatial and one for the temporal stream.&lt;ref&gt;{{cite arXiv |last1=Huang |first1=Jie |last2=Zhou |first2=Wengang |last3=Zhang |first3=Qilin |last4=Li |first4=Houqiang |last5=Li |first5=Weiping |title=Video-based Sign Language Recognition without Temporal Segmentation |eprint=1801.10111 |class=cs.CV |year=2018}}&lt;/ref&gt;&lt;ref&gt;Karpathy, Andrej, et al. &quot;[https://www.cv-foundation.org/openaccess/content_cvpr_2014/papers/Karpathy_Large-scale_Video_Classification_2014_CVPR_paper.pdf Large-scale video classification with convolutional neural networks] {{Webarchive|url=https://web.archive.org/web/20190806022753/https://www.cv-foundation.org/openaccess/content_cvpr_2014/papers/Karpathy_Large-scale_Video_Classification_2014_CVPR_paper.pdf |date=2019-08-06 }}.&quot; IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2014.&lt;/ref&gt;&lt;ref&gt;{{cite arXiv |eprint=1406.2199 |last1=Simonyan |first1=Karen |title=Two-Stream Convolutional Networks for Action Recognition in Videos |last2=Zisserman |first2=Andrew |class=cs.CV |year=2014}} (2014).&lt;/ref&gt; [[Long short-term memory]] (LSTM) [[recurrent neural network|recurrent]] units are typically incorporated after the CNN to account for inter-frame or inter-clip dependencies.&lt;ref name=&quot;Wang Duan Zhang Niu p=1657&quot;&gt;{{cite journal |last1=Wang |first1=Le |last2=Duan |first2=Xuhuan |last3=Zhang |first3=Qilin |last4=Niu |first4=Zhenxing |last5=Hua |first5=Gang |last6=Zheng |first6=Nanning |title=Segment-Tube: Spatio-Temporal Action Localization in Untrimmed Videos with Per-Frame Segmentation |journal=Sensors |volume=18 |issue=5 |date=2018-05-22 |issn=1424-8220 |doi=10.3390/s18051657 |pmid=29789447 |pmc=5982167 |page=1657 |bibcode=2018Senso..18.1657W |url=https://qilin-zhang.github.io/_pages/pdfs/Segment-Tube_Spatio-Temporal_Action_Localization_in_Untrimmed_Videos_with_Per-Frame_Segmentation.pdf |doi-access=free |access-date=2018-09-14 |archive-date=2021-03-01 |archive-url=https://web.archive.org/web/20210301195518/https://qilin-zhang.github.io/_pages/pdfs/Segment-Tube_Spatio-Temporal_Action_Localization_in_Untrimmed_Videos_with_Per-Frame_Segmentation.pdf |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;Duan Wang Zhai Zheng 2018 p. 
&quot;&gt;{{cite conference |last1=Duan |first1=Xuhuan |last2=Wang |first2=Le |last3=Zhai |first3=Changbo |last4=Zheng |first4=Nanning |last5=Zhang |first5=Qilin |last6=Niu |first6=Zhenxing |last7=Hua |first7=Gang |title=2018 25th IEEE International Conference on Image Processing (ICIP) |chapter=Joint Spatio-Temporal Action Localization in Untrimmed Videos with Per-Frame Segmentation |publisher=25th IEEE International Conference on Image Processing (ICIP) |year=2018 |pages=918–922 |isbn=978-1-4799-7061-2 |doi=10.1109/icip.2018.8451692}}&lt;/ref&gt; [[Unsupervised learning]] schemes for training spatio-temporal features have been introduced, based on Convolutional Gated Restricted [[Boltzmann machine|Boltzmann Machines]]&lt;ref&gt;{{cite conference |title=Convolutional Learning of Spatio-temporal Features |url=https://dl.acm.org/doi/10.5555/1888212 |publisher=Springer-Verlag |conference=Proceedings of the 11th European Conference on Computer Vision: Part VI |date=2010-01-01 |location=Berlin, Heidelberg |isbn=978-3-642-15566-6 |pages=140–153 |series=ECCV'10 |first1=Graham W. |last1=Taylor |first2=Rob |last2=Fergus |first3=Yann |last3=LeCun |first4=Christoph |last4=Bregler |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331211137/https://dl.acm.org/doi/10.5555/1888212 |url-status=live }}&lt;/ref&gt; and Independent Subspace Analysis.&lt;ref&gt;{{cite book |publisher=IEEE Computer Society |date=2011-01-01 |location=Washington, DC, USA |isbn=978-1-4577-0394-2 |pages=3361–3368 |series=CVPR '11 |doi=10.1109/CVPR.2011.5995496 |first1=Q. V. |last1=Le |first2=W. Y. |last2=Zou |first3=S. Y. |last3=Yeung |first4=A. Y. |last4=Ng |title=CVPR 2011 |chapter=Learning hierarchical invariant spatio-temporal features for action recognition with independent subspace analysis |citeseerx=10.1.1.294.5948 |s2cid=6006618}}&lt;/ref&gt; Their application can be seen in [[Text-to-Video model]]s.&lt;ref&gt;{{Cite web |title=Leading India.ai |url=https://www.leadingindia.ai/downloads/projects/VP/vp_16.pdf |access-date=2022-10-13 |archive-date=2022-10-14 |archive-url=https://web.archive.org/web/20221014091907/https://www.leadingindia.ai/downloads/projects/VP/vp_16.pdf |url-status=live }}&lt;/ref&gt;
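<br /> <br /> As a concrete illustration of the space-and-time convolution approach described above, the following minimal sketch applies a single three-dimensional convolution to a short clip. PyTorch is used purely as an example here (the approaches above are not tied to any particular library), and the clip length, frame size, and channel counts are arbitrary assumptions.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

# A batch containing one RGB clip: (batch, channels, time, height, width).
clip = torch.randn(1, 3, 16, 64, 64)   # 16 frames of 64x64 pixels (assumed sizes)

# The 3-D kernel spans 3 frames and a 3x3 spatial neighbourhood, so the same
# filter is convolved over both time and space.
spatiotemporal = nn.Conv3d(in_channels=3, out_channels=8, kernel_size=(3, 3, 3))

features = spatiotemporal(clip)
print(features.shape)                  # torch.Size([1, 8, 14, 62, 62])
&lt;/syntaxhighlight&gt;<br /> In practice, such spatio-temporal feature maps are often followed by recurrent units such as LSTMs, as noted above, to capture longer-range inter-frame dependencies.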
<br /> <br /> === Natural language processing ===<br /> CNNs have also been explored for [[natural language processing]]. CNN models are effective for various NLP problems and have achieved excellent results in [[semantic parsing]],&lt;ref&gt;{{cite arXiv |title=A Deep Architecture for Semantic Parsing |eprint=1404.7296 |date=2014-04-29 |first1=Edward |last1=Grefenstette |first2=Phil |last2=Blunsom |first3=Nando |last3=de Freitas |first4=Karl Moritz |last4=Hermann |class=cs.CL}}&lt;/ref&gt; search query retrieval,&lt;ref&gt;{{cite journal |title=Learning Semantic Representations Using Convolutional Neural Networks for Web Search – Microsoft Research |url=https://www.microsoft.com/en-us/research/publication/learning-semantic-representations-using-convolutional-neural-networks-for-web-search/?from=http%3A%2F%2Fresearch.microsoft.com%2Fapps%2Fpubs%2Fdefault.aspx%3Fid%3D214617 |journal=Microsoft Research |access-date=2015-12-17 |date=April 2014 |last1=Mesnil |first1=Gregoire |last2=Deng |first2=Li |last3=Gao |first3=Jianfeng |last4=He |first4=Xiaodong |last5=Shen |first5=Yelong |archive-date=2017-09-15 |archive-url=https://web.archive.org/web/20170915160617/https://www.microsoft.com/en-us/research/publication/learning-semantic-representations-using-convolutional-neural-networks-for-web-search/?from=http%3A%2F%2Fresearch.microsoft.com%2Fapps%2Fpubs%2Fdefault.aspx%3Fid%3D214617 |url-status=live }}&lt;/ref&gt; sentence modeling,&lt;ref&gt;{{cite arXiv |title=A Convolutional Neural Network for Modelling Sentences |eprint=1404.2188 |date=2014-04-08 |first1=Nal |last1=Kalchbrenner |first2=Edward |last2=Grefenstette |first3=Phil |last3=Blunsom |class=cs.CL}}&lt;/ref&gt; classification,&lt;ref&gt;{{cite arXiv |title=Convolutional Neural Networks for Sentence Classification |eprint=1408.5882 |date=2014-08-25 |first=Yoon |last=Kim |class=cs.CL}}&lt;/ref&gt; prediction&lt;ref&gt;Collobert, Ronan, and Jason Weston. &quot;[https://thetalkingmachines.com/sites/default/files/2018-12/unified_nlp.pdf A unified architecture for natural language processing: Deep neural networks with multitask learning] {{Webarchive|url=https://web.archive.org/web/20190904161653/https://thetalkingmachines.com/sites/default/files/2018-12/unified_nlp.pdf |date=2019-09-04 }}.&quot;Proceedings of the 25th international conference on Machine learning. ACM, 2008.&lt;/ref&gt; and other traditional NLP tasks.&lt;ref&gt;{{cite arXiv |title=Natural Language Processing (almost) from Scratch |eprint=1103.0398 |date=2011-03-02 |first1=Ronan |last1=Collobert |first2=Jason |last2=Weston |first3=Leon |last3=Bottou |first4=Michael |last4=Karlen |first5=Koray |last5=Kavukcuoglu |first6=Pavel |last6=Kuksa |class=cs.LG}}&lt;/ref&gt;<br /> Compared to traditional language processing methods such as [[recurrent neural networks]], CNNs can represent different contextual realities of language without relying on a sequential-order assumption, while RNNs are better suited when classical time series modeling is required.&lt;ref&gt;{{cite arXiv |title=Comparative study of CNN and RNN for natural language processing |eprint=1702.01923 |date=2017-03-02 |first1=W |last1=Yin |first2=K |last2=Kann |first3=M |last3=Yu |first4=H |last4=Schütze |class=cs.LG}}&lt;/ref&gt;<br /> &lt;ref&gt;{{cite arXiv |title=An empirical evaluation of generic convolutional and recurrent networks for sequence modeling |eprint=1803.01271 |first1=S. |last1=Bai |first2=J.S. |last2=Kolter |first3=V. 
|last3=Koltun |year=2018 |class=cs.LG}}&lt;/ref&gt;<br /> &lt;ref&gt;{{cite journal |title=Detecting dynamics of action in text with a recurrent neural network |journal=Neural Computing and Applications |year=2021 |volume=33 |last1=Gruber |first1=N. |issue=12 |pages=15709–15718 |doi=10.1007/S00521-021-06190-5 |s2cid=236307579 |url=https://www.semanticscholar.org/paper/Detecting-dynamics-of-action-in-text-with-a-neural-Gruber/cd6c9da2e8c52b043faf05ccc2511a07c54ead0c |access-date=2021-10-10 |archive-date=2021-10-10 |archive-url=https://web.archive.org/web/20211010125453/https://www.semanticscholar.org/paper/Detecting-dynamics-of-action-in-text-with-a-neural-Gruber/cd6c9da2e8c52b043faf05ccc2511a07c54ead0c |url-status=live }}&lt;/ref&gt; &lt;ref&gt;{{cite journal |title=Approximation Theory of Convolutional Architectures for Time Series Modelling |journal=International Conference on Machine Learning |year=2021 |last1=Haotian |first1=J. |last2=Zhong |first2=Li |last3=Qianxiao |first3=Li |arxiv=2107.09355}}&lt;/ref&gt;<br /> <br /> === Anomaly Detection ===<br /> A CNN with 1-D convolutions was used on time series in the frequency domain (spectral residual) by an unsupervised model to detect anomalies in the time domain.&lt;ref&gt;{{cite conference |title=Time-Series Anomaly Detection Service at Microsoft {{!}} Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery &amp; Data Mining|language=EN|arxiv=1906.03821|last1=Ren|first1=Hansheng|last2=Xu|first2=Bixiong|last3=Wang|first3=Yujing|last4=Yi|first4=Chao|last5=Huang|first5=Congrui|last6=Kou|first6=Xiaoyu|last7=Xing|first7=Tony|last8=Yang|first8=Mao|last9=Tong|first9=Jie|last10=Zhang|first10=Qi|year=2019|doi=10.1145/3292500.3330680|s2cid=182952311}}&lt;/ref&gt;<br /> <br /> === Drug discovery ===<br /> CNNs have been used in [[drug discovery]]. Predicting the interaction between molecules and biological [[protein]]s can identify potential treatments. In 2015, Atomwise introduced AtomNet, the first deep learning neural network for [[structure-based drug design]].&lt;ref&gt;{{cite arXiv |title=AtomNet: A Deep Convolutional Neural Network for Bioactivity Prediction in Structure-based Drug Discovery |eprint=1510.02855 |date=2015-10-09 |first1=Izhar |last1=Wallach |first2=Michael |last2=Dzamba |first3=Abraham |last3=Heifets |class=cs.LG}}&lt;/ref&gt; The system trains directly on 3-dimensional representations of chemical interactions. Similar to how image recognition networks learn to compose smaller, spatially proximate features into larger, complex structures,&lt;ref&gt;{{cite arXiv |title=Understanding Neural Networks Through Deep Visualization |eprint=1506.06579 |date=2015-06-22 |first1=Jason |last1=Yosinski |first2=Jeff |last2=Clune |first3=Anh |last3=Nguyen |first4=Thomas |last4=Fuchs |first5=Hod |last5=Lipson |class=cs.CV}}&lt;/ref&gt; AtomNet discovers chemical features, such as [[aromaticity]], [[orbital hybridisation|sp&lt;sup&gt;3&lt;/sup&gt; carbons]], and [[hydrogen bond]]ing. 
Subsequently, AtomNet was used to predict novel candidate [[biomolecule]]s for multiple disease targets, most notably treatments for the [[Ebola virus]]&lt;ref&gt;{{cite news |title=Toronto startup has a faster way to discover effective medicines |url=https://www.theglobeandmail.com/report-on-business/small-business/starting-out/toronto-startup-has-a-faster-way-to-discover-effective-medicines/article25660419/ |website=The Globe and Mail |access-date=2015-11-09 |archive-date=2015-10-20 |archive-url=https://web.archive.org/web/20151020040115/http://www.theglobeandmail.com/report-on-business/small-business/starting-out/toronto-startup-has-a-faster-way-to-discover-effective-medicines/article25660419/ |url-status=live }}&lt;/ref&gt; and [[multiple sclerosis]].&lt;ref&gt;{{cite web |title=Startup Harnesses Supercomputers to Seek Cures |url=https://www.kqed.org/futureofyou/3461/startup-harnesses-supercomputers-to-seek-cures |website=KQED Future of You |access-date=2015-11-09 |language=en-us |date=2015-05-27 |archive-date=2018-12-06 |archive-url=https://web.archive.org/web/20181206234956/https://www.kqed.org/futureofyou/3461/startup-harnesses-supercomputers-to-seek-cures |url-status=live }}&lt;/ref&gt;<br /> <br /> === Checkers game ===<br /> CNNs have been used in the game of [[draughts|checkers]]. From 1999 to 2001, [[David B. Fogel|Fogel]] and Chellapilla published papers showing how a convolutional neural network could learn to play checkers using co-evolution. The learning process did not use prior human professional games, but rather focused on a minimal set of information contained in the checkerboard: the location and type of pieces, and the difference in number of pieces between the two sides. Ultimately, the program ([[Blondie24]]) was tested on 165 games against players and ranked in the highest 0.4%.&lt;ref&gt;{{cite journal |pmid=18252639 |doi=10.1109/72.809083 |volume=10 |issue=6 |title=Evolving neural networks to play checkers without relying on expert knowledge |journal=IEEE Trans Neural Netw |pages=1382–91 |last1=Chellapilla |first1=K |last2=Fogel |first2=DB |year=1999}}&lt;/ref&gt;&lt;ref&gt;{{cite journal |doi=10.1109/4235.942536 |title=Evolving an expert checkers playing program without using human expertise |journal=IEEE Transactions on Evolutionary Computation |volume=5 |issue=4 |pages=422–428 |year=2001 |last1=Chellapilla |first1=K. |last2=Fogel |first2=D.B.}}&lt;/ref&gt; It also earned a win against the program [[Chinook (draughts player)|Chinook]] at its &quot;expert&quot; level of play.&lt;ref&gt;{{cite book |last=Fogel |first=David |date=2001 |title=Blondie24: Playing at the Edge of AI |location=San Francisco, CA |publisher=Morgan Kaufmann |isbn=978-1558607835 |author-link=David B. Fogel}}&lt;/ref&gt;<br /> <br /> === Go ===<br /> CNNs have been used in [[computer Go]]. 
In December 2014, Clark and [[Amos Storkey|Storkey]] published a paper showing that a CNN trained by supervised learning from a database of human professional games could outperform [[GNU Go]] and win some games against [[Monte Carlo tree search]] Fuego 1.1 in a fraction of the time it took Fuego to play.&lt;ref&gt;{{cite arXiv |eprint=1412.3409 |last1=Clark |first1=Christopher |title=Teaching Deep Convolutional Neural Networks to Play Go |last2=Storkey |first2=Amos |class=cs.AI |year=2014}}&lt;/ref&gt; Later it was announced that a large 12-layer convolutional neural network had correctly predicted the professional move in 55% of positions, equalling the accuracy of a [[Go ranks and ratings|6 dan]] human player. When the trained convolutional network was used directly to play games of Go, without any search, it beat the traditional search program [[GNU Go]] in 97% of games, and matched the performance of the [[Monte Carlo tree search]] program Fuego simulating ten thousand playouts (about a million positions) per move.&lt;ref&gt;{{cite arXiv |eprint=1412.6564 |last1=Maddison |first1=Chris J. |title=Move Evaluation in Go Using Deep Convolutional Neural Networks |last2=Huang |first2=Aja |last3=Sutskever |first3=Ilya |last4=Silver |first4=David |class=cs.LG |year=2014}}&lt;/ref&gt;<br /> <br /> A couple of CNNs for choosing moves to try (&quot;policy network&quot;) and evaluating positions (&quot;value network&quot;) driving MCTS were used by [[AlphaGo]], the first to beat the best human player at the time.&lt;ref&gt;{{cite web |url=https://www.deepmind.com/alpha-go.html |title=AlphaGo – Google DeepMind |access-date=30 January 2016 |archive-url=https://web.archive.org/web/20160130230207/http://www.deepmind.com/alpha-go.html |archive-date=30 January 2016 |url-status=dead}}&lt;/ref&gt;<br /> <br /> === Time series forecasting ===<br /> Recurrent neural networks are generally considered the best neural network architectures for time series forecasting (and sequence modeling in general), but recent studies show that convolutional networks can perform comparably or even better.&lt;ref&gt;{{cite arXiv |last1=Bai |first1=Shaojie |last2=Kolter |first2=J. Zico |last3=Koltun |first3=Vladlen |title=An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling |date=2018-04-19 |eprint=1803.01271 |class=cs.LG}}&lt;/ref&gt;&lt;ref name=&quot;Tsantekidis 7–12&quot;/&gt; Dilated convolutions&lt;ref&gt;{{cite arXiv |last1=Yu |first1=Fisher |last2=Koltun |first2=Vladlen |title=Multi-Scale Context Aggregation by Dilated Convolutions |date=2016-04-30 |eprint=1511.07122 |class=cs.CV}}&lt;/ref&gt; might enable one-dimensional convolutional neural networks to effectively learn time series dependences.&lt;ref&gt;{{cite arXiv |last1=Borovykh |first1=Anastasia |last2=Bohte |first2=Sander |last3=Oosterlee |first3=Cornelis W. 
|title=Conditional Time Series Forecasting with Convolutional Neural Networks |date=2018-09-17 |eprint=1703.04691 |class=stat.ML}}&lt;/ref&gt; Convolutions can be implemented more efficiently than RNN-based solutions, and they do not suffer from vanishing (or exploding) gradients.&lt;ref&gt;{{cite arXiv |last=Mittelman |first=Roni |title=Time-series modeling with undecimated fully convolutional neural networks |date=2015-08-03 |eprint=1508.00317 |class=stat.ML}}&lt;/ref&gt; Convolutional networks can provide improved forecasting performance when there are multiple similar time series to learn from.&lt;ref&gt;{{cite arXiv |last1=Chen |first1=Yitian |last2=Kang |first2=Yanfei |last3=Chen |first3=Yixiong |last4=Wang |first4=Zizhuo |title=Probabilistic Forecasting with Temporal Convolutional Neural Network |date=2019-06-11 |eprint=1906.04397 |class=stat.ML}}&lt;/ref&gt; CNNs can also be applied to further tasks in time series analysis (e.g., time series classification&lt;ref&gt;{{cite journal |last1=Zhao |first1=Bendong |last2=Lu |first2=Huanzhang |last3=Chen |first3=Shangfeng |last4=Liu |first4=Junliang |last5=Wu |first5=Dongya |date=2017-02-01 |title=Convolutional neural networks for time series classification |journal=Journal of Systems Engineering and Electronics |volume=28 |issue=1 |pages=162–169 |doi=10.21629/JSEE.2017.01.18}}&lt;/ref&gt; or quantile forecasting&lt;ref&gt;{{cite arXiv |last=Petneházi |first=Gábor |title=QCNN: Quantile Convolutional Neural Network |date=2019-08-21 |eprint=1908.07978 |class=cs.LG}}&lt;/ref&gt;).<br /> <br /> === Cultural heritage and 3D datasets ===<br /> As archaeological findings such as [[clay tablet]]s with [[cuneiform|cuneiform writing]] are increasingly acquired using [[3D scanner]]s, benchmark datasets are becoming available, including ''HeiCuBeDa'',&lt;ref name=&quot;HeiCuBeDa_Hilprecht&quot;/&gt; which provides almost 2,000 normalized 2D and 3D datasets prepared with the [[GigaMesh Software Framework]].&lt;ref name=&quot;ICDAR19&quot;/&gt; [[Curvature]]-based measures are used in conjunction with geometric neural networks (GNNs), for example for period classification of these clay tablets, which are among the oldest documents of human history.&lt;ref name=&quot;ICFHR20&quot;/&gt;&lt;ref name=&quot;ICFHR20_Presentation&quot;/&gt;<br /> <br /> == Fine-tuning ==<br /> For many applications, little training data is available. Convolutional neural networks usually require a large amount of training data in order to avoid [[overfitting]]. A common technique is to train the network on a larger data set from a related domain. Once the network parameters have converged, an additional training step is performed using the in-domain data to fine-tune the network weights; this is known as [[transfer learning]]. Furthermore, this technique allows convolutional network architectures to be applied successfully to problems with tiny training sets.&lt;ref&gt;Durjoy Sen Maitra; Ujjwal Bhattacharya; S.K. Parui, [https://ieeexplore.ieee.org/document/7333916 &quot;CNN based common approach to handwritten character recognition of multiple scripts&quot;] {{Webarchive|url=https://web.archive.org/web/20231016190918/https://ieeexplore.ieee.org/document/7333916 |date=2023-10-16 }}, in Document Analysis and Recognition (ICDAR), 2015 13th International Conference on, vol., no., pp.1021–1025, 23–26 Aug. 2015&lt;/ref&gt;
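<br /> <br /> A minimal sketch of this fine-tuning procedure is shown below, assuming PyTorch (chosen arbitrarily) and a small stand-in backbone whose parameters are taken to have already converged on a larger, related dataset; the layer sizes and the five-class in-domain task are illustrative assumptions only.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn, optim

# Stand-in for a network whose parameters have already converged on a larger,
# related dataset (the pretraining itself is not shown here).
backbone = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
head = nn.Linear(16, 5)      # new output layer for the small in-domain task (5 classes assumed)

# Freeze the converged convolutional weights; only the new head is fine-tuned.
for param in backbone.parameters():
    param.requires_grad = False

optimizer = optim.SGD(head.parameters(), lr=0.001)

x = torch.randn(2, 3, 64, 64)            # a tiny in-domain batch (shapes assumed)
logits = head(backbone(x))               # forward pass used during the fine-tuning step
&lt;/syntaxhighlight&gt;<br /> Only the parameters of the new output layer receive gradient updates here; in practice some of the later convolutional blocks may be unfrozen as well.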
<br /> <br /> == Human interpretable explanations ==<br /> End-to-end training and prediction are common practice in [[computer vision]]. However, human interpretable explanations are required for [[safety-critical system|critical systems]] such as [[self-driving car]]s.&lt;ref name=&quot;Interpretable ML Symposium 2017&quot;&gt;{{cite web |title=NIPS 2017 |website=Interpretable ML Symposium |date=2017-10-20 |url=http://interpretable.ml/ |access-date=2018-09-12 |archive-date=2019-09-07 |archive-url=https://web.archive.org/web/20190907063237/http://interpretable.ml/ |url-status=dead }}&lt;/ref&gt; With recent advances in [[salience (neuroscience)|visual salience]], [[visual spatial attention|spatial attention]], and [[visual temporal attention|temporal attention]], the most critical spatial regions and temporal instants can be visualized to justify the CNN predictions.&lt;ref name=&quot;Zang Wang Liu Zhang 2018 pp. 97–108&quot;&gt;{{cite book |last1=Zang |first1=Jinliang |last2=Wang |first2=Le |last3=Liu |first3=Ziyi |last4=Zhang |first4=Qilin |last5=Hua |first5=Gang |last6=Zheng |first6=Nanning |title=Artificial Intelligence Applications and Innovations |series=IFIP Advances in Information and Communication Technology |volume=519 |chapter=Attention-Based Temporal Weighted Convolutional Neural Network for Action Recognition |publisher=Springer International Publishing |location=Cham |year=2018 |isbn=978-3-319-92006-1 |issn=1868-4238 |doi=10.1007/978-3-319-92007-8_9 |pages=97–108 |arxiv=1803.07179 |s2cid=4058889}}&lt;/ref&gt;&lt;ref name=&quot;Wang Zang Zhang Niu p=1979&quot;&gt;{{cite journal |last1=Wang |first1=Le |last2=Zang |first2=Jinliang |last3=Zhang |first3=Qilin |last4=Niu |first4=Zhenxing |last5=Hua |first5=Gang |last6=Zheng |first6=Nanning |title=Action Recognition by an Attention-Aware Temporal Weighted Convolutional Neural Network |journal=Sensors |volume=18 |issue=7 |date=2018-06-21 |issn=1424-8220 |doi=10.3390/s18071979 |pmid=29933555 |pmc=6069475 |page=1979 |bibcode=2018Senso..18.1979W |url=https://qilin-zhang.github.io/_pages/pdfs/sensors-18-01979-Action_Recognition_by_an_Attention-Aware_Temporal_Weighted_Convolutional_Neural_Network.pdf |doi-access=free |access-date=2018-09-14 |archive-date=2018-09-13 |archive-url=https://web.archive.org/web/20180913040055/https://qilin-zhang.github.io/_pages/pdfs/sensors-18-01979-Action_Recognition_by_an_Attention-Aware_Temporal_Weighted_Convolutional_Neural_Network.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> == Related architectures ==<br /> <br /> === Deep Q-networks ===<br /> A deep Q-network (DQN) is a type of deep learning model that combines a deep neural network with [[Q-learning]], a form of [[reinforcement learning]]. Unlike earlier reinforcement learning agents, DQNs that utilize CNNs can learn directly from high-dimensional sensory inputs via reinforcement learning.&lt;ref name=&quot;Ong Chavez Hong 2015&quot;&gt;{{cite arXiv |last1=Ong |first1=Hao Yi |last2=Chavez |first2=Kevin |last3=Hong |first3=Augustus |title=Distributed Deep Q-Learning |date=2015-08-18 |class=cs.LG |eprint=1508.04186v2}}&lt;/ref&gt;<br /> <br /> Preliminary results were presented in 2014, with an accompanying paper in February 2015.&lt;ref name=&quot;DQN&quot;&gt;{{cite journal |last1=Mnih |first1=Volodymyr |display-authors=etal |date=2015 |title=Human-level control through deep reinforcement learning |journal=Nature |volume=518 |issue=7540 |pages=529–533 |doi=10.1038/nature14236 |pmid=25719670 |bibcode=2015Natur.518..529M |s2cid=205242740}}&lt;/ref&gt; The research described an application to [[Atari 2600]] gaming. 
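<br /> <br /> The idea can be sketched as a CNN that maps a stack of raw screen frames to one Q-value per action. The sketch below is only an illustration (PyTorch chosen arbitrarily), not the published architecture, and the frame size, frame count, action count, and layer sizes are assumptions.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

num_actions = 4           # assumed size of the game's action set

# A CNN that reads a stack of 4 grayscale frames and outputs one Q-value per action.
q_network = nn.Sequential(
    nn.Conv2d(4, 16, kernel_size=8, stride=4),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=4, stride=2),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 9 * 9, 256),
    nn.ReLU(),
    nn.Linear(256, num_actions),
)

frames = torch.randn(1, 4, 84, 84)          # one stack of four 84x84 frames (assumed size)
q_values = q_network(frames)                # shape (1, num_actions)
action = int(q_values.argmax(dim=1))        # greedy action selection
&lt;/syntaxhighlight&gt;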
Other deep reinforcement learning models preceded it.&lt;ref&gt;{{cite journal |last1=Sun |first1=R. |last2=Sessions |first2=C. |date=June 2000 |title=Self-segmentation of sequences: automatic formation of hierarchies of sequential behaviors |journal=IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics |volume=30 |issue=3 |pages=403–418 |doi=10.1109/3477.846230 |pmid=18252373 |issn=1083-4419 |citeseerx=10.1.1.11.226}}&lt;/ref&gt;<br /> <br /> === Deep belief networks ===<br /> {{Main|Deep belief network}}<br /> [[Convolutional deep belief network|Convolutional deep belief networks]] (CDBN) have structure very similar to convolutional neural networks and are trained similarly to deep belief networks. Therefore, they exploit the 2D structure of images, like CNNs do, and make use of pre-training like [[deep belief network]]s. They provide a generic structure that can be used in many image and signal processing tasks. Benchmark results on standard image datasets like CIFAR&lt;ref name=&quot;CDBN-CIFAR&quot;&gt;{{Cite web|url=http://www.cs.toronto.edu/~kriz/conv-cifar10-aug2010.pdf|title=Convolutional Deep Belief Networks on CIFAR-10|access-date=2017-08-18|archive-date=2017-08-30|archive-url=https://web.archive.org/web/20170830060223/http://www.cs.toronto.edu/~kriz/conv-cifar10-aug2010.pdf|url-status=live}}&lt;/ref&gt; have been obtained using CDBNs.&lt;ref name=&quot;CDBN&quot;&gt;{{cite book |last1=Lee |first1=Honglak |last2=Grosse |first2=Roger |last3=Ranganath |first3=Rajesh |last4=Ng |first4=Andrew Y. |title=Proceedings of the 26th Annual International Conference on Machine Learning |chapter=Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations |date=1 January 2009 |publisher=ACM |pages=609–616 |doi=10.1145/1553374.1553453 |isbn=9781605585161 |citeseerx=10.1.1.149.6800 |s2cid=12008458}}&lt;/ref&gt;<br /> <br /> == Notable libraries ==<br /> *[[Caffe (software)|Caffe]]: A library for convolutional neural networks. Created by the Berkeley Vision and Learning Center (BVLC). It supports both CPU and GPU. Developed in [[C++]], and has [[Python (programming language)|Python]] and [[MATLAB]] wrappers.<br /> *[[Deeplearning4j]]: Deep learning in [[Java (programming language)|Java]] and [[Scala (programming language)|Scala]] on multi-GPU-enabled [[Apache Spark|Spark]]. A general-purpose deep learning library for the JVM production stack running on a C++ scientific computing engine. Allows the creation of custom layers. Integrates with Hadoop and Kafka.<br /> *[[Dlib]]: A toolkit for making real world machine learning and data analysis applications in C++.<br /> *[[Microsoft Cognitive Toolkit]]: A deep learning toolkit written by Microsoft with several unique features enhancing scalability over multiple nodes. 
It supports full-fledged interfaces for training in C++ and Python and with additional support for model inference in [[C Sharp (programming language)|C#]] and Java.<br /> *[[TensorFlow]]: [[Apache License#Version 2.0|Apache 2.0]]-licensed Theano-like library with support for CPU, GPU, Google's proprietary [[tensor processing unit]] (TPU),&lt;ref&gt;{{cite news |url=https://www.wired.com/2016/05/google-tpu-custom-chips/ |title=Google Built Its Very Own Chips to Power Its AI Bots |author=Cade Metz |date=May 18, 2016 |newspaper=Wired |access-date=March 6, 2017 |archive-date=January 13, 2018 |archive-url=https://web.archive.org/web/20180113150305/https://www.wired.com/2016/05/google-tpu-custom-chips/ |url-status=live }}&lt;/ref&gt; and mobile devices.<br /> *[[Theano (software)|Theano]]: The reference deep-learning library for Python with an API largely compatible with the popular [[NumPy]] library. Allows user to write symbolic mathematical expressions, then automatically generates their derivatives, saving the user from having to code gradients or backpropagation. These symbolic expressions are automatically compiled to [[CUDA]] code for a fast, [[Compute kernel|on-the-GPU]] implementation.<br /> *[[Torch (machine learning)|Torch]]: A [[scientific computing]] framework with wide support for machine learning algorithms, written in [[C (programming language)|C]] and [[Lua (programming language)|Lua]].<br /> <br /> == See also ==<br /> * [[Attention (machine learning)]]<br /> * [[Convolution]]<br /> * [[Deep learning]]<br /> * [[Natural-language processing]]<br /> * [[Neocognitron]]<br /> * [[Scale-invariant feature transform]]<br /> * [[Time delay neural network]]<br /> * [[Vision processing unit]]<br /> <br /> == Notes ==<br /> {{Reflist|group=nb}}<br /> <br /> == References ==<br /> {{reflist|30em|refs=<br /> &lt;ref name=&quot;ICDAR19&quot;&gt;<br /> {{citation |surname1=Hubert Mara and Bartosz Bogacz |periodical=Proceedings of the 15th International Conference on Document Analysis and Recognition (ICDAR) |title=Breaking the Code on Broken Tablets: The Learning Challenge for Annotated Cuneiform Script in Normalized 2D and 3D Datasets |location=Sydney, Australien |date=2019 |pages=148–153 |language=de |doi=10.1109/ICDAR.2019.00032 |isbn=978-1-7281-3014-9 |s2cid=211026941}}<br /> &lt;/ref&gt;<br /> &lt;ref name=&quot;HeiCuBeDa_Hilprecht&quot;&gt;<br /> {{citation |surname1=[[Hubert Mara]] |title=HeiCuBeDa Hilprecht – Heidelberg Cuneiform Benchmark Dataset for the Hilprecht Collection |publisher=heiDATA – institutional repository for research data of Heidelberg University |date=2019-06-07 |language=de |doi=10.11588/data/IE8CCN}}<br /> &lt;/ref&gt;&lt;ref name=&quot;ICFHR20&quot;&gt;<br /> {{citation<br /> |last1=Bogacz|first1=Bartosz<br /> |last2=Mara|first2=Hubert<br /> |periodical=Proceedings of the 17th International Conference on Frontiers of Handwriting Recognition (ICFHR)<br /> |title=Period Classification of 3D Cuneiform Tablets with Geometric Neural Networks<br /> |location=Dortmund, Germany<br /> |date=2020<br /> }}&lt;/ref&gt;<br /> &lt;ref name=&quot;ICFHR20_Presentation&quot;&gt;{{YouTube<br /> |id=-iFntE51HRw<br /> |title=Presentation of the ICFHR paper on Period Classification of 3D Cuneiform Tablets with Geometric Neural Networks<br /> }}&lt;/ref&gt;<br /> }}<br /> <br /> == External links ==<br /> * [https://cs231n.github.io/ CS231n: Convolutional Neural Networks for Visual Recognition] — [[Andrej Karpathy]]'s [[Stanford University|Stanford]] computer science course on CNNs in 
computer vision<br /> * [https://ujjwalkarn.me/2016/08/11/intuitive-explanation-convnets/ An Intuitive Explanation of Convolutional Neural Networks] — A beginner level introduction to what Convolutional Neural Networks are and how they work<br /> * [https://www.completegate.com/2017022864/blog/deep-machine-learning-images-lenet-alexnet-cnn/all-pages Convolutional Neural Networks for Image Classification] {{Webarchive|url=https://web.archive.org/web/20180121184329/https://www.completegate.com/2017022864/blog/deep-machine-learning-images-lenet-alexnet-cnn/all-pages |date=2018-01-21 }} — Literature Survey<br /> {{Differentiable computing}}<br /> {{Authority control}}<br /> <br /> [[Category:Neural network architectures]]<br /> [[Category:Computer vision]]<br /> [[Category:Computational neuroscience]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Convolutional_neural_network&diff=1181040174 Convolutional neural network 2023-10-20T13:40:09Z <p>205.189.94.9: </p> <hr /> <div>{{short description|Artificial neural network}}<br /> {{Other uses|CNN (disambiguation)}}<br /> {{More citations needed|date=June 2019}}<br /> {{Machine learning|Artificial neural network}}<br /> '''Convolutional neural network''' ('''CNN''') is a [[regularization (mathematics)|regularized]] type of [[feed-forward neural network]] that learns [[feature engineering]] by itself via [[filter (signal processing)|filters]] (or kernel) optimization. Vanishing gradients and exploding gradients, seen during [[backpropagation]] in earlier neural networks, are prevented by using regularized weights over fewer connections.&lt;ref name=&quot;auto3&quot;&gt;{{cite book |last1=Venkatesan |first1=Ragav |url=https://books.google.com/books?id=bAM7DwAAQBAJ&amp;q=vanishing+gradient |title=Convolutional Neural Networks in Visual Computing: A Concise Guide |last2=Li |first2=Baoxin |date=2017-10-23 |publisher=CRC Press |isbn=978-1-351-65032-8 |language=en |access-date=2020-12-13 |archive-date=2023-10-16 |archive-url=https://web.archive.org/web/20231016190415/https://books.google.com/books?id=bAM7DwAAQBAJ&amp;q=vanishing+gradient#v=snippet&amp;q=vanishing%20gradient&amp;f=false |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;auto2&quot;&gt;{{cite book |last1=Balas |first1=Valentina E. |url=https://books.google.com/books?id=XRS_DwAAQBAJ&amp;q=exploding+gradient |title=Recent Trends and Advances in Artificial Intelligence and Internet of Things |last2=Kumar |first2=Raghvendra |last3=Srivastava |first3=Rajshree |date=2019-11-19 |publisher=Springer Nature |isbn=978-3-030-32644-9 |language=en |access-date=2020-12-13 |archive-date=2023-10-16 |archive-url=https://web.archive.org/web/20231016190414/https://books.google.com/books?id=XRS_DwAAQBAJ&amp;q=exploding+gradient#v=snippet&amp;q=exploding%20gradient&amp;f=false |url-status=live }}&lt;/ref&gt; For example, for ''each'' neuron in the fully-connected layer 10,000 weights would be required for processing an image sized 100 × 100 pixels. 
However, applying cascaded ''convolution'' (or cross-correlation) kernels,&lt;ref&gt;{{Cite journal|last1=Zhang|first1=Yingjie|last2=Soon|first2=Hong Geok|last3=Ye|first3=Dongsen|last4=Fuh|first4=Jerry Ying Hsi|last5=Zhu|first5=Kunpeng|date=September 2020|title=Powder-Bed Fusion Process Monitoring by Machine Vision With Hybrid Convolutional Neural Networks|url=https://ieeexplore.ieee.org/document/8913613|journal=IEEE Transactions on Industrial Informatics|volume=16|issue=9|pages=5769–5779|doi=10.1109/TII.2019.2956078|s2cid=213010088|issn=1941-0050|access-date=2023-08-12|archive-date=2023-07-31|archive-url=https://web.archive.org/web/20230731120013/https://ieeexplore.ieee.org/document/8913613/|url-status=live}}&lt;/ref&gt;&lt;ref&gt;{{Cite journal|last1=Chervyakov|first1=N.I.|last2=Lyakhov|first2=P.A.|last3=Deryabin|first3=M.A.|last4=Nagornov|first4=N.N.|last5=Valueva|first5=M.V.|last6=Valuev|first6=G.V.|date=September 2020|title=Residue Number System-Based Solution for Reducing the Hardware Cost of a Convolutional Neural Network|url=https://linkinghub.elsevier.com/retrieve/pii/S092523122030583X|journal=Neurocomputing|language=en|volume=407|pages=439–453|doi=10.1016/j.neucom.2020.04.018|s2cid=219470398|quote=Convolutional neural networks represent deep learning architectures that are currently used in a wide range of applications, including computer vision, speech recognition, time series analysis in finance, and many others.|access-date=2023-08-12|archive-date=2023-06-29|archive-url=https://web.archive.org/web/20230629155646/https://linkinghub.elsevier.com/retrieve/pii/S092523122030583X|url-status=live}}&lt;/ref&gt; only 25 neurons are required to process 5x5-sized tiles.&lt;ref name=&quot;auto1&quot;&gt;{{cite book |title=Guide to convolutional neural networks : a practical application to traffic-sign detection and classification |last=Habibi |first=Aghdam, Hamed |others=Heravi, Elnaz Jahani |isbn=9783319575490 |location=Cham, Switzerland |oclc=987790957 |date=2017-05-30}}&lt;/ref&gt;&lt;ref&gt;{{Cite journal|last=Atlas, Homma, and Marks|title=An Artificial Neural Network for Spatio-Temporal Bipolar Patterns: Application to Phoneme Classification|url=https://papers.nips.cc/paper/1987/file/98f13708210194c475687be6106a3b84-Paper.pdf |archive-url=https://web.archive.org/web/20210414091306/https://papers.nips.cc/paper/1987/file/98f13708210194c475687be6106a3b84-Paper.pdf |archive-date=2021-04-14 |url-status=live|journal=Neural Information Processing Systems (NIPS 1987)|volume=1}}&lt;/ref&gt; Higher-layer features are extracted from wider context windows, compared to lower-layer features. This is seen as a brute force approach to solving large-task static database development issues resolved and unresolved in ]]twentieth-first century]] computing.<br /> <br /> They have applications in: <br /> * [[computer vision|image and video recognition]],&lt;ref name=&quot;Valueva Nagornov Lyakhov Valuev 2020 pp. 232–243&quot;&gt;{{cite journal |last1=Valueva |first1=M.V. |last2=Nagornov |first2=N.N. |last3=Lyakhov |first3=P.A. |last4=Valuev |first4=G.V. |last5=Chervyakov |first5=N.I. 
|title=Application of the residue number system to reduce hardware costs of the convolutional neural network implementation |journal=Mathematics and Computers in Simulation |publisher=Elsevier BV |volume=177 |year=2020 |issn=0378-4754 |doi=10.1016/j.matcom.2020.04.031 |pages=232–243 |s2cid=218955622 |quote=Convolutional neural networks are a promising tool for solving the problem of pattern recognition.}}&lt;/ref&gt;<br /> * [[recommender system]]s,&lt;ref&gt;{{cite book |url=https://proceedings.neurips.cc/paper/2013/file/b3ba8f1bee1238a2f37603d90b58898d-Paper.pdf |title=Deep content-based music recommendation |last1=van den Oord |first1=Aaron |last2=Dieleman |first2=Sander |last3=Schrauwen |first3=Benjamin |date=2013-01-01 |publisher=Curran Associates, Inc. |editor-last=Burges |editor-first=C. J. C. |pages=2643–2651 |editor-last2=Bottou |editor-first2=L. |editor-last3=Welling |editor-first3=M. |editor-last4=Ghahramani |editor-first4=Z. |editor-last5=Weinberger |editor-first5=K. Q. |access-date=2022-03-31 |archive-date=2022-03-07 |archive-url=https://web.archive.org/web/20220307172303/https://proceedings.neurips.cc/paper/2013/file/b3ba8f1bee1238a2f37603d90b58898d-Paper.pdf |url-status=live }}&lt;/ref&gt; <br /> * [[image classification]],<br /> * [[image segmentation]], <br /> * [[medical image computing|medical image analysis]], <br /> * [[natural language processing]],&lt;ref&gt;{{cite book |last1=Collobert |first1=Ronan |last2=Weston |first2=Jason |title=Proceedings of the 25th international conference on Machine learning - ICML '08 |chapter=A unified architecture for natural language processing |date=2008-01-01 |location=New York, NY, USA |publisher=ACM |pages=160–167 |doi=10.1145/1390156.1390177 |isbn=978-1-60558-205-4 |s2cid=2617020}}&lt;/ref&gt;<br /> * [[brain–computer interface]]s,&lt;ref&gt;{{cite book |last1=Avilov |first1=Oleksii |last2=Rimbert |first2=Sebastien |last3=Popov |first3=Anton |last4=Bougrain |first4=Laurent |title=2020 42nd Annual International Conference of the IEEE Engineering in Medicine &amp; Biology Society (EMBC) |chapter=Deep Learning Techniques to Improve Intraoperative Awareness Detection from Electroencephalographic Signals |date=July 2020 |chapter-url=https://ieeexplore.ieee.org/document/9176228 |volume=2020 |location=Montreal, QC, Canada |publisher=IEEE |pages=142–145 |doi=10.1109/EMBC44109.2020.9176228 |pmid=33017950 |isbn=978-1-7281-1990-8 |s2cid=221386616 |url=https://hal.inria.fr/hal-02920320/file/Avilov_EMBC2020.pdf |access-date=2023-07-21 |archive-date=2022-05-19 |archive-url=https://web.archive.org/web/20220519135428/https://hal.inria.fr/hal-02920320/file/Avilov_EMBC2020.pdf |url-status=live }}&lt;/ref&gt; and <br /> * financial [[time series]].&lt;ref name=&quot;Tsantekidis 7–12&quot;&gt;{{cite book |last1=Tsantekidis |first1=Avraam |last2=Passalis |first2=Nikolaos |last3=Tefas |first3=Anastasios |last4=Kanniainen |first4=Juho |last5=Gabbouj |first5=Moncef |last6=Iosifidis |first6=Alexandros |title=2017 IEEE 19th Conference on Business Informatics (CBI) |chapter=Forecasting Stock Prices from the Limit Order Book Using Convolutional Neural Networks |date=July 2017 |location=Thessaloniki, Greece |publisher=IEEE |pages=7–12 |doi=10.1109/CBI.2017.23 |isbn=978-1-5386-3035-8 |s2cid=4950757}}&lt;/ref&gt;<br /> <br /> CNNs are also known as '''Shift Invariant''' or '''Space Invariant Artificial Neural Networks''' ('''SIANN'''), based on the shared-weight architecture of the [[convolution]] kernels or 
filters that slide along input features and provide translation-[[equivariant map|equivariant]] responses known as feature maps.&lt;ref name=&quot;:0&quot;&gt;{{cite journal |last=Zhang |first=Wei |date=1988 |title=Shift-invariant pattern recognition neural network and its optical architecture |url=https://drive.google.com/file/d/1nN_5odSG_QVae54EsQN_qSz-0ZsX6wA0/view?usp=sharing |journal=Proceedings of Annual Conference of the Japan Society of Applied Physics |access-date=2020-06-22 |archive-date=2020-06-23 |archive-url=https://web.archive.org/web/20200623051222/https://drive.google.com/file/d/1nN_5odSG_QVae54EsQN_qSz-0ZsX6wA0/view?usp=sharing |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;:1&quot;&gt;{{cite journal |last=Zhang |first=Wei |date=1990 |title=Parallel distributed processing model with local space-invariant interconnections and its optical architecture |url=https://drive.google.com/file/d/0B65v6Wo67Tk5ODRzZmhSR29VeDg/view?usp=sharing |journal=Applied Optics |volume=29 |issue=32 |pages=4790–7 |doi=10.1364/AO.29.004790 |pmid=20577468 |bibcode=1990ApOpt..29.4790Z |access-date=2016-09-22 |archive-date=2017-02-06 |archive-url=https://web.archive.org/web/20170206111407/https://drive.google.com/file/d/0B65v6Wo67Tk5ODRzZmhSR29VeDg/view?usp=sharing |url-status=live }}&lt;/ref&gt; Counter-intuitively, most convolutional neural networks are not [[translation invariant|invariant to translation]], due to the downsampling operation they apply to the input.&lt;ref name=&quot;:6&quot;&gt;{{cite book |last1=Mouton |first1=Coenraad |last2=Myburgh |first2=Johannes C. |last3=Davel |first3=Marelie H. |title=Artificial Intelligence Research |chapter=Stride and Translation Invariance in CNNS |date=2020 |editor-last=Gerber |editor-first=Aurona |chapter-url=https://link.springer.com/chapter/10.1007%2F978-3-030-66151-9_17 |series=Communications in Computer and Information Science |volume=1342 |language=en |location=Cham |publisher=Springer International Publishing |pages=267–281 |doi=10.1007/978-3-030-66151-9_17 |arxiv=2103.10097 |isbn=978-3-030-66151-9 |s2cid=232269854 |access-date=2021-03-26 |archive-date=2021-06-27 |archive-url=https://web.archive.org/web/20210627074505/https://link.springer.com/chapter/10.1007%2F978-3-030-66151-9_17 |url-status=live }}&lt;/ref&gt;<br /> <br /> [[Feed-forward neural network]]s are usually fully connected networks, that is, each neuron in one [[Layer (deep learning)|layer]] is connected to all neurons in the next [[layer (deep learning)|layer]]. The &quot;full connectivity&quot; of these networks make them prone to [[overfitting]] data. Typical ways of regularization, or preventing overfitting, include: penalizing parameters during training (such as weight decay) or trimming connectivity (skipped connections, dropout, etc.) 
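<br /> <br /> Both kinds of regularization just mentioned can be expressed compactly; the following minimal sketch (PyTorch chosen arbitrarily) penalizes parameters with weight decay and trims connectivity with dropout. The layer sizes, the assumed 28 × 28 single-channel input, the weight-decay factor, and the dropout probability are illustrative assumptions, not recommendations.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn, optim

# Trimming connectivity: dropout randomly zeroes hidden activations during training.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3),    # assumes 28x28 single-channel inputs
    nn.ReLU(),
    nn.Flatten(),
    nn.Dropout(p=0.5),                 # each flattened activation is dropped with probability 0.5
    nn.Linear(8 * 26 * 26, 10),
)

# Penalizing parameters during training: weight decay adds an L2 penalty to the weights.
optimizer = optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
&lt;/syntaxhighlight&gt;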
Robust datasets also increase the probability that CNNs will learn the generalized principles that characterize a given dataset rather than the biases of a poorly-populated set.&lt;ref&gt;{{Cite journal |last=Kurtzman |first=Thomas |date=August 20, 2019 |title=Hidden bias in the DUD-E dataset leads to misleading performance of deep learning in structure-based virtual screening |journal=PLOS ONE|volume=14 |issue=8 |pages=e0220113 |doi=10.1371/journal.pone.0220113 |pmid=31430292 |pmc=6701836 |bibcode=2019PLoSO..1420113C |doi-access=free }}&lt;/ref&gt; <br /> <br /> Convolutional networks were [[mathematical biology|inspired]] by [[biological]] processes&lt;ref name=fukuneoscholar/&gt;&lt;ref name=&quot;hubelwiesel1968&quot;/&gt;&lt;ref name=&quot;intro&quot;/&gt;&lt;ref name=&quot;robust face detection&quot;&gt;{{cite journal |last=Matusugu |first=Masakazu |year=2003 |title=Subject independent facial expression recognition with robust face detection using a convolutional neural network |url=http://www.iro.umontreal.ca/~pift6080/H09/documents/papers/sparse/matsugo_etal_face_expression_conv_nnet.pdf |journal=Neural Networks |volume=16 |issue=5 |pages=555–559 |doi=10.1016/S0893-6080(03)00115-1 |pmid=12850007 |author2=Katsuhiko Mori |author3=Yusuke Mitari |author4=Yuji Kaneda |access-date=17 November 2013 |archive-date=13 December 2013 |archive-url=https://web.archive.org/web/20131213022740/http://www.iro.umontreal.ca/~pift6080/H09/documents/papers/sparse/matsugo_etal_face_expression_conv_nnet.pdf |url-status=live }}&lt;/ref&gt; in that the connectivity pattern between [[artificial neuron|neurons]] resembles the organization of the animal [[visual cortex]]. Individual [[cortical neuron]]s respond to stimuli only in a restricted region of the [[visual field]] known as the [[receptive field]]. The receptive fields of different neurons partially overlap such that they cover the entire visual field.<br /> <br /> CNNs use relatively little pre-processing compared to other [[image classification|image classification algorithms]]. This means that the network learns to optimize the [[filter (signal processing)|filters]] (or kernels) through automated learning, whereas in traditional algorithms these filters are [[feature engineering|hand-engineered]]. This independence from prior knowledge and human intervention in feature extraction is a major advantage.{{To whom?&lt;!--e.g. to the programmers? the users? the CNN? --&gt;|date=April 2023}}<br /> <br /> {{TOC limit|3}}<br /> <br /> == Architecture ==<br /> [[File:Comparison image neural networks.svg|thumb|480px|Comparison of the LeNet and AlexNet convolution, pooling and dense layers&lt;br&gt;(The AlexNet image size should be 227×227×3, instead of 224×224×3, so that the math comes out right. The original paper stated different numbers, but Andrej Karpathy, the head of computer vision at Tesla, said it should be 227×227×3, noting that the paper does not explain why 224×224×3 was used. The next convolution should be 11×11 with stride 4: 55×55×96 (instead of 54×54×96). It would be calculated, for example, as: [(input width 227 - kernel width 11) / stride 4] + 1 = [(227 - 11) / 4] + 1 = 55. Since the output height equals the output width, the feature map is 55×55.)]]<br /> {{Main|Layer (deep learning)}}<br /> A convolutional neural network consists of an input layer, [[Artificial neural network#Organization|hidden layers]] and an output layer. In a convolutional neural network, the hidden layers include one or more layers that perform convolutions. 
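<br /> <br /> For illustration, a minimal network of this kind is sketched below; PyTorch is used purely as an example, and the 96 × 96 RGB input, the channel counts, and the ten-class output layer are arbitrary assumptions.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

# A minimal CNN: convolutional hidden layers followed by a fully connected output layer.
# Shapes follow the (batch, channels, height, width) convention.
model = nn.Sequential(
    nn.Conv2d(in_channels=3, out_channels=16, kernel_size=5),  # hidden convolutional layer
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),                               # downsampling (pooling) layer
    nn.Conv2d(16, 32, kernel_size=5),
    nn.ReLU(),
    nn.MaxPool2d(kernel_size=2),
    nn.Flatten(),
    nn.Linear(32 * 21 * 21, 10),                               # fully connected output layer (10 classes assumed)
)

x = torch.randn(1, 3, 96, 96)   # one 96x96 RGB image (hypothetical input size)
print(model(x).shape)           # torch.Size([1, 10])
&lt;/syntaxhighlight&gt;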
Typically this includes a layer that performs a [[dot product]] of the convolution kernel with the layer's input matrix. This product is usually the [[Frobenius inner product]], and its activation function is commonly [[rectifier (neural networks)|ReLU]]. As the convolution kernel slides along the input matrix for the layer, the convolution operation generates a feature map, which in turn contributes to the input of the next layer. This is followed by other layers such as pooling layers, fully connected layers, and normalization layers.<br /> <br /> === Convolutional layers ===<br /> In a CNN, the input is a [[Tensor (machine learning)|tensor]] with shape: (number of inputs) × (input height) × (input width) × (input [[channel (digital image)|channels]]). After passing through a convolutional layer, the image becomes abstracted to a feature map, also called an activation map, with shape: (number of inputs) × (feature map height) × (feature map width) × (feature map [[channel (digital image)|channels]]).<br /> <br /> Convolutional layers convolve the input and pass its result to the next layer. This is similar to the response of a neuron in the visual cortex to a specific stimulus.&lt;ref name=&quot;deeplearning&quot;&gt;{{cite web |title=Convolutional Neural Networks (LeNet) – DeepLearning 0.1 documentation |url=http://deeplearning.net/tutorial/lenet.html |work=DeepLearning 0.1 |publisher=LISA Lab |access-date=31 August 2013 |archive-date=28 December 2017 |archive-url=https://web.archive.org/web/20171228091645/http://deeplearning.net/tutorial/lenet.html |url-status=dead }}&lt;/ref&gt; Each convolutional neuron processes data only for its [[receptive field]]. Although [[multilayer perceptron|fully connected feedforward neural networks]] can be used to learn features and classify data, this architecture is generally impractical for larger inputs (e.g., high-resolution images), which would require massive numbers of neurons because each pixel is a relevant input feature. A fully connected layer for an image of size 100 × 100 has 10,000 weights for ''each'' neuron in the second layer. Convolution reduces the number of free parameters, allowing the network to be deeper.&lt;ref name=&quot;auto1&quot;/&gt; For example, using a 5 × 5 tiling region, each with the same shared weights, requires only 25 neurons. Using regularized weights over fewer parameters avoids the vanishing gradients and exploding gradients problems seen during [[backpropagation]] in earlier neural networks.&lt;ref name=&quot;auto3&quot;/&gt;&lt;ref name=&quot;auto2&quot;/&gt; <br /> <br /> To speed processing, standard convolutional layers can be replaced by depthwise separable convolutional layers,&lt;ref&gt;{{Cite arXiv |last=Chollet |first=François |date=2017-04-04 |title=Xception: Deep Learning with Depthwise Separable Convolutions |class=cs.CV |eprint=1610.02357 }}&lt;/ref&gt; which are based on a depthwise convolution followed by a pointwise convolution. The ''depthwise convolution'' is a spatial convolution applied independently over each channel of the input tensor, while the ''pointwise convolution'' is a standard convolution restricted to the use of &lt;math&gt;1\times1&lt;/math&gt; kernels.<br /> <br /> === Pooling layers ===<br /> Convolutional networks may include local and/or global pooling layers along with traditional convolutional layers. Pooling layers reduce the dimensions of data by combining the outputs of neuron clusters at one layer into a single neuron in the next layer. 
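<br /> <br /> To make the pooling operation concrete, the following minimal sketch (PyTorch chosen arbitrarily, with made-up feature-map values) shows local pooling over 2 × 2 clusters and a global pooling that collapses the whole map into a single neuron per channel.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

# One 4x4 feature map with a single channel (batch size 1).
fmap = torch.tensor([[[[1., 2., 5., 6.],
                       [3., 4., 7., 8.],
                       [9., 7., 1., 0.],
                       [8., 6., 2., 3.]]]])

# Local pooling over non-overlapping 2x2 clusters of neurons.
max_pooled = nn.MaxPool2d(kernel_size=2)(fmap)   # 2x2 output holding 4, 8, 9, 3
avg_pooled = nn.AvgPool2d(kernel_size=2)(fmap)   # 2x2 output holding 2.5, 6.5, 7.5, 1.5

# Global pooling collapses the whole feature map into a single neuron per channel.
global_max = fmap.amax(dim=(2, 3))               # a single value, 9
&lt;/syntaxhighlight&gt;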
Local pooling combines small clusters; tiling sizes such as 2 × 2 are commonly used. Global pooling acts on all the neurons of the feature map.&lt;ref name=&quot;flexible&quot;/&gt;&lt;ref&gt;{{cite web |last=[[Alex Krizhevsky |Krizhevsky]] |first=Alex |title=ImageNet Classification with Deep Convolutional Neural Networks |url=https://image-net.org/static_files/files/supervision.pdf |access-date=17 November 2013 |archive-date=25 April 2021 |archive-url=https://web.archive.org/web/20210425025127/http://www.image-net.org/static_files/files/supervision.pdf |url-status=live }}&lt;/ref&gt; Two types of pooling are in common use: max and average. ''Max pooling'' uses the maximum value of each local cluster of neurons in the feature map,&lt;ref name=Yamaguchi111990&gt;{{cite conference |title=A Neural Network for Speaker-Independent Isolated Word Recognition |last1=Yamaguchi |first1=Kouichi |last2=Sakamoto |first2=Kenji |last3=Akabane |first3=Toshio |last4=Fujimoto |first4=Yoshiji |date=November 1990 |location=Kobe, Japan |conference=First International Conference on Spoken Language Processing (ICSLP 90) |url=https://www.isca-speech.org/archive/icslp_1990/i90_1077.html |access-date=2019-09-04 |archive-date=2021-03-07 |archive-url=https://web.archive.org/web/20210307233750/https://www.isca-speech.org/archive/icslp_1990/i90_1077.html |url-status=dead }}&lt;/ref&gt;&lt;ref name=&quot;mcdns&quot;&gt;{{cite book |last1=Ciresan |first1=Dan |first2=Ueli |last2=Meier |first3=Jürgen |last3=Schmidhuber |title=2012 IEEE Conference on Computer Vision and Pattern Recognition |chapter=Multi-column deep neural networks for image classification |date=June 2012 |pages=3642–3649 |doi=10.1109/CVPR.2012.6248110 |arxiv=1202.2745 |isbn=978-1-4673-1226-4 |oclc=812295155 |publisher=[[Institute of Electrical and Electronics Engineers]] (IEEE) |location=New York, NY |citeseerx=10.1.1.300.3283 |s2cid=2161592}}&lt;/ref&gt; while ''average pooling'' takes the average value.<br /> <br /> === Fully connected layers ===<br /> <br /> Fully connected layers connect every neuron in one layer to every neuron in another layer. In principle, this is the same as in a traditional [[multilayer perceptron]] neural network (MLP). The flattened feature map is passed through a fully connected layer to classify the image.<br /> <br /> === Receptive field ===<br /> In neural networks, each neuron receives input from some number of locations in the previous layer. In a convolutional layer, each neuron receives input from only a restricted area of the previous layer called the neuron's ''receptive field''. Typically the area is a square (e.g. 5 by 5 neurons). In a fully connected layer, by contrast, the receptive field is the ''entire previous layer''. Thus, in each convolutional layer, each neuron takes input from a larger area in the input than previous layers. This is due to applying the convolution repeatedly, which takes into account the value of a pixel as well as its surrounding pixels. When using dilated layers, the number of pixels in the receptive field remains constant, but the field is more sparsely populated as its dimensions grow when combining the effect of several layers.<br /> <br /> To manipulate the receptive field size as desired, there are some alternatives to the standard convolutional layer. 
For example, atrous or dilated convolution&lt;ref&gt;{{Cite arXiv|last1=Yu |first1=Fisher |last2=Koltun |first2=Vladlen |date=2016-04-30 |title=Multi-Scale Context Aggregation by Dilated Convolutions |class=cs.CV |eprint=1511.07122 }}&lt;/ref&gt;&lt;ref&gt;{{Cite arXiv|last1=Chen |first1=Liang-Chieh |last2=Papandreou |first2=George |last3=Schroff |first3=Florian |last4=Adam |first4=Hartwig |date=2017-12-05 |title=Rethinking Atrous Convolution for Semantic Image Segmentation |class=cs.CV |eprint=1706.05587 }}&lt;/ref&gt; expands the receptive field size without increasing the number of parameters by interleaving visible and blind regions. Moreover, a single dilated convolutional layer can comprise filters with multiple dilation ratios,&lt;ref&gt;{{Cite arXiv|last1=Duta |first1=Ionut Cosmin |last2=Georgescu |first2=Mariana Iuliana |last3=Ionescu |first3=Radu Tudor |date=2021-08-16 |title=Contextual Convolutional Neural Networks |class=cs.CV |eprint=2108.07387 }}&lt;/ref&gt; thus having a variable receptive field size.<br /> <br /> === Weights ===<br /> Each neuron in a neural network computes an output value by applying a specific function to the input values received from the receptive field in the previous layer. The function that is applied to the input values is determined by a vector of weights and a bias (typically real numbers). Learning consists of iteratively adjusting these biases and weights.<br /> <br /> The vectors of weights and biases are called ''filters'' and represent particular [[feature (machine learning)|feature]]s of the input (e.g., a particular shape). A distinguishing feature of CNNs is that many neurons can share the same filter. This reduces the [[memory footprint]] because a single bias and a single vector of weights are used across all receptive fields that share that filter, as opposed to each receptive field having its own bias and vector weighting.&lt;ref name=&quot;LeCun&quot;&gt;{{cite web |url=http://yann.lecun.com/exdb/lenet/ |title=LeNet-5, convolutional neural networks |last=LeCun |first=Yann |access-date=16 November 2013 |archive-date=24 February 2021 |archive-url=https://web.archive.org/web/20210224225707/http://yann.lecun.com/exdb/lenet/ |url-status=live }}&lt;/ref&gt;<br /> <br /> == History ==<br /> <br /> CNN are often compared to the way the brain achieves vision processing in living [[organisms]].&lt;ref name=&quot;auto&quot;&gt;{{cite news |url=https://becominghuman.ai/from-human-vision-to-computer-vision-convolutional-neural-network-part3-4-24b55ffa7045 |title=From Human Vision to Computer Vision — Convolutional Neural Network(Part3/4) |first=Puttatida |last=Mahapattanakul |date=November 11, 2019 |website=Medium |access-date=May 25, 2021 |archive-date=May 25, 2021 |archive-url=https://web.archive.org/web/20210525073017/https://becominghuman.ai/from-human-vision-to-computer-vision-convolutional-neural-network-part3-4-24b55ffa7045 |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{Cite journal |last1=van Dyck |first1=Leonard Elia |last2=Kwitt |first2=Roland |last3=Denzler |first3=Sebastian Jochen |last4=Gruber |first4=Walter Roland |date=2021 |title=Comparing Object Recognition in Humans and Deep Convolutional Neural Networks—An Eye Tracking Study |journal=Frontiers in Neuroscience |volume=15 |page=750639 |doi=10.3389/fnins.2021.750639 |pmid=34690686 |pmc=8526843 |issn=1662-453X |doi-access=free }}&lt;/ref&gt;<br /> <br /> === Receptive fields in the visual cortex ===<br /> Work by [[David H. 
Hubel|Hubel]] and [[Torsten Wiesel|Wiesel]] in the 1950s and 1960s showed that cat [[visual cortex|visual cortices]] contain neurons that individually respond to small regions of the [[visual field]]. Provided the eyes are not moving, the region of visual space within which visual stimuli affect the firing of a single neuron is known as its [[receptive field]].&lt;ref name=&quot;:4&quot;/&gt; Neighboring cells have similar and overlapping receptive fields. &lt;ref name=&quot;auto&quot;/&gt; Receptive field size and location varies systematically across the cortex to form a complete map of visual space. &lt;ref name=&quot;auto&quot;/&gt;{{citation needed|date=October 2017}} The cortex in each hemisphere represents the contralateral [[visual field]].{{citation needed|date=October 2017}}<br /> <br /> Their 1968 paper identified two basic visual cell types in the brain:&lt;ref name=&quot;hubelwiesel1968&quot;&gt;{{cite journal |title=Receptive fields and functional architecture of monkey striate cortex |journal=The Journal of Physiology |date=1968-03-01 |issn=0022-3751 |pmc=1557912 |pmid=4966457 |pages=215–243 |volume=195 |issue=1 |first1=D. H. |last1=Hubel |first2=T. N. |last2=Wiesel |doi=10.1113/jphysiol.1968.sp008455}}&lt;/ref&gt;<br /> <br /> *[[simple cell]]s, whose output is maximized by straight edges having particular orientations within their receptive field<br /> *[[complex cell]]s, which have larger [[receptive field]]s, whose output is insensitive to the exact position of the edges in the field.<br /> <br /> Hubel and Wiesel also proposed a cascading model of these two types of cells for use in pattern recognition tasks.&lt;ref&gt;{{cite book<br /> |title=Brain and visual perception: the story of a 25-year collaboration<br /> |author=David H. Hubel and Torsten N. Wiesel<br /> |publisher=Oxford University Press US<br /> |year=2005<br /> |isbn=978-0-19-517618-6<br /> |page=106<br /> |url=https://books.google.com/books?id=8YrxWojxUA4C&amp;pg=PA106<br /> |access-date=2019-01-18<br /> |archive-date=2023-10-16<br /> |archive-url=https://web.archive.org/web/20231016190414/https://books.google.com/books?id=8YrxWojxUA4C&amp;pg=PA106#v=onepage&amp;q&amp;f=false<br /> |url-status=live<br /> }}&lt;/ref&gt;&lt;ref name=&quot;:4&quot;&gt;{{cite journal |pmc=1363130 |pmid=14403679 |volume=148 |issue=3 |title=Receptive fields of single neurones in the cat's striate cortex |date=October 1959 |journal=J. Physiol. |pages=574–91 |last1=Hubel |first1=DH |last2=Wiesel |first2=TN |doi=10.1113/jphysiol.1959.sp006308}}&lt;/ref&gt;<br /> <br /> === Neocognitron, origin of the CNN architecture ===<br /> <br /> The &quot;[[neocognitron]]&quot;&lt;ref name=fukuneoscholar&gt;{{cite journal |last1=Fukushima |first1=K. 
|year=2007 |title=Neocognitron |journal=Scholarpedia |volume=2 |issue=1 |page=1717 |doi=10.4249/scholarpedia.1717 |bibcode=2007SchpJ...2.1717F |doi-access=free}}&lt;/ref&gt; was introduced by [[Kunihiko Fukushima]] in 1980.&lt;ref name=&quot;intro&quot;&gt;{{cite journal |last=Fukushima |first=Kunihiko |title=Neocognitron: A Self-organizing Neural Network Model for a Mechanism of Pattern Recognition Unaffected by Shift in Position |journal=Biological Cybernetics |year=1980 |volume=36 |issue=4 |pages=193–202 |url=https://www.cs.princeton.edu/courses/archive/spr08/cos598B/Readings/Fukushima1980.pdf |access-date=16 November 2013 |doi=10.1007/BF00344251 |pmid=7370364 |s2cid=206775608 |archive-date=3 June 2014 |archive-url=https://web.archive.org/web/20140603013137/http://www.cs.princeton.edu/courses/archive/spr08/cos598B/Readings/Fukushima1980.pdf |url-status=live }}&lt;/ref&gt;&lt;ref name=mcdns/&gt;&lt;ref&gt;{{cite journal |first1=Yann |last1=LeCun |first2=Yoshua |last2=Bengio |first3=Geoffrey |last3=Hinton |title=Deep learning |journal=Nature |volume=521 |issue=7553 |year=2015 |pages=436–444 |doi=10.1038/nature14539 |pmid=26017442 |bibcode=2015Natur.521..436L |s2cid=3074096}}&lt;/ref&gt;<br /> It was inspired by the above-mentioned work of Hubel and Wiesel. The neocognitron introduced the two basic types of layers in CNNs: convolutional layers, and downsampling layers. A convolutional layer contains units whose receptive fields cover a patch of the previous layer. The weight vector (the set of adaptive parameters) of such a unit is often called a filter. Units can share filters. Downsampling layers contain units whose receptive fields cover patches of previous convolutional layers. Such a unit typically computes the average of the activations of the units in its patch. This downsampling helps to correctly classify objects in visual scenes even when the objects are shifted.<br /> <br /> In 1969, [[Kunihiko Fukushima]] also introduced the [[rectifier (neural networks)|ReLU]] (rectified linear unit) [[activation function]].&lt;ref name=&quot;Fukushima1969&quot;&gt;{{cite journal |first1=K. |last1=Fukushima |title=Visual feature extraction by a multilayered network of analog threshold elements |journal=IEEE Transactions on Systems Science and Cybernetics |volume=5 |issue=4 |date=1969 |pages=322–333 |doi=10.1109/TSSC.1969.300225}}&lt;/ref&gt;&lt;ref name=DLhistory&gt;{{cite arXiv|last=Schmidhuber|first=Juergen|author-link=Juergen Schmidhuber|date=2022|title=Annotated History of Modern AI and Deep Learning |class=cs.NE|eprint=2212.11279}}&lt;/ref&gt; The rectifier has become the most popular activation function for CNNs and [[deep learning|deep neural networks]] in general.&lt;ref&gt;{{cite arXiv |last1=Ramachandran |first1=Prajit |last2=Barret |first2=Zoph |last3=Quoc |first3=V. Le |date=October 16, 2017 |title=Searching for Activation Functions |eprint=1710.05941 |class=cs.NE}}&lt;/ref&gt;<br /> <br /> In a variant of the neocognitron called the cresceptron, instead of using Fukushima's spatial averaging, J. Weng et al. in 1993 introduced a method called max-pooling where a downsampling unit computes the maximum of the activations of the units in its patch.&lt;ref name=&quot;weng1993&quot;&gt;{{cite book |first1=J |last1=Weng |first2=N |last2=Ahuja |first3=TS |last3=Huang |title=1993 (4th) International Conference on Computer Vision |chapter=Learning recognition and segmentation of 3-D objects from 2-D images |s2cid=8619176 |journal=Proc. 4th International Conf. 
Computer Vision |year=1993 |pages=121–128 |doi=10.1109/ICCV.1993.378228 |isbn=0-8186-3870-2}}&lt;/ref&gt;{{clarify|date=April 2023}}&lt;!--a lower paragraph states that it was introduced in 1990--&gt; Max-pooling is often used in modern CNNs.&lt;ref name=&quot;schdeepscholar&quot;/&gt;<br /> <br /> Several supervised and unsupervised learning algorithms have been proposed over the decades to train the weights of a neocognitron.&lt;ref name=fukuneoscholar/&gt; Today, however, the CNN architecture is usually trained through [[backpropagation]].<br /> <br /> The [[neocognitron]] is the first CNN which requires units located at multiple network positions to have shared weights.<br /> <br /> Convolutional neural networks were presented at the Neural Information Processing Workshop in 1987, automatically analyzing time-varying signals by replacing learned multiplication with convolution in time, and demonstrated for speech recognition.&lt;ref&gt;{{cite journal |last=Homma |first=Toshiteru |author2=Les Atlas |author3=Robert Marks II |year=1988 |title=An Artificial Neural Network for Spatio-Temporal Bipolar Patters: Application to Phoneme Classification |url=https://proceedings.neurips.cc/paper/1987/file/98f13708210194c475687be6106a3b84-Paper.pdf |journal=Advances in Neural Information Processing Systems |volume=1 |pages=31–40 |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331211142/https://proceedings.neurips.cc/paper/1987/file/98f13708210194c475687be6106a3b84-Paper.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> === Time delay neural networks ===<br /> The [[time delay neural network]] (TDNN) was introduced in 1987 by [[Alex Waibel]] et al. and was one of the first convolutional networks, as it achieved shift invariance.&lt;ref name=Waibel1987&gt;{{cite conference |title=Phoneme Recognition Using Time-Delay Neural Networks |last1=Waibel |first1=Alex |date=December 1987 |location=Tokyo, Japan |conference=Meeting of the Institute of Electrical, Information and Communication Engineers (IEICE)}}&lt;/ref&gt; It did so by utilizing weight sharing in combination with [[backpropagation]] training.&lt;ref name=&quot;speechsignal&quot;&gt;[[Alex Waibel|Alexander Waibel]] et al., ''[http://www.inf.ufrgs.br/~engel/data/media/file/cmp121/waibel89_TDNN.pdf Phoneme Recognition Using Time-Delay Neural Networks] {{Webarchive|url=https://web.archive.org/web/20210225163001/http://www.inf.ufrgs.br/~engel/data/media/file/cmp121/waibel89_TDNN.pdf |date=2021-02-25 }}'' IEEE Transactions on Acoustics, Speech, and Signal Processing, Volume 37, No. 3, pp. 328. - 339 March 1989.&lt;/ref&gt; Thus, while also using a pyramidal structure as in the neocognitron, it performed a global optimization of the weights instead of a local one.&lt;ref name=Waibel1987/&gt;<br /> <br /> TDNNs are convolutional networks that share weights along the temporal dimension.&lt;ref&gt;{{cite encyclopedia |last1=LeCun |first1=Yann |last2=Bengio |first2=Yoshua |editor-last=Arbib |editor-first=Michael A. 
|title=Convolutional networks for images, speech, and time series |encyclopedia=The handbook of brain theory and neural networks |edition=Second |year=1995 |publisher=The MIT press |pages=276–278 |url=https://www.researchgate.net/publication/2453996 |access-date=2019-12-03 |archive-date=2020-07-28 |archive-url=https://web.archive.org/web/20200728164116/https://www.researchgate.net/publication/2453996_Convolutional_Networks_for_Images_Speech_and_Time-Series |url-status=live }}&lt;/ref&gt; They allow speech signals to be processed time-invariantly. In 1990 Hampshire and Waibel introduced a variant which performs a two dimensional convolution.&lt;ref name=&quot;Hampshire1990&quot;&gt;John B. Hampshire and Alexander Waibel, ''[https://proceedings.neurips.cc/paper/1989/file/979d472a84804b9f647bc185a877a8b5-Paper.pdf Connectionist Architectures for Multi-Speaker Phoneme Recognition] {{Webarchive|url=https://web.archive.org/web/20220331225059/https://proceedings.neurips.cc/paper/1989/file/979d472a84804b9f647bc185a877a8b5-Paper.pdf |date=2022-03-31 }}'', Advances in Neural Information Processing Systems, 1990, Morgan Kaufmann.&lt;/ref&gt; Since these TDNNs operated on spectrograms, the resulting phoneme recognition system was invariant to both shifts in time and in frequency. This inspired [[translation invariance]] in image processing with CNNs.&lt;ref name=&quot;speechsignal&quot;/&gt; The tiling of neuron outputs can cover timed stages.&lt;ref name=&quot;video quality&quot;/&gt;<br /> <br /> TDNNs now {{When|date=August 2022}} achieve the best performance in far distance speech recognition.&lt;ref name=Ko2017&gt;{{cite conference |title=A Study on Data Augmentation of Reverberant Speech for Robust Speech Recognition |last1=Ko |first1=Tom |last2=Peddinti |first2=Vijayaditya |last3=Povey |first3=Daniel |last4=Seltzer |first4=Michael L. |last5=Khudanpur |first5=Sanjeev |date=March 2018 |location=New Orleans, LA, USA |conference=The 42nd IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2017) |url=https://www.danielpovey.com/files/2017_icassp_reverberation.pdf |access-date=2019-09-04 |archive-date=2018-07-08 |archive-url=https://web.archive.org/web/20180708072725/http://danielpovey.com/files/2017_icassp_reverberation.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> ==== Max pooling ====<br /> In 1990 Yamaguchi et al. introduced the concept of max pooling, which is a fixed filtering operation that calculates and propagates the maximum value of a given region. They did so by combining TDNNs with max pooling in order to realize a speaker independent isolated word recognition system.&lt;ref name=&quot;Yamaguchi111990&quot;/&gt; In their system they used several TDNNs per word, one for each [[syllable]]. The results of each TDNN over the input signal were combined using max pooling and the outputs of the pooling layers were then passed on to networks performing the actual word classification.<br /> <br /> === Image recognition with CNNs trained by gradient descent ===<br /> A system to recognize hand-written [[ZIP Code]] numbers&lt;ref&gt;Denker, J S, Gardner, W R, Graf, H. 
P, Henderson, D, Howard, R E, Hubbard, W, Jackel, L D, BaIrd, H S, and Guyon (1989) [http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.852.5499&amp;rep=rep1&amp;type=pdf Neural network recognizer for hand-written zip code digits] {{Webarchive|url=https://web.archive.org/web/20180804013916/http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.852.5499&amp;rep=rep1&amp;type=pdf |date=2018-08-04 }}, AT&amp;T Bell Laboratories&lt;/ref&gt; involved convolutions in which the kernel coefficients had been laboriously hand designed.&lt;ref name=&quot;:2&quot;&gt;Y. LeCun, B. Boser, J. S. Denker, D. Henderson, R. E. Howard, W. Hubbard, L. D. Jackel, [http://yann.lecun.com/exdb/publis/pdf/lecun-89e.pdf Backpropagation Applied to Handwritten Zip Code Recognition] {{Webarchive|url=https://web.archive.org/web/20200110090230/http://yann.lecun.com/exdb/publis/pdf/lecun-89e.pdf |date=2020-01-10 }}; AT&amp;T Bell Laboratories&lt;/ref&gt;<br /> <br /> [[Yann LeCun]] et al. (1989)&lt;ref name=&quot;:2&quot;/&gt; used back-propagation to learn the convolution kernel coefficients directly from images of hand-written numbers. Learning was thus fully automatic, performed better than manual coefficient design, and was suited to a broader range of image recognition problems and image types.<br /> <br /> Wei Zhang et al. (1988)&lt;ref name=&quot;:0&quot;/&gt;&lt;ref name=&quot;:1&quot;/&gt; used back-propagation to train the convolution kernels of a CNN for alphabets recognition. The model was called Shift-Invariant Artificial Neural Network (SIANN) before the name CNN was coined later in the early 1990s. Wei Zhang et al. also applied the same CNN without the last fully connected layer for medical image object segmentation (1991)&lt;ref name=&quot;:wz1991&quot;/&gt; and breast cancer detection in mammograms (1994).&lt;ref name=&quot;:wz1994&quot;/&gt; <br /> <br /> This approach became a foundation of modern [[computer vision]].<br /> <br /> ==== LeNet-5 ====<br /> {{Main|LeNet}}<br /> LeNet-5, a pioneering 7-level convolutional network by [[Yann LeCun|LeCun]] et al. in 1995,&lt;ref name=&quot;lecun95&quot;&gt;http://yann.lecun.com/exdb/publis/pdf/lecun-95a.pdf {{Webarchive|url=https://web.archive.org/web/20230502220356/http://yann.lecun.com/exdb/publis/pdf/lecun-95a.pdf |date=2023-05-02 }} {{bare URL PDF|date=May 2023}}&lt;/ref&gt; that classifies digits, was applied by several banks to recognize hand-written numbers on checks ({{Lang-en-GB|cheques}}) digitized in 32x32 pixel images. The ability to process higher-resolution images requires larger and more layers of convolutional neural networks, so this technique is constrained by the availability of computing resources.<br /> <br /> ===Shift-invariant neural network===<br /> <br /> A shift-invariant neural network was proposed by Wei Zhang et al. for image character recognition in 1988.&lt;ref name=&quot;:0&quot;/&gt;&lt;ref name=&quot;:1&quot;/&gt; It is a modified Neocognitron by keeping only the convolutional interconnections between the image feature layers and the last fully connected layer. The model was trained with back-propagation. 
The training algorithm were further improved in 1991&lt;ref&gt;{{cite journal |last=Zhang |first=Wei |date=1991 |title=Error Back Propagation with Minimum-Entropy Weights: A Technique for Better Generalization of 2-D Shift-Invariant NNs |url=https://drive.google.com/file/d/0B65v6Wo67Tk5dkJTcEMtU2c5Znc/view?usp=sharing |journal=Proceedings of the International Joint Conference on Neural Networks |access-date=2016-09-22 |archive-date=2017-02-06 |archive-url=https://web.archive.org/web/20170206155801/https://drive.google.com/file/d/0B65v6Wo67Tk5dkJTcEMtU2c5Znc/view?usp=sharing |url-status=live }}&lt;/ref&gt; to improve its generalization ability. The model architecture was modified by removing the last fully connected layer and applied for medical image segmentation (1991)&lt;ref name=&quot;:wz1991&quot;&gt;{{cite journal |last=Zhang |first=Wei |date=1991 |title=Image processing of human corneal endothelium based on a learning network |url=https://drive.google.com/file/d/0B65v6Wo67Tk5cm5DTlNGd0NPUmM/view?usp=sharing |journal=Applied Optics |volume=30 |issue=29 |pages=4211–7 |doi=10.1364/AO.30.004211 |pmid=20706526 |bibcode=1991ApOpt..30.4211Z |access-date=2016-09-22 |archive-date=2017-02-06 |archive-url=https://web.archive.org/web/20170206122612/https://drive.google.com/file/d/0B65v6Wo67Tk5cm5DTlNGd0NPUmM/view?usp=sharing |url-status=live }}&lt;/ref&gt; and automatic detection of breast cancer in [[mammography|mammograms (1994)]].&lt;ref name=&quot;:wz1994&quot;&gt;{{cite journal |last=Zhang |first=Wei |date=1994 |title=Computerized detection of clustered microcalcifications in digital mammograms using a shift-invariant artificial neural network |url=https://drive.google.com/file/d/0B65v6Wo67Tk5Ml9qeW5nQ3poVTQ/view?usp=sharing |journal=Medical Physics |volume=21 |issue=4 |pages=517–24 |doi=10.1118/1.597177 |pmid=8058017 |bibcode=1994MedPh..21..517Z |access-date=2016-09-22 |archive-date=2017-02-06 |archive-url=https://web.archive.org/web/20170206030321/https://drive.google.com/file/d/0B65v6Wo67Tk5Ml9qeW5nQ3poVTQ/view?usp=sharing |url-status=live }}&lt;/ref&gt;<br /> <br /> A different convolution-based design was proposed in 1988&lt;ref&gt;Daniel Graupe, Ruey Wen Liu, George S Moschytz.&quot;[https://www.researchgate.net/profile/Daniel_Graupe2/publication/241130197_Applications_of_signal_and_image_processing_to_medicine/links/575eef7e08aec91374b42bd2.pdf Applications of neural networks to medical signal processing] {{Webarchive|url=https://web.archive.org/web/20200728164114/https://www.researchgate.net/profile/Daniel_Graupe2/publication/241130197_Applications_of_signal_and_image_processing_to_medicine/links/575eef7e08aec91374b42bd2.pdf |date=2020-07-28 }}&quot;. In Proc. 27th IEEE Decision and Control Conf., pp. 343–347, 1988.&lt;/ref&gt; for application to decomposition of one-dimensional [[electromyography]] convolved signals via de-convolution. This design was modified in 1989 to other de-convolution-based designs.&lt;ref&gt;Daniel Graupe, Boris Vern, G. Gruener, Aaron Field, and Qiu Huang. &quot;[https://ieeexplore.ieee.org/abstract/document/100522/ Decomposition of surface EMG signals into single fiber action potentials by means of neural network] {{Webarchive|url=https://web.archive.org/web/20190904161656/https://ieeexplore.ieee.org/abstract/document/100522/ |date=2019-09-04 }}&quot;. Proc. IEEE International Symp. on Circuits and Systems, pp. 
1008–1011, 1989.&lt;/ref&gt;&lt;ref&gt;Qiu Huang, Daniel Graupe, Yi Fang Huang, Ruey Wen Liu.&quot;[http://www.academia.edu/download/42092095/graupe_huang_q_huang_yf_liu_rw_1989.pdf Identification of firing patterns of neuronal signals]{{dead link|date=July 2022|bot=medic}}{{cbignore|bot=medic}}.&quot; In Proc. 28th IEEE Decision and Control Conf., pp. 266–271, 1989. https://ieeexplore.ieee.org/document/70115 {{Webarchive|url=https://web.archive.org/web/20220331211138/https://ieeexplore.ieee.org/document/70115 |date=2022-03-31 }}&lt;/ref&gt;<br /> <br /> === Neural abstraction pyramid ===<br /> [[File:Neural Abstraction Pyramid.jpg|alt=Neural Abstraction Pyramid|thumb|Neural abstraction pyramid]]<br /> The feed-forward architecture of convolutional neural networks was extended in the neural abstraction pyramid&lt;ref&gt;{{cite book<br /> |last1=Behnke<br /> |first1=Sven<br /> |year=2003<br /> |title=Hierarchical Neural Networks for Image Interpretation<br /> |url=https://www.ais.uni-bonn.de/books/LNCS2766.pdf<br /> |series=Lecture Notes in Computer Science<br /> |volume=2766<br /> |publisher=Springer<br /> |doi=10.1007/b11963<br /> |isbn=978-3-540-40722-5<br /> |s2cid=1304548<br /> |access-date=2016-12-28<br /> |archive-date=2017-08-10<br /> |archive-url=https://web.archive.org/web/20170810020001/http://www.ais.uni-bonn.de/books/LNCS2766.pdf<br /> |url-status=live<br /> }}&lt;/ref&gt; by lateral and feedback connections. The resulting recurrent convolutional network allows for the flexible incorporation of contextual information to iteratively resolve local ambiguities. In contrast to previous models, image-like outputs at the highest resolution were generated, e.g., for semantic segmentation, image reconstruction, and object localization tasks.<br /> <br /> === GPU implementations ===<br /> Although CNNs were invented in the 1980s, their breakthrough in the 2000s required fast implementations on [[graphics processing unit]]s (GPUs).<br /> <br /> In 2004, it was shown by K. S. Oh and K. Jung that standard neural networks can be greatly accelerated on GPUs. Their implementation was 20 times faster than an equivalent implementation on [[CPU]].&lt;ref&gt;{{cite journal |last1=Oh |first1=KS |last2=Jung |first2=K |title=GPU implementation of neural networks. 
|journal=Pattern Recognition |date=2004 |volume=37 |issue=6 |pages=1311–1314 |doi=10.1016/j.patcog.2004.01.013 |bibcode=2004PatRe..37.1311O}}&lt;/ref&gt;&lt;ref name=&quot;schdeepscholar&quot;&gt;{{cite journal |last1=Schmidhuber |first1=Jürgen |title=Deep Learning |journal=Scholarpedia |url=http://www.scholarpedia.org/article/Deep_Learning |date=2015 |volume=10 |issue=11 |pages=1527–54 |pmid=16764513 |doi=10.1162/neco.2006.18.7.1527 |citeseerx=10.1.1.76.1541 |s2cid=2309950 |access-date=2019-01-20 |archive-date=2016-04-19 |archive-url=https://web.archive.org/web/20160419024349/http://www.scholarpedia.org/article/Deep_Learning |url-status=live }}&lt;/ref&gt; In 2005, another paper also emphasised the value of [[GPGPU]] for [[machine learning]].&lt;ref&gt;{{cite conference |author1=Dave Steinkraus |author2=Patrice Simard |author3=Ian Buck |title=12th International Conference on Document Analysis and Recognition (ICDAR 2005) |date=2005 |pages=1115–1119 |chapter-url=https://www.computer.org/csdl/proceedings-article/icdar/2005/24201115/12OmNylKAVX |chapter=Using GPUs for Machine Learning Algorithms |doi=10.1109/ICDAR.2005.251 |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331211138/https://www.computer.org/csdl/proceedings-article/icdar/2005/24201115/12OmNylKAVX |url-status=live }}&lt;/ref&gt;<br /> <br /> The first GPU-implementation of a CNN was described in 2006 by K. Chellapilla et al. Their implementation was 4 times faster than an equivalent implementation on CPU.&lt;ref&gt;{{cite book |author1=Kumar Chellapilla |author2=Sid Puri |author3=Patrice Simard |editor1-last=Lorette |editor1-first=Guy |title=Tenth International Workshop on Frontiers in Handwriting Recognition |date=2006 |publisher=Suvisoft |chapter-url=https://hal.inria.fr/inria-00112631/document |chapter=High Performance Convolutional Neural Networks for Document Processing |access-date=2016-03-14 |archive-date=2020-05-18 |archive-url=https://web.archive.org/web/20200518193413/https://hal.inria.fr/inria-00112631/document |url-status=live }}&lt;/ref&gt; Subsequent work also used GPUs, initially for other types of neural networks (different from CNNs), especially unsupervised neural networks.&lt;ref&gt;{{cite journal |last1=Hinton |first1=GE |last2=Osindero |first2=S |last3=Teh |first3=YW |title=A fast learning algorithm for deep belief nets. 
|journal=Neural Computation |date=Jul 2006 |volume=18 |issue=7 |pages=1527–54 |pmid=16764513 |doi=10.1162/neco.2006.18.7.1527 |citeseerx=10.1.1.76.1541 |s2cid=2309950}}&lt;/ref&gt;&lt;ref&gt;{{cite journal |last1=Bengio |first1=Yoshua |last2=Lamblin |first2=Pascal |last3=Popovici |first3=Dan |last4=Larochelle |first4=Hugo |title=Greedy Layer-Wise Training of Deep Networks |journal=Advances in Neural Information Processing Systems |date=2007 |pages=153–160 |url=https://proceedings.neurips.cc/paper/2006/file/5da713a690c067105aeb2fae32403405-Paper.pdf |access-date=2022-03-31 |archive-date=2022-06-02 |archive-url=https://web.archive.org/web/20220602144141/https://proceedings.neurips.cc/paper/2006/file/5da713a690c067105aeb2fae32403405-Paper.pdf |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{cite journal |last1=Ranzato |first1=MarcAurelio |last2=Poultney |first2=Christopher |last3=Chopra |first3=Sumit |last4=LeCun |first4=Yann |title=Efficient Learning of Sparse Representations with an Energy-Based Model |journal=Advances in Neural Information Processing Systems |date=2007 |url=http://yann.lecun.com/exdb/publis/pdf/ranzato-06.pdf |access-date=2014-06-26 |archive-date=2016-03-22 |archive-url=https://web.archive.org/web/20160322112400/http://yann.lecun.com/exdb/publis/pdf/ranzato-06.pdf |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{cite book |last1=Raina |first1=R |last2=Madhavan |first2=A |last3=Ng |first3=Andrew |title=Proceedings of the 26th Annual International Conference on Machine Learning |chapter=Large-scale deep unsupervised learning using graphics processors |journal=ICML |date=2009 |pages=873–880 |doi=10.1145/1553374.1553486 |isbn=9781605585161 |s2cid=392458 |chapter-url=http://robotics.stanford.edu/~ang/papers/icml09-LargeScaleUnsupervisedDeepLearningGPU.pdf |access-date=2019-09-04 |archive-date=2020-12-08 |archive-url=https://web.archive.org/web/20201208104513/http://robotics.stanford.edu/~ang/papers/icml09-LargeScaleUnsupervisedDeepLearningGPU.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> In 2010, Dan Ciresan et al. at [[IDSIA]] showed that even deep standard neural networks with many layers can be quickly trained on GPU by supervised learning through the old method known as [[backpropagation]]. Their network outperformed previous machine learning methods on the [[MNIST]] handwritten digits benchmark.&lt;ref&gt;{{cite journal |last1=Ciresan |first1=Dan |last2=Meier |first2=Ueli |last3=Gambardella |first3=Luca |last4=Schmidhuber |first4=Jürgen |title=Deep big simple neural nets for handwritten digit recognition. |journal=Neural Computation |date=2010 |volume=22 |issue=12 |pages=3207–3220 |doi=10.1162/NECO_a_00052 |pmid=20858131 |arxiv=1003.0358 |s2cid=1918673}}&lt;/ref&gt; In 2011, they extended this GPU approach to CNNs, achieving an acceleration factor of 60, with impressive results.&lt;ref name=&quot;flexible&quot;&gt;{{cite journal |last=Ciresan |first=Dan |author2=Ueli Meier |author3=Jonathan Masci |author4=Luca M. 
Gambardella |author5=Jurgen Schmidhuber |title=Flexible, High Performance Convolutional Neural Networks for Image Classification |journal=Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence-Volume Volume Two |year=2011 |volume=2 |pages=1237–1242 |url=https://people.idsia.ch/~juergen/ijcai2011.pdf |access-date=17 November 2013 |archive-date=5 April 2022 |archive-url=https://web.archive.org/web/20220405190128/https://people.idsia.ch/~juergen/ijcai2011.pdf |url-status=live }}&lt;/ref&gt; In 2011, they used such CNNs on GPU to win an image recognition contest where they achieved superhuman performance for the first time.&lt;ref&gt;{{cite web |url=https://benchmark.ini.rub.de/gtsrb_results.html |title=IJCNN 2011 Competition result table |website=OFFICIAL IJCNN2011 COMPETITION |language=en-US |access-date=2019-01-14 |date=2010 |archive-date=2021-01-17 |archive-url=https://web.archive.org/web/20210117024729/https://benchmark.ini.rub.de/gtsrb_results.html |url-status=live }}&lt;/ref&gt; Between May 15, 2011 and September 30, 2012, their CNNs won no less than four image competitions.&lt;ref&gt;{{cite web |url=https://people.idsia.ch/~juergen/computer-vision-contests-won-by-gpu-cnns.html |last1=Schmidhuber |first1=Jürgen |title=History of computer vision contests won by deep CNNs on GPU |language=en-US |access-date=14 January 2019 |date=17 March 2017 |archive-date=19 December 2018 |archive-url=https://web.archive.org/web/20181219224934/http://people.idsia.ch/~juergen/computer-vision-contests-won-by-gpu-cnns.html |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;schdeepscholar&quot;/&gt; In 2012, they also significantly improved on the best performance in the literature for multiple image [[database]]s, including the [[MNIST database]], the NORB database, the HWDB1.0 dataset (Chinese characters) and the [[CIFAR-10|CIFAR10 dataset]] (dataset of 60000 32x32 labeled [[RGB images]]).&lt;ref name=&quot;mcdns&quot;/&gt;<br /> <br /> Subsequently, a similar GPU-based CNN by Alex Krizhevsky et al. 
won the [[ImageNet Large Scale Visual Recognition Challenge]] 2012.&lt;ref name=&quot;:02&quot;/&gt; A very deep CNN with over 100 layers by Microsoft won the ImageNet 2015 contest.&lt;ref&gt;{{cite book |last1=He |first1=Kaiming |last2=Zhang |first2=Xiangyu |last3=Ren |first3=Shaoqing |last4=Sun |first4=Jian |title=2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR) |chapter=Deep Residual Learning for Image Recognition |pages=770–778 |date=2016 |chapter-url=https://openaccess.thecvf.com/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf |doi=10.1109/CVPR.2016.90 |arxiv=1512.03385 |isbn=978-1-4673-8851-1 |s2cid=206594692 |access-date=2022-03-31 |archive-date=2022-04-05 |archive-url=https://web.archive.org/web/20220405165303/https://openaccess.thecvf.com/content_cvpr_2016/papers/He_Deep_Residual_Learning_CVPR_2016_paper.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> === Intel Xeon Phi implementations ===<br /> Compared to the training of CNNs using [[GPU]]s, not much attention was given to the [[Intel Xeon Phi]] [[coprocessor]].&lt;ref&gt;{{cite conference<br /> |last1=Viebke<br /> |first1=Andre<br /> |last2=Pllana<br /> |first2=Sabri<br /> |title=2015 IEEE 17th International Conference on High Performance Computing and Communications, 2015 IEEE 7th International Symposium on Cyberspace Safety and Security, and 2015 IEEE 12th International Conference on Embedded Software and Systems<br /> |chapter=The Potential of the Intel (R) Xeon Phi for Supervised Deep Learning<br /> |pages=758–765<br /> |website=IEEE Xplore<br /> |publisher=IEEE 2015<br /> |doi=10.1109/HPCC-CSS-ICESS.2015.45<br /> |isbn=978-1-4799-8937-9<br /> |year=2015<br /> |s2cid=15411954<br /> |chapter-url=http://lnu.diva-portal.org/smash/record.jsf?pid=diva2%3A877421&amp;dswid=4277<br /> |access-date=2022-03-31<br /> |archive-date=2023-03-06<br /> |archive-url=https://web.archive.org/web/20230306003530/http://lnu.diva-portal.org/smash/record.jsf?pid=diva2:877421&amp;dswid=4277<br /> |url-status=live<br /> }}&lt;/ref&gt;<br /> A notable development is a parallelization method for training convolutional neural networks on the Intel Xeon Phi, named Controlled Hogwild with Arbitrary Order of Synchronization (CHAOS).&lt;ref&gt;<br /> {{cite journal<br /> |last1=Viebke<br /> |first1=Andre<br /> |last2=Memeti<br /> |first2=Suejb<br /> |last3=Pllana<br /> |first3=Sabri<br /> |last4=Abraham<br /> |first4=Ajith<br /> |title=CHAOS: a parallelization scheme for training convolutional neural networks on Intel Xeon Phi<br /> |journal=The Journal of Supercomputing<br /> |date=2019<br /> |volume=75<br /> |issue=1<br /> |pages=197–227<br /> |doi=10.1007/s11227-017-1994-x<br /> |arxiv=1702.07908<br /> |s2cid=14135321<br /> }}<br /> &lt;/ref&gt;<br /> CHAOS exploits both the thread- and [[SIMD]]-level parallelism that is available on the Intel Xeon Phi.<br /> <br /> == Distinguishing features ==<br /> In the past, traditional [[multilayer perceptron]] (MLP) models were used for image recognition.{{Example needed|date=October 2017}} However, the full connectivity between nodes caused the [[curse of dimensionality]], and was computationally intractable with higher-resolution images. 
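The scaling is easy to verify with a short calculation that reproduces the per-neuron weight counts quoted in the following paragraphs (the image sizes are simply those example values):<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
# A fully connected neuron that sees the whole image needs one weight per input value,
# i.e. width * height * channels weights (ignoring the bias).
for width, height, channels in [(32, 32, 3), (200, 200, 3), (1000, 1000, 3)]:
    print(width, height, channels, width * height * channels)
# 32 32 3 3072
# 200 200 3 120000
# 1000 1000 3 3000000
&lt;/syntaxhighlight&gt;<br /> <br />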
A 1000×1000-pixel image with [[RGB color model|RGB color]] channels has 3 million weights per fully-connected neuron, which is too high to feasibly process efficiently at scale.<br /> [[File:Conv layers.png|left|thumb|237x237px|CNN layers arranged in 3 dimensions]]<br /> For example, in [[CIFAR-10]], images are only of size 32×32×3 (32 wide, 32 high, 3 color channels), so a single fully connected neuron in the first hidden layer of a regular neural network would have 32*32*3 = 3,072 weights. A 200×200 image, however, would lead to neurons that have 200*200*3 = 120,000 weights.<br /> <br /> Also, such network architecture does not take into account the spatial structure of data, treating input pixels which are far apart in the same way as pixels that are close together. This ignores [[locality of reference]] in data with a grid-topology (such as images), both computationally and semantically. Thus, full connectivity of neurons is wasteful for purposes such as image recognition that are dominated by [[spatial locality|spatially local]] input patterns.<br /> <br /> Convolutional neural networks are variants of multilayer perceptrons, designed to emulate the behavior of a [[visual cortex]]. These models mitigate the challenges posed by the MLP architecture by exploiting the strong spatially local correlation present in natural images. As opposed to MLPs, CNNs have the following distinguishing features:<br /> * 3D volumes of neurons. The layers of a CNN have neurons arranged in [[three-dimensional space|3 dimensions]]: width, height and depth.&lt;ref&gt;{{cite journal |last=Hinton |first=Geoffrey |date=2012 |title=ImageNet Classification with Deep Convolutional Neural Networks |url=https://dl.acm.org/doi/10.5555/2999134.2999257 |journal=NIPS'12: Proceedings of the 25th International Conference on Neural Information Processing Systems - Volume 1 |volume=1 |pages=1097–1105 |via=ACM |access-date=2021-03-26 |archive-date=2019-12-20 |archive-url=https://web.archive.org/web/20191220014019/https://dl.acm.org/citation.cfm?id=2999134.2999257 |url-status=live }}&lt;/ref&gt; Where each neuron inside a convolutional layer is connected to only a small region of the layer before it, called a receptive field. Distinct types of layers, both locally and completely connected, are stacked to form a CNN architecture.<br /> * Local connectivity: following the concept of receptive fields, CNNs exploit spatial locality by enforcing a local connectivity pattern between neurons of adjacent layers. The architecture thus ensures that the learned &quot;[[filter (signal processing)|filters]]&quot; produce the strongest response to a spatially local input pattern. Stacking many such layers leads to [[nonlinear filter]]s that become increasingly global (i.e. responsive to a larger region of pixel space) so that the network first creates representations of small parts of the input, then from them assembles representations of larger areas.<br /> * Shared weights: In CNNs, each filter is replicated across the entire visual field. These replicated units share the same parameterization (weight vector and bias) and form a feature map. This means that all the neurons in a given convolutional layer respond to the same feature within their specific response field. Replicating units in this way allows for the resulting activation map to be [[equivariant map|equivariant]] under shifts of the locations of input features in the visual field, i.e. 
they grant translational [[equivariant map|equivariance]] - given that the layer has a stride of one.&lt;ref name=&quot;:5&quot;/&gt;<br /> * Pooling: In a CNN's pooling layers, feature maps are divided into rectangular sub-regions, and the features in each rectangle are independently down-sampled to a single value, commonly by taking their average or maximum value. In addition to reducing the sizes of feature maps, the pooling operation grants a degree of local [[translational symmetry|translational invariance]] to the features contained therein, allowing the CNN to be more robust to variations in their positions.&lt;ref name=&quot;:6&quot;/&gt;<br /> <br /> Together, these properties allow CNNs to achieve better generalization on [[computer vision|vision problems]]. Weight sharing dramatically reduces the number of [[free parameter]]s learned, thus lowering the memory requirements for running the network and allowing the training of larger, more powerful networks.<br /> <br /> == Building blocks ==<br /> {{More citations needed section|date=June 2017}}<br /> <br /> A CNN architecture is formed by a stack of distinct layers that transform the input volume into an output volume (e.g. holding the class scores) through a differentiable function. A few distinct types of layers are commonly used. These are further discussed below.[[File:Conv layer.png|left|thumb|Neurons of a convolutional layer (blue), connected to their receptive field (red)|229x229px]]<br /> <br /> === Convolutional layer ===<br /> The convolutional layer is the core building block of a CNN. The layer's parameters consist of a set of learnable [[filter (signal processing)|filters]] (or [[kernel (image processing)|kernels]]), which have a small receptive field, but extend through the full depth of the input volume. During the forward pass, each filter is [[convolution|convolved]] across the width and height of the input volume, computing the [[dot product]] between the filter entries and the input, producing a 2-dimensional [[activation function|activation map]] of that filter. As a result, the network learns filters that activate when it detects some specific type of [[feature (machine learning)|feature]] at some spatial position in the input.&lt;ref name=&quot;Géron Hands-on ML 2019&quot;&gt;{{cite book<br /> |last1=Géron<br /> |first1=Aurélien<br /> |title=Hands-on Machine Learning with Scikit-Learn, Keras, and TensorFlow<br /> |date=2019<br /> |publisher=O'Reilly Media<br /> |location=Sebastopol, CA<br /> |isbn=978-1-492-03264-9<br /> }}, pp. 448&lt;/ref&gt;&lt;ref group=&quot;nb&quot;&gt;When applied to other types of data than image data, such as sound data, &quot;spatial position&quot; may variously correspond to different points in the [[time domain]], [[frequency domain]], or other [[space (mathematics)|mathematical spaces]].&lt;/ref&gt;<br /> <br /> Stacking the activation maps for all filters along the depth dimension forms the full output volume of the convolution layer. Every entry in the output volume can thus also be interpreted as an output of a neuron that looks at a small region in the input. 
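As a concrete, heavily simplified sketch of this forward pass, the code below slides a single filter over a one-channel input with stride 1 and no padding, computing a dot product at every position to build one activation map; real layers additionally handle multiple input channels, many filters, padding and vectorized computation, so this is only an illustration (it uses the cross-correlation form of the operation, as CNN implementations commonly do).<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import numpy as np

def activation_map(image, kernel):
    # image: 2-D array of shape (H, W); kernel: 2-D array of learned weights (K, K)
    h, w = image.shape
    k = kernel.shape[0]
    out_h, out_w = h - k + 1, w - k + 1        # output size for stride 1, no padding
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            # dot product between the filter and one receptive field of the input
            out[i, j] = np.sum(image[i:i + k, j:j + k] * kernel)
    return out

image = np.random.rand(7, 7)                   # toy single-channel input
kernel = np.random.rand(3, 3)                  # one learnable 3x3 filter
print(activation_map(image, kernel).shape)     # (5, 5)
&lt;/syntaxhighlight&gt;<br /> <br />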
Each entry in an activation map uses the same set of parameters that define the filter.<br /> <br /> [[Self-supervised learning]] has been adapted for use in convolutional layers by using sparse patches with a high-mask ratio and a global response normalization layer.&lt;ref&gt;{{Cite web |last=Raschka |first=Sebastian |title=Ahead of AI #5: RevAIval of Ideas |url=https://magazine.sebastianraschka.com/p/ahead-of-ai-5-revaival-of-ideas |access-date=2023-02-07 |website=magazine.sebastianraschka.com |language=en |archive-date=2023-02-07 |archive-url=https://web.archive.org/web/20230207003859/https://magazine.sebastianraschka.com/p/ahead-of-ai-5-revaival-of-ideas |url-status=live }}&lt;/ref&gt;<br /> <br /> ==== Local connectivity ====<br /> [[File:Typical cnn.png|thumb|395x395px|Typical CNN architecture]]<br /> <br /> When dealing with high-dimensional inputs such as images, it is impractical to connect neurons to all neurons in the previous volume because such a network architecture does not take the spatial structure of the data into account. Convolutional networks exploit spatially local correlation by enforcing a [[sparse network|sparse local connectivity]] pattern between neurons of adjacent layers: each neuron is connected to only a small region of the input volume.<br /> <br /> The extent of this connectivity is a [[hyperparameter optimization|hyperparameter]] called the [[receptive field]] of the neuron. The connections are [[spatial locality|local in space]] (along width and height), but always extend along the entire depth of the input volume. Such an architecture ensures that the learned ({{Lang-en-GB|learnt}}) filters produce the strongest response to a spatially local input pattern.<br /> <br /> ==== Spatial arrangement ====<br /> <br /> Three [[hyperparameter (machine learning)|hyperparameters]] control the size of the output volume of the convolutional layer: the depth, [[stride of an array|stride]], and padding size:<br /> * The ''&lt;u&gt;depth&lt;/u&gt;'' of the output volume controls the number of neurons in a layer that connect to the same region of the input volume. These neurons learn to activate for different features in the input. For example, if the first convolutional layer takes the raw image as input, then different neurons along the depth dimension may activate in the presence of various oriented edges, or blobs of color.<br /> *&lt;u&gt;''Stride''&lt;/u&gt; controls how depth columns are allocated along the width and height. If the stride is 1, then we move the filters one pixel at a time. This leads to heavily [[intersection (set theory)|overlapping]] receptive fields between the columns, and to large output volumes. For any integer &lt;math display=&quot;inline&quot;&gt;S &gt; 0,&lt;/math&gt; a stride ''S'' means that the filter is translated ''S'' units at a time per output. In practice, &lt;math display=&quot;inline&quot;&gt;S \geq 3&lt;/math&gt; is rare.
A greater stride means smaller overlap of receptive fields and smaller spatial dimensions of the output volume.&lt;ref&gt;{{cite web |url=https://cs231n.github.io/convolutional-networks/ |title=CS231n Convolutional Neural Networks for Visual Recognition |website=cs231n.github.io |access-date=2017-04-25 |archive-date=2019-10-23 |archive-url=https://web.archive.org/web/20191023031945/https://cs231n.github.io/convolutional-networks/ |url-status=live }}&lt;/ref&gt;<br /> * Sometimes, it is convenient to pad the input with zeros (or other values, such as the average of the region) on the border of the input volume. The size of this padding is a third hyperparameter. Padding provides control of the output volume's spatial size. In particular, sometimes it is desirable to exactly preserve the spatial size of the input volume, this is commonly referred to as &quot;same&quot; padding.<br /> <br /> The spatial size of the output volume is a function of the input volume size &lt;math&gt;W&lt;/math&gt;, the kernel field size &lt;math&gt;K&lt;/math&gt; of the convolutional layer neurons, the stride &lt;math&gt;S&lt;/math&gt;, and the amount of zero padding &lt;math&gt;P&lt;/math&gt; on the border. The number of neurons that &quot;fit&quot; in a given volume is then:<br /> :&lt;math display=&quot;block&quot;&gt;\frac{W-K+2P}{S} + 1.&lt;/math&gt;<br /> <br /> If this number is not an [[integer]], then the strides are incorrect and the neurons cannot be tiled to fit across the input volume in a [[symmetry|symmetric]] way. In general, setting zero padding to be &lt;math display=&quot;inline&quot;&gt;P = (K-1)/2&lt;/math&gt; when the stride is &lt;math&gt;S=1&lt;/math&gt; ensures that the input volume and output volume will have the same size spatially. However, it is not always completely necessary to use all of the neurons of the previous layer. For example, a neural network designer may decide to use just a portion of padding.<br /> <br /> ==== Parameter sharing ====<br /> A parameter sharing scheme is used in convolutional layers to control the number of free parameters. It relies on the assumption that if a patch feature is useful to compute at some spatial position, then it should also be useful to compute at other positions. Denoting a single 2-dimensional slice of depth as a ''depth slice'', the neurons in each depth slice are constrained to use the same weights and bias.<br /> <br /> Since all neurons in a single depth slice share the same parameters, the forward pass in each depth slice of the convolutional layer can be computed as a [[convolution]] of the neuron's weights with the input volume.&lt;ref group=&quot;nb&quot;&gt;hence the name &quot;convolutional layer&quot;&lt;/ref&gt; Therefore, it is common to refer to the sets of weights as a filter (or a [[kernel (image processing)|kernel]]), which is convolved with the input. The result of this convolution is an [[activation function|activation map]], and the set of activation maps for each different filter are stacked together along the depth dimension to produce the output volume. Parameter sharing contributes to the [[translational symmetry|translation invariance]] of the CNN architecture.&lt;ref name=&quot;:6&quot;/&gt;<br /> <br /> Sometimes, the parameter sharing assumption may not make sense. This is especially the case when the input images to a CNN have some specific centered structure; for which we expect completely different features to be learned on different spatial locations. 
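The saving from sharing is easy to quantify. The sketch below compares the number of free parameters of a shared-filter (convolutional) layer with that of a locally connected layer of the same shape, reusing the output-size formula above; the layer dimensions are arbitrary example values, and a practical case where relaxing sharing is preferred follows in the next paragraph.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
# Example layer: 32x32 input with 3 channels, 16 filters of size 5x5, stride 1, padding 2
w_in, depth_in, k, stride, pad, n_filters = 32, 3, 5, 1, 2, 16
w_out = (w_in - k + 2 * pad) // stride + 1           # spatial output size, here 32

weights_per_filter = k * k * depth_in + 1            # +1 for the bias
shared = n_filters * weights_per_filter              # one filter reused at every position
locally_connected = n_filters * w_out * w_out * weights_per_filter   # separate weights per position

print(w_out, shared, locally_connected)              # 32 1216 1245184
&lt;/syntaxhighlight&gt;<br /> <br />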
One practical example is when the inputs are faces that have been centered in the image: we might expect different eye-specific or hair-specific features to be learned in different parts of the image. In that case it is common to relax the parameter sharing scheme, and instead simply call the layer a &quot;locally connected layer&quot;.<br /> <br /> === Pooling layer ===<br /> [[File:Max pooling.png|thumb|314x314px|Max pooling with a 2x2 filter and stride = 2]]<br /> Another important concept of CNNs is pooling, which is a form of non-linear [[downsampling (signal processing)|down-sampling]]. There are several non-linear functions to implement pooling, where ''max pooling'' is the most common. It [[partition of a set|partitions]] the input image into a set of rectangles and, for each such sub-region, outputs the maximum.<br /> <br /> Intuitively, the exact location of a feature is less important than its rough location relative to other features. This is the idea behind the use of pooling in convolutional neural networks. The pooling layer serves to progressively reduce the spatial size of the representation, to reduce the number of parameters, [[memory footprint]] and amount of computation in the network, and hence to also control [[overfitting]]. This is known as down-sampling. It is common to periodically insert a pooling layer between successive convolutional layers (each one typically followed by an activation function, such as a [[#ReLU layer|ReLU layer]]) in a CNN architecture.&lt;ref name=&quot;Géron Hands-on ML 2019&quot;/&gt;{{rp|460–461}} While pooling layers contribute to local translation invariance, they do not provide global translation invariance in a CNN, unless a form of global pooling is used.&lt;ref name=&quot;:6&quot;/&gt;&lt;ref name=&quot;:5&quot;&gt;{{cite journal |last1=Azulay |first1=Aharon |last2=Weiss |first2=Yair |date=2019 |title=Why do deep convolutional networks generalize so poorly to small image transformations? |url=https://jmlr.org/papers/v20/19-519.html |journal=Journal of Machine Learning Research |volume=20 |issue=184 |pages=1–25 |issn=1533-7928 |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331211138/https://jmlr.org/papers/v20/19-519.html |url-status=live }}&lt;/ref&gt; The pooling layer commonly operates independently on every depth, or slice, of the input and resizes it spatially. A very common form of max pooling is a layer with filters of size 2×2, applied with a stride of 2, which subsamples every depth slice in the input by 2 along both width and height, discarding 75% of the activations:&lt;math display=&quot;block&quot;&gt;f_{X,Y}(S)=\max_{a,b=0}^1S_{2X+a,2Y+b}.&lt;/math&gt;<br /> In this case, every [[maximum|max operation]] is over 4 numbers. The depth dimension remains unchanged (this is true for other forms of pooling as well).<br /> <br /> In addition to max pooling, pooling units can use other functions, such as [[average]] pooling or [[Euclidean norm|ℓ&lt;sub&gt;2&lt;/sub&gt;-norm]] pooling. 
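The 2×2, stride-2 case above and its average-pooling counterpart can be written compactly; the sketch below operates on one depth slice and assumes even height and width, purely for illustration.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import numpy as np

def pool_2x2(feature_map, reduce=np.max):
    # feature_map: 2-D array (one depth slice) with even height and width
    h, w = feature_map.shape
    tiles = feature_map.reshape(h // 2, 2, w // 2, 2)   # group values into 2x2 tiles
    return reduce(tiles, axis=(1, 3))                   # keep one value per tile

x = np.arange(16.0).reshape(4, 4)
print(pool_2x2(x, np.max))     # max pooling: the largest value of each 2x2 block
print(pool_2x2(x, np.mean))    # average pooling: the mean value of each block
&lt;/syntaxhighlight&gt;<br /> <br />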
Average pooling was often used historically but has recently fallen out of favor compared to max pooling, which generally performs better in practice.&lt;ref name=&quot;Scherer-ICANN-2010&quot;&gt;{{cite conference<br /> |url=http://ais.uni-bonn.de/papers/icann2010_maxpool.pdf<br /> |title=Evaluation of Pooling Operations in Convolutional Architectures for Object Recognition<br /> |last1=Scherer<br /> |first1=Dominik<br /> |last2=Müller<br /> |first2=Andreas C.<br /> |last3=Behnke<br /> |first3=Sven<br /> |year=2010<br /> |publisher=Springer<br /> |book-title=Artificial Neural Networks (ICANN), 20th International Conference on<br /> |pages=92–101<br /> |location=Thessaloniki, Greece<br /> |access-date=2016-12-28<br /> |archive-date=2018-04-03<br /> |archive-url=https://web.archive.org/web/20180403185041/http://ais.uni-bonn.de/papers/icann2010_maxpool.pdf<br /> |url-status=live<br /> }}&lt;/ref&gt;<br /> <br /> Due to the effects of fast spatial reduction of the size of the representation,{{Which|date=December 2018}} there is a recent trend towards using smaller filters&lt;ref&gt;{{cite arXiv |title=Fractional Max-Pooling |eprint=1412.6071 |date=2014-12-18 |first=Benjamin |last=Graham |class=cs.CV}}&lt;/ref&gt; or discarding pooling layers altogether.&lt;ref&gt;{{cite arXiv |title=Striving for Simplicity: The All Convolutional Net |eprint=1412.6806 |date=2014-12-21 |first1=Jost Tobias |last1=Springenberg |first2=Alexey |last2=Dosovitskiy |first3=Thomas |last3=Brox |first4=Martin |last4=Riedmiller |class=cs.LG}}&lt;/ref&gt;<br /> <br /> [[File:RoI pooling animated.gif|thumb|400x300px|RoI pooling to size 2x2. In this example region proposal (an input parameter) has size 7x5.]]<br /> &quot;[[Region of interest|Region of Interest]]&quot; pooling (also known as RoI pooling) is a variant of max pooling, in which output size is fixed and input rectangle is a parameter.&lt;ref&gt;{{cite web<br /> |last=Grel<br /> |first=Tomasz<br /> |title=Region of interest pooling explained<br /> |website=deepsense.io<br /> |date=2017-02-28<br /> |url=https://deepsense.io/region-of-interest-pooling-explained/<br /> |access-date=5 April 2017<br /> |language=en<br /> |archive-date=2017-06-02<br /> |archive-url=https://web.archive.org/web/20170602070519/https://deepsense.io/region-of-interest-pooling-explained/<br /> |url-status=dead<br /> }}&lt;/ref&gt;<br /> <br /> Pooling is a downsampling method and an important component of convolutional neural networks for [[object detection]] based on the Fast R-CNN&lt;ref name=&quot;rcnn&quot;&gt;{{cite arXiv<br /> |title=Fast R-CNN<br /> |eprint=1504.08083<br /> |date=2015-09-27<br /> |first=Ross<br /> |last=Girshick<br /> |class=cs.CV}}&lt;/ref&gt; architecture. <br /> === Channel Max Pooling ===<br /> A CMP operation layer conducts the MP operation along the channel side among the corresponding positions of the consecutive feature maps for the purpose of redundant information elimination. The CMP makes the significant features gather together within fewer channels, which is important for fine-grained image classification that needs more discriminating features. Meanwhile, another advantage of the CMP operation is to make the channel number of feature maps smaller before it connects to the first fully connected (FC) layer. 
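Reading CMP as a maximum taken over groups of consecutive channels at every spatial position, a minimal sketch looks as follows; the split into equal, non-overlapping channel groups is an assumption of this illustration, and the notation used in the cited work is given in the next paragraph.<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import numpy as np

def channel_max_pool(feature_maps, out_channels):
    # feature_maps: array of shape (C, M, N); out_channels is assumed to divide C evenly
    c_in, m, n = feature_maps.shape
    group = c_in // out_channels
    grouped = feature_maps.reshape(out_channels, group, m, n)
    return grouped.max(axis=1)           # max over each group of consecutive channels

x = np.random.rand(64, 8, 8)             # 64 feature maps of size 8x8
print(channel_max_pool(x, 16).shape)     # (16, 8, 8): fewer channels, spatial size unchanged
&lt;/syntaxhighlight&gt;<br /> <br />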
Similar to the MP operation, we denote the input feature maps and output feature maps of a CMP layer as F ∈ R(C×M×N) and C ∈ R(c×M×N), respectively, where C and c are the channel numbers of the input and output feature maps, M and N are the widths and the height of the feature maps, respectively. Note that the CMP operation only changes the channel number of the feature maps. The width and the height of the feature maps are not changed, which is different from the MP operation.&lt;ref name=&quot;Ma Chang Xie Ding 2019 pp. 3224–3233&quot;&gt;{{cite journal |last1=Ma |first1=Zhanyu |last2=Chang |first2=Dongliang |last3=Xie |first3=Jiyang |last4=Ding |first4=Yifeng |last5=Wen |first5=Shaoguo |last6=Li |first6=Xiaoxu |last7=Si |first7=Zhongwei |last8=Guo |first8=Jun |title=Fine-Grained Vehicle Classification With Channel Max Pooling Modified CNNs |journal=IEEE Transactions on Vehicular Technology |publisher=Institute of Electrical and Electronics Engineers (IEEE) |volume=68 |issue=4 |year=2019 |issn=0018-9545 |doi=10.1109/tvt.2019.2899972 |pages=3224–3233|s2cid=86674074 }}&lt;/ref&gt;<br /> <br /> === ReLU layer ===<br /> ReLU is the abbreviation of [[rectifier (neural networks)|rectified linear unit]] introduced by [[Kunihiko Fukushima]] in 1969.&lt;ref name=&quot;Fukushima1969&quot;/&gt;&lt;ref name=DLhistory/&gt; ReLU applies the non-saturating [[activation function]] &lt;math alt=&quot;function of x equals maximum between zero and x&quot; display=&quot;inline&quot;&gt;f(x)=\max(0,x)&lt;/math&gt;.&lt;ref name=&quot;:02&quot;&gt;{{cite journal |last1=Krizhevsky |first1=Alex |last2=Sutskever |first2=Ilya |last3=Hinton |first3=Geoffrey E. |date=2017-05-24 |title=ImageNet classification with deep convolutional neural networks |url=https://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf |journal=Communications of the ACM |volume=60 |issue=6 |pages=84–90 |doi=10.1145/3065386 |s2cid=195908774 |issn=0001-0782 |access-date=2018-12-04 |archive-date=2017-05-16 |archive-url=https://web.archive.org/web/20170516174757/http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf |url-status=live }}&lt;/ref&gt; It effectively removes negative values from an activation map by setting them to zero.&lt;ref name=&quot;Romanuke4&quot;&gt;{{cite journal |last1=Romanuke |first1=Vadim |title=Appropriate number and allocation of ReLUs in convolutional neural networks |journal=Research Bulletin of NTUU &quot;Kyiv Polytechnic Institute&quot; |date=2017 |volume=1 |issue=1 |pages=69–78 |doi=10.20535/1810-0546.2017.1.88156 |doi-access=free}}&lt;/ref&gt; It introduces [[Nonlinearity_(disambiguation)|nonlinearity]] to the [[decision boundary|decision function]] and in the overall network without affecting the receptive fields of the convolution layers.<br /> In 2011, Xavier Glorot, Antoine Bordes and [[Yoshua Bengio]] found that ReLU enables better training of deeper networks,&lt;ref name=&quot;glorot2011&quot;&gt;{{cite conference |author1=Xavier Glorot |author2=Antoine Bordes |author3=[[Yoshua Bengio]] |year=2011 |title=Deep sparse rectifier neural networks |url=http://jmlr.org/proceedings/papers/v15/glorot11a/glorot11a.pdf |conference=AISTATS |quote=Rectifier and softplus activation functions. The second one is a smooth version of the first. 
|access-date=2023-04-10 |archive-date=2016-12-13 |archive-url=https://web.archive.org/web/20161213022121/http://jmlr.org/proceedings/papers/v15/glorot11a/glorot11a.pdf |url-status=dead }}&lt;/ref&gt; compared to widely used activation functions prior to 2011.<br /> <br /> Other functions can also be used to increase nonlinearity, for example the saturating [[hyperbolic tangent]] &lt;math alt=&quot;function of x equals hyperbolic tangent of x&quot;&gt;f(x)=\tanh(x)&lt;/math&gt;, &lt;math alt=&quot;function of x equals absolute value of the hyperbolic tangent of x&quot;&gt;f(x)=|\tanh(x)|&lt;/math&gt;, and the [[sigmoid function]] &lt;math alt=&quot;function of x equals the inverse of one plus e to the power of minus x&quot; display=&quot;inline&quot;&gt;\sigma(x)=(1+e^{-x} )^{-1}&lt;/math&gt;. ReLU is often preferred to other functions because it trains the neural network several times faster without a significant penalty to [[generalization (learning)|generalization]] accuracy.&lt;ref&gt;{{cite journal |last=Krizhevsky |first=A. |author2=Sutskever, I. |author3=Hinton, G. E. |title=Imagenet classification with deep convolutional neural networks |journal=Advances in Neural Information Processing Systems |volume=1 |year=2012 |pages=1097–1105 |url=https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331224736/https://proceedings.neurips.cc/paper/2012/file/c399862d3b9d6b76c8436e924a68c45b-Paper.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> === Fully connected layer ===<br /> After several convolutional and max pooling layers, the final classification is done via fully connected layers. Neurons in a fully connected layer have connections to all activations in the previous layer, as seen in regular (non-convolutional) [[artificial neural network]]s. Their activations can thus be computed as an [[affine transformation]], with [[matrix multiplication]] followed by a bias offset ([[vector addition]] of a learned or fixed bias term).<br /> <br /> === Loss layer ===<br /> {{Main|Loss function|Loss functions for classification}}<br /> The &quot;loss layer&quot;, or &quot;[[loss function]]&quot;, specifies how [[training]] penalizes the deviation between the predicted output of the network, and the [[ground truth|true]] data labels (during supervised learning). Various [[loss function]]s can be used, depending on the specific task.<br /> <br /> The [[Softmax function|Softmax]] loss function is used for predicting a single class of ''K'' mutually exclusive classes.&lt;ref group=&quot;nb&quot;&gt;So-called [[categorical data]].&lt;/ref&gt; [[Sigmoid function|Sigmoid]] [[cross entropy|cross-entropy]] loss is used for predicting ''K'' independent probability values in &lt;math&gt;[0,1]&lt;/math&gt;. [[Euclidean distance|Euclidean]] loss is used for [[regression (machine learning)|regressing]] to [[real number|real-valued]] labels &lt;math&gt;(-\infty,\infty)&lt;/math&gt;.<br /> <br /> == Hyperparameters ==<br /> {{More citations needed section|date=June 2017}}<br /> Hyperparameters are various settings that are used to control the learning process. CNNs use more [[hyperparameter (machine learning)|hyperparameters]] than a standard multilayer perceptron (MLP).<br /> <br /> === Kernel size ===<br /> The kernel is the number of pixels processed together. 
Kernel size is typically expressed as the kernel's dimensions, e.g., 2x2 or 3x3.<br /> <br /> === Padding ===<br /> Padding is the addition of (typically) 0-valued pixels on the borders of an image. This is done so that border pixels are not under-represented in (lost from) the output, as they would otherwise participate in only a single receptive field instance. The padding applied is typically one less than the corresponding kernel dimension. For example, a convolutional layer using 3x3 kernels would receive a 2-pixel pad, that is, 1 pixel on each side of the image.&lt;ref&gt;{{cite web |title=6.3. Padding and Stride — Dive into Deep Learning 0.17.0 documentation |url=https://d2l.ai/chapter_convolutional-neural-networks/padding-and-strides.html |access-date=2021-08-12 |website=d2l.ai |archive-date=2021-08-12 |archive-url=https://web.archive.org/web/20210812202649/https://d2l.ai/chapter_convolutional-neural-networks/padding-and-strides.html |url-status=live }}&lt;/ref&gt;<br /> <br /> === Stride ===<br /> The stride is the number of pixels that the analysis window moves on each iteration. A stride of 2 means that each application of the kernel is offset by 2 pixels from the previous one.<br /> <br /> === Number of filters ===<br /> Since feature map size decreases with depth, layers near the input layer tend to have fewer filters while higher layers can have more. To equalize computation at each layer, the product of the number of feature maps and the number of pixel positions is kept roughly constant across layers. Preserving more information about the input would require keeping the total number of activations (number of feature maps times number of pixel positions) non-decreasing from one layer to the next.<br /> <br /> The number of feature maps directly controls the capacity and depends on the number of available examples and task complexity.<br /> <br /> === Filter size ===<br /> Common filter sizes found in the literature vary greatly, and are usually chosen based on the data set.<br /> <br /> The challenge is to find the right level of granularity so as to create abstractions at the proper scale, given a particular data set, and without [[overfitting]].<br /> <br /> === Pooling type and size ===<br /> <br /> [[Max pooling]] is typically used, often with a 2x2 dimension. This implies that the input is drastically [[downsampling (signal processing)|downsampled]], reducing processing cost.<br /> <br /> Large input volumes may warrant 4×4 pooling in the lower layers.&lt;ref&gt;{{cite web |url=https://adeshpande3.github.io/adeshpande3.github.io/The-9-Deep-Learning-Papers-You-Need-To-Know-About.html |title=The 9 Deep Learning Papers You Need To Know About (Understanding CNNs Part 3) |last=Deshpande |first=Adit |website=adeshpande3.github.io |access-date=2018-12-04 |archive-date=2018-11-21 |archive-url=https://web.archive.org/web/20181121185730/https://adeshpande3.github.io/adeshpande3.github.io/The-9-Deep-Learning-Papers-You-Need-To-Know-About.html |url-status=live }}&lt;/ref&gt; Greater pooling [[dimensionality reduction|reduces the dimension]] of the signal, and may result in unacceptable [[data loss|information loss]]. Often, non-overlapping pooling windows perform best.&lt;ref name=&quot;Scherer-ICANN-2010&quot;/&gt;<br /> <br /> === Dilation ===<br /> Dilation involves ignoring pixels within a kernel. This can reduce processing time and memory use without significant signal loss. A dilation of 2 on a 3x3 kernel expands the kernel to 5x5, while still processing 9 (evenly spaced) pixels. 
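<br /> <br /> Taken together, kernel size, padding, stride, and dilation determine the spatial size of the output through the commonly used formula &lt;math&gt;\left\lfloor \frac{n + 2p - d(k-1) - 1}{s} \right\rfloor + 1&lt;/math&gt; for input width &lt;math&gt;n&lt;/math&gt;, per-side padding &lt;math&gt;p&lt;/math&gt;, dilation &lt;math&gt;d&lt;/math&gt;, kernel size &lt;math&gt;k&lt;/math&gt; and stride &lt;math&gt;s&lt;/math&gt;. A minimal sketch, assuming PyTorch and a made-up 32x32 input:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

x = torch.randn(1, 3, 32, 32)  # one RGB image of 32x32 pixels (made-up size)

same    = nn.Conv2d(3, 8, kernel_size=3, padding=1)            # 1-pixel padding keeps the 32x32 size
strided = nn.Conv2d(3, 8, kernel_size=3, stride=2, padding=1)  # stride 2 halves the resolution
dilated = nn.Conv2d(3, 8, kernel_size=3, dilation=2)           # effective 5x5 kernel -> 28x28 output
pooled  = nn.MaxPool2d(kernel_size=2)                          # 2x2 max pooling also halves H and W

print(same(x).shape, strided(x).shape, dilated(x).shape, pooled(x).shape)
# torch.Size([1, 8, 32, 32]) torch.Size([1, 8, 16, 16]) torch.Size([1, 8, 28, 28]) torch.Size([1, 3, 16, 16])
&lt;/syntaxhighlight&gt;<br /> <br /> 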
Accordingly, a dilation of 4 expands the kernel to 9x9.&lt;ref&gt;{{Cite web |last=Pröve |first=Paul-Louis |date=2018-02-07 |title=An Introduction to different Types of Convolutions in Deep Learning |url=https://towardsdatascience.com/types-of-convolutions-in-deep-learning-717013397f4d |access-date=2022-07-27 |website=Medium |language=en |archive-date=2022-07-27 |archive-url=https://web.archive.org/web/20220727225642/https://towardsdatascience.com/types-of-convolutions-in-deep-learning-717013397f4d |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{cite news |last=Seo |first=Jae Duk |date=2018-03-12 |title=Understanding 2D Dilated Convolution Operation with Examples in Numpy and Tensorflow with… |url=https://towardsdatascience.com/understanding-2d-dilated-convolution-operation-with-examples-in-numpy-and-tensorflow-with-d376b3972b25 |access-date=2021-08-12 |website=Medium |language=en |archive-date=2021-11-06 |archive-url=https://web.archive.org/web/20211106134140/https://towardsdatascience.com/understanding-2d-dilated-convolution-operation-with-examples-in-numpy-and-tensorflow-with-d376b3972b25 |url-status=live }}&lt;/ref&gt;<br /> <br /> == Translation equivariance and aliasing ==<br /> It is commonly assumed that CNNs are invariant to shifts of the input. Convolution or pooling layers within a CNN that do not have a stride greater than one are indeed [[equivariant map|equivariant]] to translations of the input.&lt;ref name=&quot;:5&quot;/&gt; However, layers with a stride greater than one ignore the [[Nyquist–Shannon sampling theorem|Nyquist-Shannon sampling theorem]] and might lead to [[aliasing]] of the input signal.&lt;ref name=&quot;:5&quot;/&gt; While, in principle, CNNs are capable of implementing anti-aliasing filters, it has been observed that this does not happen in practice,&lt;ref&gt;{{cite book |last1=Ribeiro |first1=Antonio |last2=Schön |first2=Thomas |title=ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) |chapter=How Convolutional Neural Networks Deal with Aliasing |date=2021 |pages=2755–2759 |arxiv=2102.07757 |doi=10.1109/ICASSP39728.2021.9414627|isbn=978-1-7281-7605-5 |s2cid=231925012 }}&lt;/ref&gt; yielding models that are not equivariant to translations.<br /> Furthermore, if a CNN makes use of fully connected layers, translation equivariance does not imply translation invariance, as the fully connected layers are not invariant to shifts of the input.&lt;ref&gt;{{cite book |last1=Myburgh |first1=Johannes C. |last2=Mouton |first2=Coenraad |last3=Davel |first3=Marelie H. 
|title=Artificial Intelligence Research |chapter=Tracking Translation Invariance in CNNS |date=2020 |editor-last=Gerber |editor-first=Aurona |chapter-url=https://link.springer.com/chapter/10.1007%2F978-3-030-66151-9_18 |series=Communications in Computer and Information Science |volume=1342 |language=en |location=Cham |publisher=Springer International Publishing |pages=282–295 |doi=10.1007/978-3-030-66151-9_18 |arxiv=2104.05997 |isbn=978-3-030-66151-9 |s2cid=233219976 |access-date=2021-03-26 |archive-date=2022-01-22 |archive-url=https://web.archive.org/web/20220122015258/http://link.springer.com/chapter/10.1007/978-3-030-66151-9_18 |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;:6&quot;/&gt; One solution for complete translation invariance is avoiding any down-sampling throughout the network and applying global average pooling at the last layer.&lt;ref name=&quot;:5&quot;/&gt; Additionally, several other partial solutions have been proposed, such as [[anti-aliasing filter|anti-aliasing]] before downsampling operations,&lt;ref&gt;{{cite book |last=Richard |first=Zhang |url=https://www.worldcat.org/oclc/1106340711 |title=Making Convolutional Networks Shift-Invariant Again |date=2019-04-25 |oclc=1106340711}}&lt;/ref&gt; spatial transformer networks,&lt;ref&gt;{{cite journal |last=Jadeberg, Simonyan, Zisserman, Kavukcuoglu |first=Max, Karen, Andrew, Koray |date=2015 |title=Spatial Transformer Networks |url=https://proceedings.neurips.cc/paper/2015/file/33ceb07bf4eeb3da587e268d663aba1a-Paper.pdf |journal=Advances in Neural Information Processing Systems |volume=28 |via=NIPS |access-date=2021-03-26 |archive-date=2021-07-25 |archive-url=https://web.archive.org/web/20210725115312/https://proceedings.neurips.cc/paper/2015/file/33ceb07bf4eeb3da587e268d663aba1a-Paper.pdf |url-status=live }}&lt;/ref&gt; [[data augmentation]], subsampling combined with pooling,&lt;ref name=&quot;:6&quot;/&gt; and [[capsule neural network]]s.&lt;ref&gt;{{cite book |last=E |first=Sabour, Sara Frosst, Nicholas Hinton, Geoffrey |url=https://worldcat.org/oclc/1106278545 |title=Dynamic Routing Between Capsules |date=2017-10-26 |oclc=1106278545}}&lt;/ref&gt;<br /> <br /> == Evaluation ==<br /> The accuracy of the final model is based on a sub-part of the dataset set apart at the start, often called a test-set. Other times methods such as [[cross-validation (statistics)|''k''-fold cross-validation]] are applied. Other strategies include using [[conformal prediction]].&lt;ref&gt;{{cite journal |date=2019-06-01 |title=Inductive conformal predictor for convolutional neural networks: Applications to active learning for image classification |url=https://www.sciencedirect.com/science/article/abs/pii/S003132031930055X |journal=Pattern Recognition |language=en |volume=90 |pages=172–182 |doi=10.1016/j.patcog.2019.01.035 |issn=0031-3203 |last1=Matiz |first1=Sergio |last2=Barner |first2=Kenneth E. |bibcode=2019PatRe..90..172M |s2cid=127253432 |access-date=2021-09-29 |archive-date=2021-09-29 |archive-url=https://web.archive.org/web/20210929092610/https://www.sciencedirect.com/science/article/abs/pii/S003132031930055X |url-status=live }}&lt;/ref&gt;&lt;ref&gt;{{cite journal |last1=Wieslander |first1=Håkan |last2=Harrison |first2=Philip J. 
|last3=Skogberg |first3=Gabriel |last4=Jackson |first4=Sonya |last5=Fridén |first5=Markus |last6=Karlsson |first6=Johan |last7=Spjuth |first7=Ola |last8=Wählby |first8=Carolina |date=February 2021 |title=Deep Learning With Conformal Prediction for Hierarchical Analysis of Large-Scale Whole-Slide Tissue Images |url=https://ieeexplore.ieee.org/document/9103229 |journal=IEEE Journal of Biomedical and Health Informatics |volume=25 |issue=2 |pages=371–380 |doi=10.1109/JBHI.2020.2996300 |pmid=32750907 |s2cid=219885788 |issn=2168-2208 |access-date=2022-01-29 |archive-date=2022-01-20 |archive-url=https://web.archive.org/web/20220120141410/https://ieeexplore.ieee.org/document/9103229/ |url-status=live }}&lt;/ref&gt;<br /> <br /> == Regularization methods ==<br /> {{Main|Regularization (mathematics)}}<br /> {{More citations needed section|date=June 2017}}<br /> [[Regularization (mathematics)|Regularization]] is a process of introducing additional information to solve an [[ill-posed problem]] or to prevent [[overfitting]]. CNNs use various types of regularization.<br /> <br /> === Empirical ===<br /> <br /> ==== Dropout ====<br /> Because a fully connected layer occupies most of the parameters, it is prone to overfitting. One method to reduce overfitting is [[dropout (neural networks)|dropout]], introduced in 2014.&lt;ref&gt;{{cite journal |last=Srivastava |first=Nitish |author2=C. Geoffrey Hinton |author3=Alex Krizhevsky |author4=Ilya Sutskever |author5=Ruslan Salakhutdinov |title=Dropout: A Simple Way to Prevent Neural Networks from overfitting |journal=Journal of Machine Learning Research |year=2014 |volume=15 |issue=1 |pages=1929–1958 |url=http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf |access-date=2015-01-03 |archive-date=2016-01-19 |archive-url=https://web.archive.org/web/20160119155849/http://www.cs.toronto.edu/~rsalakhu/papers/srivastava14a.pdf |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;DLPATTERNS&quot;&gt;{{cite web |title=A Pattern Language for Deep Learning |author=Carlos E. Perez |url=http://www.deeplearningpatterns.com/ |access-date=2016-06-15 |archive-date=2017-06-03 |archive-url=https://web.archive.org/web/20170603205959/http://deeplearningpatterns.com/ |url-status=live }}&lt;/ref&gt; At each training stage, individual nodes are either &quot;dropped out&quot; of the net (ignored) with probability &lt;math&gt;1-p&lt;/math&gt; or kept with probability &lt;math&gt;p&lt;/math&gt;, so that a reduced network is left; incoming and outgoing edges to a dropped-out node are also removed. Only the reduced network is trained on the data in that stage. The removed nodes are then reinserted into the network with their original weights.<br /> <br /> In the training stages, &lt;math&gt;p&lt;/math&gt; is usually 0.5; for input nodes, it is typically much higher because information is directly lost when input nodes are ignored.<br /> <br /> At testing time after training has finished, we would ideally like to find a sample average of all possible &lt;math&gt;2^n&lt;/math&gt; dropped-out networks; unfortunately this is unfeasible for large values of &lt;math&gt;n&lt;/math&gt;. However, we can find an approximation by using the full network with each node's output weighted by a factor of &lt;math&gt;p&lt;/math&gt;, so the [[expected value]] of the output of any node is the same as in the training stages. 
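<br /> <br /> A minimal sketch of this scheme (keeping each unit with probability &lt;math&gt;p&lt;/math&gt; during training and multiplying activations by &lt;math&gt;p&lt;/math&gt; at test time), written in plain NumPy with made-up layer sizes:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import numpy as np

rng = np.random.default_rng(0)
p = 0.5                                  # probability of keeping a unit

def layer_train(x, W, b):
    """One training pass: each hidden unit is kept with probability p."""
    h = np.maximum(0.0, x @ W + b)       # ReLU activations
    mask = rng.random(h.shape) < p       # drop units independently
    return h * mask

def layer_test(x, W, b):
    """Test time: no units are dropped; outputs are scaled by p instead,
    so their expected value matches the training-time behaviour."""
    return p * np.maximum(0.0, x @ W + b)

x = rng.standard_normal((4, 10))         # a mini-batch of 4 examples (made-up sizes)
W = rng.standard_normal((10, 8))
b = np.zeros(8)
print(layer_train(x, W, b).shape, layer_test(x, W, b).shape)  # (4, 8) (4, 8)
&lt;/syntaxhighlight&gt;<br /> <br /> 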
This weight-scaling approximation is the biggest practical contribution of the dropout method: although it effectively generates &lt;math&gt;2^n&lt;/math&gt; neural nets, and as such allows for model combination, at test time only a single network needs to be evaluated.<br /> <br /> By avoiding training all nodes on all training data, dropout decreases overfitting. The method also significantly improves training speed. This makes the model combination practical, even for [[deep neural network]]s. The technique seems to reduce node interactions, leading them to learn more robust features{{Clarify|reason=|date=December 2018}} that better generalize to new data.<br /> <br /> ==== DropConnect ====<br /> <br /> DropConnect is a generalization of dropout in which each connection, rather than each output unit, can be dropped with probability &lt;math&gt;1-p&lt;/math&gt;. Each unit thus receives input from a random subset of units in the previous layer.&lt;ref&gt;{{cite journal |title=Regularization of Neural Networks using DropConnect {{!}} ICML 2013 {{!}} JMLR W&amp;CP |pages=1058–1066 |url=http://proceedings.mlr.press/v28/wan13.html |website=jmlr.org |access-date=2015-12-17 |date=2013-02-13 |archive-date=2017-08-12 |archive-url=https://web.archive.org/web/20170812080411/http://proceedings.mlr.press/v28/wan13.html |url-status=live }}&lt;/ref&gt;<br /> <br /> DropConnect is similar to dropout as it introduces dynamic sparsity within the model, but differs in that the sparsity is on the weights, rather than the output vectors of a layer. In other words, the fully connected layer with DropConnect becomes a sparsely connected layer in which the connections are chosen at random during the training stage.<br /> <br /> ==== Stochastic pooling ====<br /> A major drawback of dropout is that it does not have the same benefits for convolutional layers, where the neurons are not fully connected.<br /> <br /> Even before dropout was published, in 2013 a technique called stochastic pooling was introduced,&lt;ref&gt;{{cite arXiv |title=Stochastic Pooling for Regularization of Deep Convolutional Neural Networks |eprint=1301.3557 |date=2013-01-15 |first1=Matthew D. |last1=Zeiler |first2=Rob |last2=Fergus |class=cs.LG}}&lt;/ref&gt; in which the conventional [[deterministic algorithm|deterministic]] pooling operations are replaced with a stochastic procedure: the activation within each pooling region is picked randomly, according to a [[multinomial distribution]] given by the activities within the pooling region. This approach is free of hyperparameters and can be combined with other regularization approaches, such as dropout and [[data augmentation]].<br /> <br /> An alternate view of stochastic pooling is that it is equivalent to standard max pooling but with many copies of an input image, each having small local [[deformation theory|deformations]]. This is similar to explicit [[elastic deformation]]s of the input images,&lt;ref name=&quot;:3&quot;/&gt; which delivers excellent performance on the [[MNIST database|MNIST data set]].&lt;ref name=&quot;:3&quot;&gt;{{cite journal |title=Best Practices for Convolutional Neural Networks Applied to Visual Document Analysis – Microsoft Research |url=https://www.microsoft.com/en-us/research/publication/best-practices-for-convolutional-neural-networks-applied-to-visual-document-analysis/?from=http%3A%2F%2Fresearch.microsoft.com%2Fapps%2Fpubs%2F%3Fid%3D68920 |journal=Microsoft Research |access-date=2015-12-17 |date=August 2003 |last1=Platt |first1=John |last2=Steinkraus |first2=Dave |last3=Simard |first3=Patrice Y. 
|archive-date=2017-11-07 |archive-url=https://web.archive.org/web/20171107112839/https://www.microsoft.com/en-us/research/publication/best-practices-for-convolutional-neural-networks-applied-to-visual-document-analysis/?from=http%3A%2F%2Fresearch.microsoft.com%2Fapps%2Fpubs%2F%3Fid%3D68920 |url-status=live }}&lt;/ref&gt; Using stochastic pooling in a multilayer model gives an exponential number of deformations since the selections in higher layers are independent of those below.<br /> <br /> ==== Artificial data ====<br /> {{Main|Data augmentation}}<br /> Because the degree of model overfitting is determined by both its power and the amount of training it receives, providing a convolutional network with more training examples can reduce overfitting. Because there is often not enough available data to train, especially considering that some part should be spared for later testing, two approaches are to either generate new data from scratch (if possible) or perturb existing data to create new ones. The latter one is used since mid-1990s.&lt;ref name=&quot;lecun95&quot; /&gt; For example, input images can be cropped, rotated, or rescaled to create new examples with the same labels as the original training set.&lt;ref&gt;{{cite arXiv |title=Improving neural networks by preventing co-adaptation of feature detectors |eprint=1207.0580 |last1=Hinton |first1=Geoffrey E. |last2=Srivastava |first2=Nitish |last3=Krizhevsky |first3=Alex |last4=Sutskever |first4=Ilya |last5=Salakhutdinov |first5=Ruslan R. |class=cs.NE |year=2012}}&lt;/ref&gt;<br /> <br /> === Explicit ===<br /> <br /> ==== Early stopping ====<br /> {{Main|Early stopping}}<br /> One of the simplest methods to prevent overfitting of a network is to simply stop the training before overfitting has had a chance to occur. It comes with the disadvantage that the learning process is halted.<br /> <br /> ==== Number of parameters ====<br /> Another simple way to prevent overfitting is to limit the number of parameters, typically by limiting the number of hidden units in each layer or limiting network depth. For convolutional networks, the filter size also affects the number of parameters. Limiting the number of parameters restricts the predictive power of the network directly, reducing the complexity of the function that it can perform on the data, and thus limits the amount of overfitting. This is equivalent to a &quot;[[zero norm]]&quot;.<br /> <br /> ==== Weight decay ====<br /> A simple form of added regularizer is weight decay, which simply adds an additional error, proportional to the sum of weights ([[L1-norm|L1 norm]]) or squared magnitude ([[L2 norm]]) of the weight vector, to the error at each node. The level of acceptable model complexity can be reduced by increasing the proportionality constant('alpha' hyperparameter), thus increasing the penalty for large weight vectors.<br /> <br /> L2 regularization is the most common form of regularization. It can be implemented by penalizing the squared magnitude of all parameters directly in the objective. The L2 regularization has the intuitive interpretation of heavily penalizing peaky weight vectors and preferring diffuse weight vectors. Due to multiplicative interactions between weights and inputs this has the useful property of encouraging the network to use all of its inputs a little rather than some of its inputs a lot.<br /> <br /> L1 regularization is also common. It makes the weight vectors sparse during optimization. 
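<br /> <br /> A minimal sketch of how the L2 and L1 penalties can be added to a training objective, assuming PyTorch; the network, the data and the weighting constants are made up for illustration:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                      nn.Flatten(), nn.Linear(8 * 32 * 32, 10))
criterion = nn.CrossEntropyLoss()
x, y = torch.randn(4, 3, 32, 32), torch.randint(0, 10, (4,))   # made-up data

alpha_l2, alpha_l1 = 1e-4, 1e-5           # proportionality constants ("alpha" hyperparameters)
data_loss = criterion(model(x), y)
l2_penalty = sum((w ** 2).sum() for w in model.parameters())   # squared magnitude of all weights
l1_penalty = sum(w.abs().sum() for w in model.parameters())    # absolute magnitude of all weights
loss = data_loss + alpha_l2 * l2_penalty + alpha_l1 * l1_penalty
loss.backward()                           # gradients now include the regularization terms

# In practice the L2 term is often applied through the optimizer instead, e.g.
# torch.optim.SGD(model.parameters(), lr=0.01, weight_decay=1e-4)
&lt;/syntaxhighlight&gt;<br /> <br /> 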
In other words, neurons with L1 regularization end up using only a sparse subset of their most important inputs and become nearly invariant to the noisy inputs. L1 with L2 regularization can be combined; this is called [[elastic net regularization]].<br /> <br /> ==== Max norm constraints ====<br /> Another form of regularization is to enforce an absolute upper bound on the magnitude of the weight vector for every neuron and use [[sparse approximation#Projected Gradient Descent|projected gradient descent]] to enforce the constraint. In practice, this corresponds to performing the parameter update as normal, and then enforcing the constraint by clamping the weight vector &lt;math&gt;\vec{w}&lt;/math&gt; of every neuron to satisfy &lt;math&gt;\|\vec{w}\|_{2}&lt;c&lt;/math&gt;. Typical values of &lt;math&gt;c&lt;/math&gt; are order of 3–4. Some papers report improvements&lt;ref&gt;{{cite web |title=Dropout: A Simple Way to Prevent Neural Networks from Overfitting |url=https://jmlr.org/papers/v15/srivastava14a.html |website=jmlr.org |access-date=2015-12-17 |archive-date=2016-03-05 |archive-url=https://web.archive.org/web/20160305010425/http://jmlr.org/papers/v15/srivastava14a.html |url-status=live }}&lt;/ref&gt; when using this form of regularization.<br /> <br /> == Hierarchical coordinate frames ==<br /> Pooling loses the precise spatial relationships between high-level parts (such as nose and mouth in a face image). These relationships are needed for identity recognition. Overlapping the pools so that each feature occurs in multiple pools, helps retain the information. Translation alone cannot extrapolate the understanding of geometric relationships to a radically new viewpoint, such as a different orientation or scale. On the other hand, people are very good at extrapolating; after seeing a new shape once they can recognize it from a different viewpoint.&lt;ref&gt;{{cite journal |last1=Hinton |first1=Geoffrey |year=1979 |title=Some demonstrations of the effects of structural descriptions in mental imagery |journal=Cognitive Science |volume=3 |issue=3 |pages=231–250 |doi=10.1016/s0364-0213(79)80008-7}}&lt;/ref&gt;<br /> <br /> An earlier common way to deal with this problem is to train the network on transformed data in different orientations, scales, lighting, etc. so that the network can cope with these variations. This is computationally intensive for large data-sets. The alternative is to use a hierarchy of coordinate frames and use a group of neurons to represent a conjunction of the shape of the feature and its pose relative to the [[retina]]. The pose relative to the retina is the relationship between the coordinate frame of the retina and the intrinsic features' coordinate frame.&lt;ref&gt;Rock, Irvin. &quot;The frame of reference.&quot; The legacy of Solomon Asch: Essays in cognition and social psychology (1990): 243–268.&lt;/ref&gt;<br /> <br /> Thus, one way to represent something is to embed the coordinate frame within it. This allows large features to be recognized by using the consistency of the poses of their parts (e.g. nose and mouth poses make a consistent prediction of the pose of the whole face). This approach ensures that the higher-level entity (e.g. face) is present when the lower-level (e.g. nose and mouth) agree on its prediction of the pose. 
The vectors of neuronal activity that represent pose (&quot;pose vectors&quot;) allow spatial transformations modeled as linear operations that make it easier for the network to learn the hierarchy of visual entities and generalize across viewpoints. This is similar to the way the human [[visual system]] imposes coordinate frames in order to represent shapes.&lt;ref&gt;J. Hinton, Coursera lectures on Neural Networks, 2012, Url: https://www.coursera.org/learn/neural-networks {{Webarchive|url=https://web.archive.org/web/20161231174321/https://www.coursera.org/learn/neural-networks |date=2016-12-31}}&lt;/ref&gt;<br /> <br /> == Applications ==<br /> <br /> === Image recognition ===<br /> CNNs are often used in [[image recognition]] systems. In 2012, an [[per-comparison error rate|error rate]] of 0.23% on the [[MNIST database]] was reported.&lt;ref name=&quot;mcdns&quot;/&gt; Another paper on using CNN for image classification reported that the learning process was &quot;surprisingly fast&quot;; in the same paper, the best published results as of 2011 were achieved in the MNIST database and the NORB database.&lt;ref name=&quot;flexible&quot;/&gt; Subsequently, a similar CNN called<br /> [[AlexNet]]&lt;ref name=quartz&gt;{{cite web<br /> |website=[[Quartz (website)|Quartz]]<br /> |author=Dave Gershgorn<br /> |title=The inside story of how AI got good enough to dominate Silicon Valley<br /> |url=https://qz.com/1307091/the-inside-story-of-how-ai-got-good-enough-to-dominate-silicon-valley/<br /> |date=18 June 2018<br /> |access-date=5 October 2018<br /> |archive-date=12 December 2019<br /> |archive-url=https://web.archive.org/web/20191212224842/https://qz.com/1307091/the-inside-story-of-how-ai-got-good-enough-to-dominate-silicon-valley/<br /> |url-status=live<br /> }}&lt;/ref&gt; won the [[ImageNet Large Scale Visual Recognition Challenge]] 2012.<br /> <br /> When applied to [[facial recognition system|facial recognition]], CNNs achieved a large decrease in error rate.&lt;ref&gt;{{cite journal |last=Lawrence |first=Steve |author2=C. Lee Giles |author3=Ah Chung Tsoi |author4=Andrew D. 
Back |title=Face Recognition: A Convolutional Neural Network Approach |journal=IEEE Transactions on Neural Networks |year=1997 |volume=8 |issue=1 |pages=98–113 |citeseerx=10.1.1.92.5813 |doi=10.1109/72.554195 |pmid=18255614|s2cid=2883848 }}&lt;/ref&gt; Another paper reported a 97.6% recognition rate on &quot;5,600 still images of more than 10 subjects&quot;.&lt;ref name=&quot;robust face detection&quot;/&gt; CNNs were used to assess [[video quality]] in an objective way after manual training; the resulting system had a very low [[root mean square error]].&lt;ref name=&quot;video quality&quot;&gt;{{cite journal |last=Le Callet |first=Patrick |author2=Christian Viard-Gaudin |author3=Dominique Barba |year=2006 |title=A Convolutional Neural Network Approach for Objective Video Quality Assessment |url=https://hal.archives-ouvertes.fr/file/index/docid/287426/filename/A_convolutional_neural_network_approach_for_objective_video_quality_assessment_completefinal_manuscript.pdf |journal=IEEE Transactions on Neural Networks |volume=17 |issue=5 |pages=1316–1327 |doi=10.1109/TNN.2006.879766 |pmid=17001990 |s2cid=221185563 |access-date=17 November 2013 |archive-date=24 February 2021 |archive-url=https://web.archive.org/web/20210224123804/https://hal.archives-ouvertes.fr/file/index/docid/287426/filename/A_convolutional_neural_network_approach_for_objective_video_quality_assessment_completefinal_manuscript.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> The [[ImageNet Large Scale Visual Recognition Challenge]] is a benchmark in object classification and detection, with millions of images and hundreds of object classes. In the ILSVRC 2014,&lt;ref name=&quot;ILSVRC2014&quot;&gt;{{cite web |url=https://image-net.org/challenges/LSVRC/2014/results |title=ImageNet Large Scale Visual Recognition Competition 2014 (ILSVRC2014) |access-date=30 January 2016 |archive-date=5 February 2016 |archive-url=https://web.archive.org/web/20160205153105/http://www.image-net.org/challenges/LSVRC/2014/results |url-status=live }}&lt;/ref&gt; a large-scale visual recognition challenge, almost every highly ranked team used CNN as their basic framework. The winner [[GoogLeNet]]&lt;ref name=googlenet&gt;{{cite conference<br /> | last1 = Szegedy | first1 = Christian<br /> | last2 = Liu | first2 = Wei<br /> | last3 = Jia | first3 = Yangqing<br /> | last4 = Sermanet | first4 = Pierre<br /> | last5 = Reed | first5 = Scott E.<br /> | last6 = Anguelov | first6 = Dragomir<br /> | last7 = Erhan | first7 = Dumitru<br /> | last8 = Vanhoucke | first8 = Vincent<br /> | last9 = Rabinovich | first9 = Andrew<br /> | arxiv = 1409.4842<br /> | contribution = Going deeper with convolutions<br /> | doi = 10.1109/CVPR.2015.7298594<br /> | pages = 1–9<br /> | publisher = IEEE Computer Society<br /> | title = IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2015, Boston, MA, USA, June 7–12, 2015<br /> | year = 2015}}&lt;/ref&gt; (the foundation of [[DeepDream]]) increased the mean average [[precision and recall|precision]] of object detection to 0.439329, and reduced classification error to 0.06656, the best result to date. Its network applied more than 30 layers. 
That performance of convolutional neural networks on the ImageNet tests was close to that of humans.&lt;ref&gt;{{cite arXiv |eprint=1409.0575 |last1=Russakovsky |first1=Olga |title=Image ''Net'' Large Scale Visual Recognition Challenge |last2=Deng |first2=Jia |last3=Su |first3=Hao |last4=Krause |first4=Jonathan |last5=Satheesh |first5=Sanjeev |last6=Ma |first6=Sean |last7=Huang |first7=Zhiheng |last8=Karpathy |first8=Andrej |author-link8=Andrej Karpathy |last9=Khosla |first9=Aditya |last10=Bernstein |first10=Michael |last11=Berg |first11=Alexander C. |last12=Fei-Fei |first12=Li |class=cs.CV |year=2014 |author1-link=Olga Russakovsky}}&lt;/ref&gt; The best algorithms still struggle with objects that are small or thin, such as a small ant on a stem of a flower or a person holding a quill in their hand. They also have trouble with images that have been distorted with filters, an increasingly common phenomenon with modern digital cameras. By contrast, those kinds of images rarely trouble humans. Humans, however, tend to have trouble with other issues. For example, they are not good at classifying objects into fine-grained categories such as the particular breed of dog or species of bird, whereas convolutional neural networks handle this.{{citation needed|date=June 2019}}<br /> <br /> In 2015, a many-layered CNN demonstrated the ability to spot faces from a wide range of angles, including upside down, even when partially occluded, with competitive performance. The network was trained on a database of 200,000 images that included faces at various angles and orientations and a further 20 million images without faces. They used batches of 128 images over 50,000 iterations.&lt;ref&gt;{{cite news |url=https://www.technologyreview.com/2015/02/16/169357/the-face-detection-algorithm-set-to-revolutionize-image-search/ |title=The Face Detection Algorithm Set To Revolutionize Image Search |date=February 16, 2015 |work=Technology Review |access-date=27 October 2017 |archive-date=20 September 2020 |archive-url=https://web.archive.org/web/20200920130711/https://www.technologyreview.com/2015/02/16/169357/the-face-detection-algorithm-set-to-revolutionize-image-search/ |url-status=live }}&lt;/ref&gt;<br /> <br /> === Video analysis ===<br /> Compared to image data domains, there is relatively little work on applying CNNs to video classification. Video is more complex than images since it has another (temporal) dimension. However, some extensions of CNNs into the video domain have been explored. 
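<br /> <br /> The extensions described below start from the fact that a video clip adds a temporal axis to the usual image tensor, so the convolution itself can be extended to slide along time as well. A minimal sketch, assuming PyTorch and made-up clip dimensions:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

# A batch of 2 clips: 3 color channels, 16 frames of 64x64 pixels (made-up sizes).
clips = torch.randn(2, 3, 16, 64, 64)

# A 3D convolution slides its kernel over time as well as over the two spatial axes.
spatiotemporal = nn.Conv3d(in_channels=3, out_channels=8, kernel_size=(3, 3, 3), padding=1)
print(spatiotemporal(clips).shape)   # torch.Size([2, 8, 16, 64, 64])
&lt;/syntaxhighlight&gt;<br /> <br /> 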
One approach is to treat space and time as equivalent dimensions of the input and perform convolutions in both time and space.&lt;ref&gt;{{cite book |publisher=Springer Berlin Heidelberg |date=2011-11-16 |isbn=978-3-642-25445-1 |pages=29–39 |series=Lecture Notes in Computer Science |first1=Moez |last1=Baccouche |first2=Franck |last2=Mamalet |first3=Christian |last3=Wolf |first4=Christophe |last4=Garcia |first5=Atilla |last5=Baskurt |editor-first=Albert Ali |editor-last=Salah |editor-first2=Bruno |editor-last2=Lepri |doi=10.1007/978-3-642-25446-8_4 |chapter=Sequential Deep Learning for Human Action Recognition |title=Human Behavior Unterstanding |volume=7065 |citeseerx=10.1.1.385.4740}}&lt;/ref&gt;&lt;ref&gt;{{cite journal |title=3D Convolutional Neural Networks for Human Action Recognition |journal=IEEE Transactions on Pattern Analysis and Machine Intelligence |date=2013-01-01 |issn=0162-8828 |pages=221–231 |volume=35 |issue=1 |doi=10.1109/TPAMI.2012.59 |pmid=22392705 |first1=Shuiwang |last1=Ji |first2=Wei |last2=Xu |first3=Ming |last3=Yang |first4=Kai |last4=Yu |citeseerx=10.1.1.169.4046 |s2cid=1923924}}&lt;/ref&gt; Another way is to fuse the features of two convolutional neural networks, one for the spatial and one for the temporal stream.&lt;ref&gt;{{cite arXiv |last1=Huang |first1=Jie |last2=Zhou |first2=Wengang |last3=Zhang |first3=Qilin |last4=Li |first4=Houqiang |last5=Li |first5=Weiping |title=Video-based Sign Language Recognition without Temporal Segmentation |eprint=1801.10111 |class=cs.CV |year=2018}}&lt;/ref&gt;&lt;ref&gt;Karpathy, Andrej, et al. &quot;[https://www.cv-foundation.org/openaccess/content_cvpr_2014/papers/Karpathy_Large-scale_Video_Classification_2014_CVPR_paper.pdf Large-scale video classification with convolutional neural networks] {{Webarchive|url=https://web.archive.org/web/20190806022753/https://www.cv-foundation.org/openaccess/content_cvpr_2014/papers/Karpathy_Large-scale_Video_Classification_2014_CVPR_paper.pdf |date=2019-08-06 }}.&quot; IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2014.&lt;/ref&gt;&lt;ref&gt;{{cite arXiv |eprint=1406.2199 |last1=Simonyan |first1=Karen |title=Two-Stream Convolutional Networks for Action Recognition in Videos |last2=Zisserman |first2=Andrew |class=cs.CV |year=2014}} (2014).&lt;/ref&gt; [[Long short-term memory]] (LSTM) [[recurrent neural network|recurrent]] units are typically incorporated after the CNN to account for inter-frame or inter-clip dependencies.&lt;ref name=&quot;Wang Duan Zhang Niu p=1657&quot;&gt;{{cite journal |last1=Wang |first1=Le |last2=Duan |first2=Xuhuan |last3=Zhang |first3=Qilin |last4=Niu |first4=Zhenxing |last5=Hua |first5=Gang |last6=Zheng |first6=Nanning |title=Segment-Tube: Spatio-Temporal Action Localization in Untrimmed Videos with Per-Frame Segmentation |journal=Sensors |volume=18 |issue=5 |date=2018-05-22 |issn=1424-8220 |doi=10.3390/s18051657 |pmid=29789447 |pmc=5982167 |page=1657 |bibcode=2018Senso..18.1657W |url=https://qilin-zhang.github.io/_pages/pdfs/Segment-Tube_Spatio-Temporal_Action_Localization_in_Untrimmed_Videos_with_Per-Frame_Segmentation.pdf |doi-access=free |access-date=2018-09-14 |archive-date=2021-03-01 |archive-url=https://web.archive.org/web/20210301195518/https://qilin-zhang.github.io/_pages/pdfs/Segment-Tube_Spatio-Temporal_Action_Localization_in_Untrimmed_Videos_with_Per-Frame_Segmentation.pdf |url-status=live }}&lt;/ref&gt;&lt;ref name=&quot;Duan Wang Zhai Zheng 2018 p. 
&quot;&gt;{{cite conference |last1=Duan |first1=Xuhuan |last2=Wang |first2=Le |last3=Zhai |first3=Changbo |last4=Zheng |first4=Nanning |last5=Zhang |first5=Qilin |last6=Niu |first6=Zhenxing |last7=Hua |first7=Gang |title=2018 25th IEEE International Conference on Image Processing (ICIP) |chapter=Joint Spatio-Temporal Action Localization in Untrimmed Videos with Per-Frame Segmentation |publisher=25th IEEE International Conference on Image Processing (ICIP) |year=2018 |pages=918–922 |isbn=978-1-4799-7061-2 |doi=10.1109/icip.2018.8451692}}&lt;/ref&gt; [[Unsupervised learning]] schemes for training spatio-temporal features have been introduced, based on Convolutional Gated Restricted [[Boltzmann machine|Boltzmann Machines]]&lt;ref&gt;{{cite conference |title=Convolutional Learning of Spatio-temporal Features |url=https://dl.acm.org/doi/10.5555/1888212 |publisher=Springer-Verlag |conference=Proceedings of the 11th European Conference on Computer Vision: Part VI |date=2010-01-01 |location=Berlin, Heidelberg |isbn=978-3-642-15566-6 |pages=140–153 |series=ECCV'10 |first1=Graham W. |last1=Taylor |first2=Rob |last2=Fergus |first3=Yann |last3=LeCun |first4=Christoph |last4=Bregler |access-date=2022-03-31 |archive-date=2022-03-31 |archive-url=https://web.archive.org/web/20220331211137/https://dl.acm.org/doi/10.5555/1888212 |url-status=live }}&lt;/ref&gt; and Independent Subspace Analysis.&lt;ref&gt;{{cite book |publisher=IEEE Computer Society |date=2011-01-01 |location=Washington, DC, USA |isbn=978-1-4577-0394-2 |pages=3361–3368 |series=CVPR '11 |doi=10.1109/CVPR.2011.5995496 |first1=Q. V. |last1=Le |first2=W. Y. |last2=Zou |first3=S. Y. |last3=Yeung |first4=A. Y. |last4=Ng |title=CVPR 2011 |chapter=Learning hierarchical invariant spatio-temporal features for action recognition with independent subspace analysis |citeseerx=10.1.1.294.5948 |s2cid=6006618}}&lt;/ref&gt; These techniques have also found application in [[Text-to-Video model]]s.&lt;ref&gt;{{Cite web |title=Leading India.ai |url=https://www.leadingindia.ai/downloads/projects/VP/vp_16.pdf |access-date=2022-10-13 |archive-date=2022-10-14 |archive-url=https://web.archive.org/web/20221014091907/https://www.leadingindia.ai/downloads/projects/VP/vp_16.pdf |url-status=live }}&lt;/ref&gt; <br /> <br /> === Natural language processing ===<br /> CNNs have also been explored for [[natural language processing]]. 
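<br /> <br /> In text applications the convolution is typically one-dimensional and slides over a sequence of word embeddings rather than over image pixels. A minimal sketch of such a text classifier, assuming PyTorch and made-up vocabulary, embedding and batch sizes:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

vocab_size, embed_dim, num_classes = 5000, 64, 2            # illustrative sizes
embedding = nn.Embedding(vocab_size, embed_dim)             # token ids -> embedding vectors
conv = nn.Conv1d(embed_dim, 100, kernel_size=3, padding=1)  # 100 detectors over 3-token windows
classifier = nn.Linear(100, num_classes)

tokens = torch.randint(0, vocab_size, (8, 20))              # batch of 8 sentences, 20 tokens each
emb = embedding(tokens).transpose(1, 2)                     # Conv1d expects (batch, channels, length)
features = torch.relu(conv(emb)).max(dim=2).values          # max-over-time pooling -> (8, 100)
logits = classifier(features)
print(logits.shape)                                         # torch.Size([8, 2])
&lt;/syntaxhighlight&gt;<br /> <br /> 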
CNN models are effective for various NLP problems and achieved excellent results in [[semantic parsing]],&lt;ref&gt;{{cite arXiv |title=A Deep Architecture for Semantic Parsing |eprint=1404.7296 |date=2014-04-29 |first1=Edward |last1=Grefenstette |first2=Phil |last2=Blunsom |first3=Nando |last3=de Freitas |first4=Karl Moritz |last4=Hermann |class=cs.CL}}&lt;/ref&gt; search query retrieval,&lt;ref&gt;{{cite journal |title=Learning Semantic Representations Using Convolutional Neural Networks for Web Search – Microsoft Research |url=https://www.microsoft.com/en-us/research/publication/learning-semantic-representations-using-convolutional-neural-networks-for-web-search/?from=http%3A%2F%2Fresearch.microsoft.com%2Fapps%2Fpubs%2Fdefault.aspx%3Fid%3D214617 |journal=Microsoft Research |access-date=2015-12-17 |date=April 2014 |last1=Mesnil |first1=Gregoire |last2=Deng |first2=Li |last3=Gao |first3=Jianfeng |last4=He |first4=Xiaodong |last5=Shen |first5=Yelong |archive-date=2017-09-15 |archive-url=https://web.archive.org/web/20170915160617/https://www.microsoft.com/en-us/research/publication/learning-semantic-representations-using-convolutional-neural-networks-for-web-search/?from=http%3A%2F%2Fresearch.microsoft.com%2Fapps%2Fpubs%2Fdefault.aspx%3Fid%3D214617 |url-status=live }}&lt;/ref&gt; sentence modeling,&lt;ref&gt;{{cite arXiv |title=A Convolutional Neural Network for Modelling Sentences |eprint=1404.2188 |date=2014-04-08 |first1=Nal |last1=Kalchbrenner |first2=Edward |last2=Grefenstette |first3=Phil |last3=Blunsom |class=cs.CL}}&lt;/ref&gt; classification,&lt;ref&gt;{{cite arXiv |title=Convolutional Neural Networks for Sentence Classification |eprint=1408.5882 |date=2014-08-25 |first=Yoon |last=Kim |class=cs.CL}}&lt;/ref&gt; prediction&lt;ref&gt;Collobert, Ronan, and Jason Weston. &quot;[https://thetalkingmachines.com/sites/default/files/2018-12/unified_nlp.pdf A unified architecture for natural language processing: Deep neural networks with multitask learning] {{Webarchive|url=https://web.archive.org/web/20190904161653/https://thetalkingmachines.com/sites/default/files/2018-12/unified_nlp.pdf |date=2019-09-04 }}.&quot;Proceedings of the 25th international conference on Machine learning. ACM, 2008.&lt;/ref&gt; and other traditional NLP tasks.&lt;ref&gt;{{cite arXiv |title=Natural Language Processing (almost) from Scratch |eprint=1103.0398 |date=2011-03-02 |first1=Ronan |last1=Collobert |first2=Jason |last2=Weston |first3=Leon |last3=Bottou |first4=Michael |last4=Karlen |first5=Koray |last5=Kavukcuoglu |first6=Pavel |last6=Kuksa |class=cs.LG}}&lt;/ref&gt;<br /> Compared to traditional language processing methods such as [[recurrent neural networks]], CNNs can represent different contextual realities of language that do not rely on a series-sequence assumption, while RNNs are better suitable when classical time series modeling is required.&lt;ref&gt;{{cite arXiv |title=Comparative study of CNN and RNN for natural language processing |eprint=1702.01923 |date=2017-03-02 |first1=W |last1=Yin |first2=K |last2=Kann |first3=M |last3=Yu |first4=H |last4=Schütze |class=cs.LG}}&lt;/ref&gt;<br /> &lt;ref&gt;{{cite arXiv |title=An empirical evaluation of generic convolutional and recurrent networks for sequence modeling |eprint=1803.01271 |first1=S. |last1=Bai |first2=J.S. |last2=Kolter |first3=V. 
|last3=Koltun |year=2018 |class=cs.LG}}&lt;/ref&gt;<br /> &lt;ref&gt;{{cite journal |title=Detecting dynamics of action in text with a recurrent neural network |journal=Neural Computing and Applications |year=2021 |volume=33 |last1=Gruber |first1=N. |issue=12 |pages=15709–15718 |doi=10.1007/S00521-021-06190-5 |s2cid=236307579 |url=https://www.semanticscholar.org/paper/Detecting-dynamics-of-action-in-text-with-a-neural-Gruber/cd6c9da2e8c52b043faf05ccc2511a07c54ead0c |access-date=2021-10-10 |archive-date=2021-10-10 |archive-url=https://web.archive.org/web/20211010125453/https://www.semanticscholar.org/paper/Detecting-dynamics-of-action-in-text-with-a-neural-Gruber/cd6c9da2e8c52b043faf05ccc2511a07c54ead0c |url-status=live }}&lt;/ref&gt; &lt;ref&gt;{{cite journal |title=Approximation Theory of Convolutional Architectures for Time Series Modelling |journal=International Conference on Machine Learning |year=2021 |last1=Haotian |first1=J. |last2=Zhong |first2=Li |last3=Qianxiao |first3=Li |arxiv=2107.09355}}&lt;/ref&gt;<br /> <br /> === Anomaly Detection ===<br /> A CNN with 1-D convolutions was used on time series in the frequency domain (spectral residual) by an unsupervised model to detect anomalies in the time domain.&lt;ref&gt;{{cite conference |title=Time-Series Anomaly Detection Service at Microsoft {{!}} Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery &amp; Data Mining|language=EN|arxiv=1906.03821|last1=Ren|first1=Hansheng|last2=Xu|first2=Bixiong|last3=Wang|first3=Yujing|last4=Yi|first4=Chao|last5=Huang|first5=Congrui|last6=Kou|first6=Xiaoyu|last7=Xing|first7=Tony|last8=Yang|first8=Mao|last9=Tong|first9=Jie|last10=Zhang|first10=Qi|year=2019|doi=10.1145/3292500.3330680|s2cid=182952311}}&lt;/ref&gt;<br /> <br /> === Drug discovery ===<br /> CNNs have been used in [[drug discovery]]. Predicting the interaction between molecules and biological [[protein]]s can identify potential treatments. In 2015, Atomwise introduced AtomNet, the first deep learning neural network for [[structure-based drug design]].&lt;ref&gt;{{cite arXiv |title=AtomNet: A Deep Convolutional Neural Network for Bioactivity Prediction in Structure-based Drug Discovery |eprint=1510.02855 |date=2015-10-09 |first1=Izhar |last1=Wallach |first2=Michael |last2=Dzamba |first3=Abraham |last3=Heifets |class=cs.LG}}&lt;/ref&gt; The system trains directly on 3-dimensional representations of chemical interactions. Similar to how image recognition networks learn to compose smaller, spatially proximate features into larger, complex structures,&lt;ref&gt;{{cite arXiv |title=Understanding Neural Networks Through Deep Visualization |eprint=1506.06579 |date=2015-06-22 |first1=Jason |last1=Yosinski |first2=Jeff |last2=Clune |first3=Anh |last3=Nguyen |first4=Thomas |last4=Fuchs |first5=Hod |last5=Lipson |class=cs.CV}}&lt;/ref&gt; AtomNet discovers chemical features, such as [[aromaticity]], [[orbital hybridisation|sp&lt;sup&gt;3&lt;/sup&gt; carbons]], and [[hydrogen bond]]ing. 
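<br /> <br /> The general idea of learning directly from 3-dimensional structural data can be illustrated as a 3D convolution over a voxelized representation of a molecular complex. The following toy sketch (assuming PyTorch, with made-up grid size and channels) only illustrates that idea and is not AtomNet's actual architecture:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

# A voxel grid around a binding site: 8 channels of atom-type occupancies on a 24^3 grid (made-up).
complex_grid = torch.randn(1, 8, 24, 24, 24)

scorer = nn.Sequential(
    nn.Conv3d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool3d(2),                       # 24^3 -> 12^3
    nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool3d(2),                       # 12^3 -> 6^3
    nn.Flatten(),
    nn.Linear(32 * 6 * 6 * 6, 1),          # a single predicted activity score
)
print(scorer(complex_grid).shape)          # torch.Size([1, 1])
&lt;/syntaxhighlight&gt;<br /> <br /> 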
Subsequently, AtomNet was used to predict novel candidate [[biomolecule]]s for multiple disease targets, most notably treatments for the [[Ebola virus]]&lt;ref&gt;{{cite news |title=Toronto startup has a faster way to discover effective medicines |url=https://www.theglobeandmail.com/report-on-business/small-business/starting-out/toronto-startup-has-a-faster-way-to-discover-effective-medicines/article25660419/ |website=The Globe and Mail |access-date=2015-11-09 |archive-date=2015-10-20 |archive-url=https://web.archive.org/web/20151020040115/http://www.theglobeandmail.com/report-on-business/small-business/starting-out/toronto-startup-has-a-faster-way-to-discover-effective-medicines/article25660419/ |url-status=live }}&lt;/ref&gt; and [[multiple sclerosis]].&lt;ref&gt;{{cite web |title=Startup Harnesses Supercomputers to Seek Cures |url=https://www.kqed.org/futureofyou/3461/startup-harnesses-supercomputers-to-seek-cures |website=KQED Future of You |access-date=2015-11-09 |language=en-us |date=2015-05-27 |archive-date=2018-12-06 |archive-url=https://web.archive.org/web/20181206234956/https://www.kqed.org/futureofyou/3461/startup-harnesses-supercomputers-to-seek-cures |url-status=live }}&lt;/ref&gt;<br /> <br /> === Checkers game ===<br /> CNNs have been used in the game of [[draughts|checkers]]. From 1999 to 2001, [[David B. Fogel|Fogel]] and Chellapilla published papers showing how a convolutional neural network could learn to play checkers using co-evolution. The learning process did not use prior human professional games, but rather focused on a minimal set of information contained in the checkerboard: the location and type of pieces, and the difference in number of pieces between the two sides. Ultimately, the program ([[Blondie24]]) was tested on 165 games against players and ranked in the highest 0.4%.&lt;ref&gt;{{cite journal |pmid=18252639 |doi=10.1109/72.809083 |volume=10 |issue=6 |title=Evolving neural networks to play checkers without relying on expert knowledge |journal=IEEE Trans Neural Netw |pages=1382–91 |last1=Chellapilla |first1=K |last2=Fogel |first2=DB |year=1999}}&lt;/ref&gt;&lt;ref&gt;{{cite journal |doi=10.1109/4235.942536 |title=Evolving an expert checkers playing program without using human expertise |journal=IEEE Transactions on Evolutionary Computation |volume=5 |issue=4 |pages=422–428 |year=2001 |last1=Chellapilla |first1=K. |last2=Fogel |first2=D.B.}}&lt;/ref&gt; It also earned a win against the program [[Chinook (draughts player)|Chinook]] at its &quot;expert&quot; level of play.&lt;ref&gt;{{cite book |last=Fogel |first=David |date=2001 |title=Blondie24: Playing at the Edge of AI |location=San Francisco, CA |publisher=Morgan Kaufmann |isbn=978-1558607835 |author-link=David B. Fogel}}&lt;/ref&gt;<br /> <br /> === Go ===<br /> CNNs have been used in [[computer Go]]. 
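<br /> <br /> Such programs typically encode the board position as a stack of feature planes and let a CNN output a score for every point on the 19x19 board. A toy sketch of this kind of move-prediction (&quot;policy&quot;) network, assuming PyTorch; the number of planes and the layer sizes are illustrative and do not correspond to any particular program:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

planes = torch.randn(1, 8, 19, 19)            # one position, 8 made-up feature planes on a 19x19 board

policy = nn.Sequential(
    nn.Conv2d(8, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 1, kernel_size=1),          # one score per board point
    nn.Flatten(),                             # 19*19 = 361 move scores
)
move_probs = torch.softmax(policy(planes), dim=1)
print(move_probs.shape)                       # torch.Size([1, 361])
&lt;/syntaxhighlight&gt;<br /> <br /> 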
In December 2014, Clark and [[Amos Storkey|Storkey]] published a paper showing that a CNN trained by supervised learning from a database of human professional games could outperform [[GNU Go]] and win some games against [[Monte Carlo tree search]] Fuego 1.1 in a fraction of the time it took Fuego to play.&lt;ref&gt;{{cite arXiv |eprint=1412.3409 |last1=Clark |first1=Christopher |title=Teaching Deep Convolutional Neural Networks to Play Go |last2=Storkey |first2=Amos |class=cs.AI |year=2014}}&lt;/ref&gt; Later it was announced that a large 12-layer convolutional neural network had correctly predicted the professional move in 55% of positions, equalling the accuracy of a [[Go ranks and ratings|6 dan]] human player. When the trained convolutional network was used directly to play games of Go, without any search, it beat the traditional search program [[GNU Go]] in 97% of games, and matched the performance of the [[Monte Carlo tree search]] program Fuego simulating ten thousand playouts (about a million positions) per move.&lt;ref&gt;{{cite arXiv |eprint=1412.6564 |last1=Maddison |first1=Chris J. |title=Move Evaluation in Go Using Deep Convolutional Neural Networks |last2=Huang |first2=Aja |last3=Sutskever |first3=Ilya |last4=Silver |first4=David |class=cs.LG |year=2014}}&lt;/ref&gt;<br /> <br /> A couple of CNNs for choosing moves to try (&quot;policy network&quot;) and evaluating positions (&quot;value network&quot;) driving MCTS were used by [[AlphaGo]], the first to beat the best human player at the time.&lt;ref&gt;{{cite web |url=https://www.deepmind.com/alpha-go.html |title=AlphaGo – Google DeepMind |access-date=30 January 2016 |archive-url=https://web.archive.org/web/20160130230207/http://www.deepmind.com/alpha-go.html |archive-date=30 January 2016 |url-status=dead}}&lt;/ref&gt;<br /> <br /> === Time series forecasting ===<br /> Recurrent neural networks are generally considered the best neural network architectures for time series forecasting (and sequence modeling in general), but recent studies show that convolutional networks can perform comparably or even better.&lt;ref&gt;{{cite arXiv |last1=Bai |first1=Shaojie |last2=Kolter |first2=J. Zico |last3=Koltun |first3=Vladlen |title=An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling |date=2018-04-19 |eprint=1803.01271 |class=cs.LG}}&lt;/ref&gt;&lt;ref name=&quot;Tsantekidis 7–12&quot;/&gt; Dilated convolutions&lt;ref&gt;{{cite arXiv |last1=Yu |first1=Fisher |last2=Koltun |first2=Vladlen |title=Multi-Scale Context Aggregation by Dilated Convolutions |date=2016-04-30 |eprint=1511.07122 |class=cs.CV}}&lt;/ref&gt; might enable one-dimensional convolutional neural networks to effectively learn time series dependences.&lt;ref&gt;{{cite arXiv |last1=Borovykh |first1=Anastasia |last2=Bohte |first2=Sander |last3=Oosterlee |first3=Cornelis W. 
|title=Conditional Time Series Forecasting with Convolutional Neural Networks |date=2018-09-17 |eprint=1703.04691 |class=stat.ML}}&lt;/ref&gt; Convolutions can be implemented more efficiently than RNN-based solutions, and they do not suffer from vanishing (or exploding) gradients.&lt;ref&gt;{{cite arXiv |last=Mittelman |first=Roni |title=Time-series modeling with undecimated fully convolutional neural networks |date=2015-08-03 |eprint=1508.00317 |class=stat.ML}}&lt;/ref&gt; Convolutional networks can provide an improved forecasting performance when there are multiple similar time series to learn from.&lt;ref&gt;{{cite arXiv |last1=Chen |first1=Yitian |last2=Kang |first2=Yanfei |last3=Chen |first3=Yixiong |last4=Wang |first4=Zizhuo |title=Probabilistic Forecasting with Temporal Convolutional Neural Network |date=2019-06-11 |eprint=1906.04397 |class=stat.ML}}&lt;/ref&gt; CNNs can also be applied to further tasks in time series analysis (e.g., time series classification&lt;ref&gt;{{cite journal |last1=Zhao |first1=Bendong |last2=Lu |first2=Huanzhang |last3=Chen |first3=Shangfeng |last4=Liu |first4=Junliang |last5=Wu |first5=Dongya |date=2017-02-01 |title=Convolutional neural networks for time series classification |journal=Journal of Systems Engineering and Electronics |volume=28 |issue=1 |pages=162–169 |doi=10.21629/JSEE.2017.01.18}}&lt;/ref&gt; or quantile forecasting&lt;ref&gt;{{cite arXiv |last=Petneházi |first=Gábor |title=QCNN: Quantile Convolutional Neural Network |date=2019-08-21 |eprint=1908.07978 |class=cs.LG}}&lt;/ref&gt;).<br /> <br /> === Cultural Heritage and 3D-datasets ===<br /> As archaeological findings such as [[clay tablet]]s with [[cuneiform|cuneiform writing]] are increasingly acquired using [[3D scanner]]s, benchmark datasets are becoming available, including ''HeiCuBeDa'',&lt;ref name=&quot;HeiCuBeDa_Hilprecht&quot;/&gt; which provides almost 2,000 normalized 2D and 3D datasets prepared with the [[GigaMesh Software Framework]].&lt;ref name=&quot;ICDAR19&quot;/&gt; [[Curvature]]-based measures are used in conjunction with geometric neural networks (GNNs), e.g. for period classification of these clay tablets, which are among the oldest documents of human history.&lt;ref name=&quot;ICFHR20&quot;/&gt;&lt;ref name=&quot;ICFHR20_Presentation&quot;/&gt;<br /> <br /> == Fine-tuning ==<br /> For many applications, little training data is available. Convolutional neural networks usually require a large amount of training data in order to avoid [[overfitting]]. A common technique is to train the network on a larger data set from a related domain. Once the network parameters have converged, an additional training step is performed using the in-domain data to fine-tune the network weights; this is known as [[transfer learning]]. Furthermore, this technique allows convolutional network architectures to successfully be applied to problems with tiny training sets.&lt;ref&gt;Durjoy Sen Maitra; Ujjwal Bhattacharya; S.K. Parui, [https://ieeexplore.ieee.org/document/7333916 &quot;CNN based common approach to handwritten character recognition of multiple scripts&quot;] {{Webarchive|url=https://web.archive.org/web/20231016190918/https://ieeexplore.ieee.org/document/7333916 |date=2023-10-16 }}, in Document Analysis and Recognition (ICDAR), 2015 13th International Conference on, vol., no., pp.1021–1025, 23–26 Aug. 2015&lt;/ref&gt;
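<br /> <br /> A minimal sketch of this procedure, assuming torchvision's ResNet-18 as the pretrained network and a hypothetical ten-class in-domain task:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn
from torchvision import models

# Start from a network pretrained on a large related dataset (here ImageNet).
backbone = models.resnet18(weights="IMAGENET1K_V1")   # torchvision >= 0.13; older versions use pretrained=True

# Freeze the converged parameters ...
for param in backbone.parameters():
    param.requires_grad = False

# ... and replace the final classifier so it matches the small in-domain task (10 classes here).
backbone.fc = nn.Linear(backbone.fc.in_features, 10)

# Only the new layer is updated during the additional fine-tuning step.
optimizer = torch.optim.SGD(backbone.fc.parameters(), lr=0.01)
&lt;/syntaxhighlight&gt;<br /> In practice, the whole network is sometimes unfrozen afterwards and trained further with a small learning rate.<br /> <br /> 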
== Human interpretable explanations ==<br /> End-to-end training and prediction are common practice in [[computer vision]]. However, human interpretable explanations are required for [[safety-critical system|critical systems]] such as [[self-driving car]]s.&lt;ref name=&quot;Interpretable ML Symposium 2017&quot;&gt;{{cite web |title=NIPS 2017 |website=Interpretable ML Symposium |date=2017-10-20 |url=http://interpretable.ml/ |access-date=2018-09-12 |archive-date=2019-09-07 |archive-url=https://web.archive.org/web/20190907063237/http://interpretable.ml/ |url-status=dead }}&lt;/ref&gt; With recent advances in [[salience (neuroscience)|visual salience]], [[visual spatial attention|spatial attention]], and [[visual temporal attention|temporal attention]], the most critical spatial regions/temporal instants can be visualized to justify the CNN predictions.&lt;ref name=&quot;Zang Wang Liu Zhang 2018 pp. 97–108&quot;&gt;{{cite book |last1=Zang |first1=Jinliang |last2=Wang |first2=Le |last3=Liu |first3=Ziyi |last4=Zhang |first4=Qilin |last5=Hua |first5=Gang |last6=Zheng |first6=Nanning |title=Artificial Intelligence Applications and Innovations |series=IFIP Advances in Information and Communication Technology |volume=519 |chapter=Attention-Based Temporal Weighted Convolutional Neural Network for Action Recognition |publisher=Springer International Publishing |location=Cham |year=2018 |isbn=978-3-319-92006-1 |issn=1868-4238 |doi=10.1007/978-3-319-92007-8_9 |pages=97–108 |arxiv=1803.07179 |s2cid=4058889}}&lt;/ref&gt;&lt;ref name=&quot;Wang Zang Zhang Niu p=1979&quot;&gt;{{cite journal |last1=Wang |first1=Le |last2=Zang |first2=Jinliang |last3=Zhang |first3=Qilin |last4=Niu |first4=Zhenxing |last5=Hua |first5=Gang |last6=Zheng |first6=Nanning |title=Action Recognition by an Attention-Aware Temporal Weighted Convolutional Neural Network |journal=Sensors |volume=18 |issue=7 |date=2018-06-21 |issn=1424-8220 |doi=10.3390/s18071979 |pmid=29933555 |pmc=6069475 |page=1979 |bibcode=2018Senso..18.1979W |url=https://qilin-zhang.github.io/_pages/pdfs/sensors-18-01979-Action_Recognition_by_an_Attention-Aware_Temporal_Weighted_Convolutional_Neural_Network.pdf |doi-access=free |access-date=2018-09-14 |archive-date=2018-09-13 |archive-url=https://web.archive.org/web/20180913040055/https://qilin-zhang.github.io/_pages/pdfs/sensors-18-01979-Action_Recognition_by_an_Attention-Aware_Temporal_Weighted_Convolutional_Neural_Network.pdf |url-status=live }}&lt;/ref&gt;<br /> <br /> == Related architectures ==<br /> <br /> === Deep Q-networks ===<br /> A deep Q-network (DQN) is a type of deep learning model that combines a deep neural network with [[Q-learning]], a form of [[reinforcement learning]]. Unlike earlier reinforcement learning agents, DQNs that utilize CNNs can learn directly from high-dimensional sensory inputs via reinforcement learning.&lt;ref name=&quot;Ong Chavez Hong 2015&quot;&gt;{{cite arXiv |last1=Ong |first1=Hao Yi |last2=Chavez |first2=Kevin |last3=Hong |first3=Augustus |title=Distributed Deep Q-Learning |date=2015-08-18 |class=cs.LG |eprint=1508.04186v2}}&lt;/ref&gt;<br /> <br /> Preliminary results were presented in 2014, with an accompanying paper in February 2015.&lt;ref name=&quot;DQN&quot;&gt;{{cite journal |last1=Mnih |first1=Volodymyr |display-authors=etal |date=2015 |title=Human-level control through deep reinforcement learning |journal=Nature |volume=518 |issue=7540 |pages=529–533 |doi=10.1038/nature14236 |pmid=25719670 |bibcode=2015Natur.518..529M |s2cid=205242740}}&lt;/ref&gt; The research described an application to [[Atari 2600]] gaming. 
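<br /> <br /> In such an agent the CNN maps a stack of recent screen frames to one estimated action value (Q-value) per possible action. A toy sketch of this mapping, assuming PyTorch; the preprocessing and layer sizes are illustrative rather than those of the published architecture:<br /> 
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
import torch
from torch import nn

frames = torch.randn(1, 4, 84, 84)            # the last 4 grayscale frames, resized to 84x84 (made-up preprocessing)
num_actions = 6                               # e.g. a controller with 6 discrete actions

q_network = nn.Sequential(
    nn.Conv2d(4, 32, kernel_size=8, stride=4), nn.ReLU(),   # 84 -> 20
    nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),  # 20 -> 9
    nn.Flatten(),
    nn.Linear(64 * 9 * 9, 256), nn.ReLU(),
    nn.Linear(256, num_actions),              # one Q-value per action
)
q_values = q_network(frames)
best_action = q_values.argmax(dim=1)          # greedy action selection
print(q_values.shape, best_action.shape)      # torch.Size([1, 6]) torch.Size([1])
&lt;/syntaxhighlight&gt;<br /> <br /> 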
Other deep reinforcement learning models preceded it.&lt;ref&gt;{{cite journal |last1=Sun |first1=R. |last2=Sessions |first2=C. |date=June 2000 |title=Self-segmentation of sequences: automatic formation of hierarchies of sequential behaviors |journal=IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics |volume=30 |issue=3 |pages=403–418 |doi=10.1109/3477.846230 |pmid=18252373 |issn=1083-4419 |citeseerx=10.1.1.11.226}}&lt;/ref&gt;<br /> <br /> === Deep belief networks ===<br /> {{Main|Deep belief network}}<br /> [[Convolutional deep belief network|Convolutional deep belief networks]] (CDBN) have structure very similar to convolutional neural networks and are trained similarly to deep belief networks. Therefore, they exploit the 2D structure of images, like CNNs do, and make use of pre-training like [[deep belief network]]s. They provide a generic structure that can be used in many image and signal processing tasks. Benchmark results on standard image datasets like CIFAR&lt;ref name=&quot;CDBN-CIFAR&quot;&gt;{{Cite web|url=http://www.cs.toronto.edu/~kriz/conv-cifar10-aug2010.pdf|title=Convolutional Deep Belief Networks on CIFAR-10|access-date=2017-08-18|archive-date=2017-08-30|archive-url=https://web.archive.org/web/20170830060223/http://www.cs.toronto.edu/~kriz/conv-cifar10-aug2010.pdf|url-status=live}}&lt;/ref&gt; have been obtained using CDBNs.&lt;ref name=&quot;CDBN&quot;&gt;{{cite book |last1=Lee |first1=Honglak |last2=Grosse |first2=Roger |last3=Ranganath |first3=Rajesh |last4=Ng |first4=Andrew Y. |title=Proceedings of the 26th Annual International Conference on Machine Learning |chapter=Convolutional deep belief networks for scalable unsupervised learning of hierarchical representations |date=1 January 2009 |publisher=ACM |pages=609–616 |doi=10.1145/1553374.1553453 |isbn=9781605585161 |citeseerx=10.1.1.149.6800 |s2cid=12008458}}&lt;/ref&gt;<br /> <br /> == Notable libraries ==<br /> *[[Caffe (software)|Caffe]]: A library for convolutional neural networks. Created by the Berkeley Vision and Learning Center (BVLC). It supports both CPU and GPU. Developed in [[C++]], and has [[Python (programming language)|Python]] and [[MATLAB]] wrappers.<br /> *[[Deeplearning4j]]: Deep learning in [[Java (programming language)|Java]] and [[Scala (programming language)|Scala]] on multi-GPU-enabled [[Apache Spark|Spark]]. A general-purpose deep learning library for the JVM production stack running on a C++ scientific computing engine. Allows the creation of custom layers. Integrates with Hadoop and Kafka.<br /> *[[Dlib]]: A toolkit for making real world machine learning and data analysis applications in C++.<br /> *[[Microsoft Cognitive Toolkit]]: A deep learning toolkit written by Microsoft with several unique features enhancing scalability over multiple nodes. 
It supports full-fledged interfaces for training in C++ and Python, with additional support for model inference in [[C Sharp (programming language)|C#]] and Java.<br /> *[[TensorFlow]]: [[Apache License#Version 2.0|Apache 2.0]]-licensed Theano-like library with support for CPU, GPU, Google's proprietary [[tensor processing unit]] (TPU),&lt;ref&gt;{{cite news |url=https://www.wired.com/2016/05/google-tpu-custom-chips/ |title=Google Built Its Very Own Chips to Power Its AI Bots |author=Cade Metz |date=May 18, 2016 |newspaper=Wired |access-date=March 6, 2017 |archive-date=January 13, 2018 |archive-url=https://web.archive.org/web/20180113150305/https://www.wired.com/2016/05/google-tpu-custom-chips/ |url-status=live }}&lt;/ref&gt; and mobile devices.<br /> *[[Theano (software)|Theano]]: The reference deep-learning library for Python with an API largely compatible with the popular [[NumPy]] library. Allows users to write symbolic mathematical expressions, then automatically generates their derivatives, saving the user from having to code gradients or backpropagation. These symbolic expressions are automatically compiled to [[CUDA]] code for a fast, [[Compute kernel|on-the-GPU]] implementation.<br /> *[[Torch (machine learning)|Torch]]: A [[scientific computing]] framework with wide support for machine learning algorithms, written in [[C (programming language)|C]] and [[Lua (programming language)|Lua]].<br /> <br /> == See also ==<br /> * [[Attention (machine learning)]]<br /> * [[Convolution]]<br /> * [[Deep learning]]<br /> * [[Natural-language processing]]<br /> * [[Neocognitron]]<br /> * [[Scale-invariant feature transform]]<br /> * [[Time delay neural network]]<br /> * [[Vision processing unit]]<br /> <br /> == Notes ==<br /> {{Reflist|group=nb}}<br /> <br /> == References ==<br /> {{reflist|30em|refs=<br /> &lt;ref name=&quot;ICDAR19&quot;&gt;<br /> {{citation |surname1=Hubert Mara and Bartosz Bogacz |periodical=Proceedings of the 15th International Conference on Document Analysis and Recognition (ICDAR) |title=Breaking the Code on Broken Tablets: The Learning Challenge for Annotated Cuneiform Script in Normalized 2D and 3D Datasets |location=Sydney, Australien |date=2019 |pages=148–153 |language=de |doi=10.1109/ICDAR.2019.00032 |isbn=978-1-7281-3014-9 |s2cid=211026941}}<br /> &lt;/ref&gt;<br /> &lt;ref name=&quot;HeiCuBeDa_Hilprecht&quot;&gt;<br /> {{citation |surname1=[[Hubert Mara]] |title=HeiCuBeDa Hilprecht – Heidelberg Cuneiform Benchmark Dataset for the Hilprecht Collection |publisher=heiDATA – institutional repository for research data of Heidelberg University |date=2019-06-07 |language=de |doi=10.11588/data/IE8CCN}}<br /> &lt;/ref&gt;&lt;ref name=&quot;ICFHR20&quot;&gt;<br /> {{citation<br /> |last1=Bogacz|first1=Bartosz<br /> |last2=Mara|first2=Hubert<br /> |periodical=Proceedings of the 17th International Conference on Frontiers of Handwriting Recognition (ICFHR)<br /> |title=Period Classification of 3D Cuneiform Tablets with Geometric Neural Networks<br /> |location=Dortmund, Germany<br /> |date=2020<br /> }}&lt;/ref&gt;<br /> &lt;ref name=&quot;ICFHR20_Presentation&quot;&gt;{{YouTube<br /> |id=-iFntE51HRw<br /> |title=Presentation of the ICFHR paper on Period Classification of 3D Cuneiform Tablets with Geometric Neural Networks<br /> }}&lt;/ref&gt;<br /> }}<br /> <br /> == External links ==<br /> * [https://cs231n.github.io/ CS231n: Convolutional Neural Networks for Visual Recognition] — [[Andrej Karpathy]]'s [[Stanford University|Stanford]] computer science course on CNNs in 
computer vision<br /> * [https://ujjwalkarn.me/2016/08/11/intuitive-explanation-convnets/ An Intuitive Explanation of Convolutional Neural Networks] — A beginner-level introduction to what Convolutional Neural Networks are and how they work<br /> * [https://www.completegate.com/2017022864/blog/deep-machine-learning-images-lenet-alexnet-cnn/all-pages Convolutional Neural Networks for Image Classification] {{Webarchive|url=https://web.archive.org/web/20180121184329/https://www.completegate.com/2017022864/blog/deep-machine-learning-images-lenet-alexnet-cnn/all-pages |date=2018-01-21 }} — Literature Survey<br /> {{Differentiable computing}}<br /> {{Authority control}}<br /> <br /> [[Category:Neural network architectures]]<br /> [[Category:Computer vision]]<br /> [[Category:Computational neuroscience]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Fanaroff%E2%80%93Riley_classification&diff=1181039994 Fanaroff–Riley classification 2023-10-20T13:38:25Z <p>205.189.94.9: </p> <hr /> <div>[[File:Emmaalexander fr blue.png|thumb|300px|right|FR1 (top) vs FR2 (bottom)]]<br /> The '''Fanaroff–Riley classification''' is a scheme created by [[Bernie Fanaroff|B.L. Fanaroff]] and [[Julia Riley|J.M. Riley]] in 1974,&lt;ref name=Fanaroff74&gt;{{cite journal |author=Fanaroff, Bernard L., Riley Julia M. |date=May 1974 |title= The morphology of extragalactic radio sources of high and low luminosity |journal=[[Monthly Notices of the Royal Astronomical Society]] |volume=167 |pages=31P–36P|bibcode=1974MNRAS.167P..31F |last2=Riley |doi=10.1093/mnras/167.1.31p|doi-access= }}&lt;/ref&gt; which is used to distinguish [[Radio galaxy|radio galaxies]] with [[Active galactic nucleus|active nuclei]] based on their [[luminosity|radio luminosity]] or the brightness of their [[Radio waves|radio emissions]] in relation to their hosting environment. Fanaroff and Riley noticed that the relative positions of the high- and low-surface-brightness regions in the lobes of extragalactic radio sources are correlated with their radio luminosity. Their conclusion was based on a set of 57 radio galaxies and quasars that were clearly resolved at 1.4&amp;nbsp;GHz or 5&amp;nbsp;GHz into two or more components. Fanaroff and Riley divided this sample into two classes using the ratio of the distance between the regions of highest surface brightness on opposite sides of the central galaxy or quasar to the total extent of the source up to the lowest brightness contour. ''Class I'' (abbreviated FR-I) sources are those whose luminosity decreases as the distance from the central galaxy or quasar host increases, while ''Class II'' (FR-II) sources exhibit increasing luminosity in the lobes. This distinction is important because it presents a direct link between the galaxy's luminosity and the way in which energy is transported from the central region and converted to radio emission in the outer parts.&lt;ref name=&quot;Tsinganos&quot;&gt;{{cite book|last1=Tsinganos|first1=Kanaris C.|last2=T. 
Thomas P.|first2=Ray|last3=Stute|first3=Matthias|title=Protostellar Jets in Context|volume=13|url=https://books.google.com/books?id=OvInhhp88HMC|access-date=2013-01-11|series=Astrophysics and Space Science Proceedings|date=2009|publisher=Springer|isbn=9783642005763|page=276|bibcode=2009ASSP...13.....T|doi=10.1007/978-3-642-00576-3}}&lt;/ref&gt;&lt;ref&gt;{{cite web|url=http://ned.ipac.caltech.edu/level5/Glossary/Essay_fanaroff.html|title=Fanaroff-Riley Classification|work=NASA/IPAC Extragalactic Database (NED)|publisher=Caltech|access-date=11 January 2013}}&lt;/ref&gt;<br /> <br /> [[Convolutional neural network]]s (CNNs) have also been applied to the morphological classification of radio galaxies, learning to assign FR classes directly from large collections of radio survey images rather than by visual inspection of each source (a minimal, illustrative sketch of such a classifier is given below, after the class descriptions).<br /> <br /> == Fanaroff-Riley Class I (FR-I) ==<br /> These sources are brighter towards their central galaxy or quasar and become fainter toward the outer extremities of the lobes (also called ''edge-darkened''). The spectra here are steepest, indicating that the radiating particles have aged the most. Jets are detected in a large majority of FR-I galaxies, and these hosts also tend to be bright, large galaxies often located in rich clusters with extreme X-ray emitting gas. As the galaxy moves through the cluster, the gas can sweep back and distort the radio structure through [[ram pressure]].<br /> <br /> == Fanaroff-Riley Class II (FR-II) ==<br /> Sources of this class are also known as ''edge-brightened'': they are more luminous than their FR-I counterparts, with bright hotspots at the ends of their lobes. The jets are often one-sided due to [[relativistic beaming]]. 
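As a minimal illustration of the CNN-based classification mentioned above, the sketch below defines a small convolutional classifier that maps a single-channel radio image cutout to two class scores (FR-I versus FR-II). It is only a hedged example under assumed choices: the cutout size, layer widths and two-class labelling are hypothetical, and it does not reproduce any published pipeline.
&lt;syntaxhighlight lang=&quot;python&quot;&gt;
# Illustrative sketch only: a small CNN that scores a radio-source cutout as FR-I or FR-II.
# Image size, layer widths and labels are hypothetical choices, not a published pipeline.
import torch
import torch.nn as nn

class FRClassifier(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2),   # local brightness patterns
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # larger-scale lobe structure
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),                      # fixed 4x4 summary grid
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))

model = FRClassifier()
cutouts = torch.randn(8, 1, 150, 150)  # stand-in batch of single-channel image cutouts
logits = model(cutouts)                # shape (8, 2): one score per class
print(logits.shape)
&lt;/syntaxhighlight&gt;
In practice such a network would be trained with a cross-entropy loss on labelled survey cutouts; the snippet only makes the idea concrete.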
<br /> <br /> ==See also==<br /> * [[:Category:3C objects]]<br /> * [[:Category:4C objects]]<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> {{DEFAULTSORT:Fanaroff-Riley classification}}<br /> [[Category:Astronomical classification systems]]<br /> [[Category:Galaxies]]<br /> [[Category:Radio astronomy]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=NGC_1316&diff=1181039101 NGC 1316 2023-10-20T13:29:46Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|Lenticular radio galaxy in the constellation Fornax}}<br /> {{Infobox Galaxy<br /> | name = Fornax A<br /> | image = Ngc1316 hst.jpg<br /> | image_size = 350px<br /> | caption = A [[Hubble Space Telescope]] (HST) image of NGC 1316.<br /> | credit= <br /> | epoch = [[J2000]]<br /> | type = (R')SAB(s)0&lt;sup&gt;0&lt;/sup&gt;&lt;ref name=&quot;ned&quot; /&gt;<br /> | ra = {{RA|03|22|41.7}}&lt;ref name=&quot;ned&quot; /&gt;<br /> | dec = {{DEC|-37|12|30}}&lt;ref name=&quot;ned&quot; /&gt;<br /> | dist_ly = [[1 E23 m|62.0 ± 2.9]] [[light-year|Mly]] (19.0 ± 0.9 [[parsec|Mpc]])&lt;ref name=&quot;jensenetal2003&quot; /&gt;&lt;ref name=&quot;Feldmeier06&quot; /&gt;{{Ref_label|A|a|none}}<br /> | z = 1760 ± 10 km/[[second|s]]&lt;ref name=&quot;ned&quot; /&gt;<br /> | appmag_v = 9.4&lt;ref name=&quot;ned&quot; /&gt;<br /> | size_v = 12′.0 × 8′.5&lt;ref name=&quot;ned&quot; /&gt;<br /> | constellation name = [[Fornax]]<br /> | notes = Very bright at radio 1.4 GHz<br /> | names = Fornax A, 3FHL J0322.6-3712e, WMAP J0322-3711, AM 0320-372, GSC 07026-00055, WMAP J0322-3712, APG 154, IRAS 03208-3723, PKS 0320-374, [CAC2009] S0373 CTA 23, IRAS F03207-3723, PKS 0320-37, [CHM2007] HDC 234 J032241.78-3712295, DUGRS 357-001, 1Jy 0320-374, PMN J0321-3658, [CHM2007] LDC 249 J032241.78-3712295, 1E 0320.7-3722, 1Jy 0320-373, PSCz Q03208-3723, [FWB89] Galaxy 479, 2E 761, 1Jy 0320-37, RR95 75a, [KFM98] 5, 2E 0320.7-3723, LEDA 12651, RX J032242-37125, [LB2005] NGC 1316 X1, ESO 357-22, 2MASX J03224178-3712295, RX J0322.7-3712, [VDD93] 28, ESO-LV 357-0220, MCG-06-08-005, 1RXS J032241.8-371239, [WCO2009] J032152-370824, FCC 21, MOST 0320-373, SGC 032047-3723.2, 2FGL J0322.4-3717, MRC 0320-373, VSOP J0322-3712, 3FGL J0322.5-3721, MSH 03-3-01, WMAP 138&lt;ref&gt;{{Cite web|url=http://simbad.u-strasbg.fr/simbad/sim-id?Ident=ngc+1316&amp;NbIdent=1&amp;Radius=2&amp;Radius.unit=arcmin&amp;submit=submit+id|title = NGC 1316}}&lt;/ref&gt;<br /> }}<br /> <br /> '''NGC 1316''' (also known as '''''Fornax A''''') is a [[lenticular galaxy]] about 60 million [[light-year]]s (18.4 million [[parsecs]]) away in the [[constellation]] [[Fornax]]. It is a [[radio galaxy]] and at 1400 [[MHz]] is the fourth-brightest [[Radio wave|radio]] source in the sky.&lt;ref name=&quot;schweizer80&quot; /&gt;<br /> <br /> ==Structure and formation==<br /> In the late 1970s, François Schweizer studied NGC 1316 extensively and found that the galaxy appeared to look like a small [[elliptical galaxy]] with some unusual [[Cosmic dust|dust]] lanes embedded within a much larger [[Common envelope|envelope of stars]]. The outer envelope contained many ripples, loops, and arcs. He also identified the presence of a compact disk of gas near the [[Galactic Center|center]] that appeared inclined relative to the stars and that appeared to rotate faster than the stars (the run of the mass-to-light ratio in the center of NGC 1316 resembles that of many other giant ellipticals).&lt;ref&gt;{{Cite journal|last1=Shaya|first1=E. J.|last2=Dowling|first2=D. 
M.|last3=Currie|first3=D. G.|last4=Faber|first4=S. M.|last5=Ajhar|first5=E. A.|last6=Lauer|first6=T. R.|last7=Groth|first7=E. J.|last8=Grillmair|first8=C. J.|last9=Lynd|first9=R.|last10=O'Neil|first10=E. J., Jr.|date=1996-06-01|title=Hubble Space Telescope Planetary Camera Images of NGC 1316 (Fornax A)|url=http://adsabs.harvard.edu/abs/1996AJ....111.2212S|journal=The Astronomical Journal|volume=111|pages=2212|doi=10.1086/117955|issn=0004-6256|arxiv=astro-ph/9603056|bibcode=1996AJ....111.2212S|s2cid=119337718}}&lt;/ref&gt; Based on these results, Schweizer considered that NGC 1316 was built up through the merger of several smaller galaxies. Such merger events may have fueled the central [[supermassive black hole]], which has a mass estimated at 130–150 million [[solar mass]]es,&lt;ref name=&quot;Nowak2008&quot; /&gt; with gas, causing the galaxy to become a [[radio galaxy]]. He also stated that NGC 1316 is comparable to the giant elliptical galaxies found in the centers of other [[cluster of galaxies|clusters of galaxies]].&lt;ref name=&quot;schweizer80&quot; /&gt; Spectroscopy of its brightest [[globular cluster]]s suggests that the merger occurred ~3 billion years ago.&lt;ref name=&quot;Goudfrooij2001&quot; /&gt; <br /> NGC 1316 spans about 50 000 light-years.&lt;ref name=&quot;APOD Jan2021&quot; /&gt;<br /> <br /> It has also been proposed that NGC 1316 may be a galaxy in evolution that eventually will become a [[Sombrero galaxy|Sombrero]]-like system dominated by a large [[Galactic bulge|bulge]].&lt;ref name=&quot;Freeman2012&quot; /&gt;<br /> <br /> ==Companions and environment==<br /> <br /> [[File:ESO Fornax Galaxy Cluster.jpg|thumb|left|The Fornax galaxy cluster with NGC 1316 (large, near middle)]]<br /> <br /> NGC 1316 is located within the [[Fornax Cluster]], a [[cluster of galaxies]] in the constellation [[Fornax]]. However, in contrast to [[Messier 87]], which is a similar [[elliptical galaxy]] that is located in the center of the [[Virgo Cluster]], NGC 1316 is located at the edge of the Fornax Cluster.&lt;ref name=&quot;ferguson89&quot; /&gt;<br /> <br /> NGC 1316 appears to be interacting with [[NGC 1317]], a small spiral galaxy to the north. However, that small spiral galaxy does not appear to be large enough to cause the distortions seen in the structure of this galaxy.&lt;ref name=&quot;schweizer80&quot; /&gt;<br /> <br /> NGC 1316 has hosted four [[Supernova|supernovae]] (all [[Supernova#Type I|type Ia]]): 1980N, 1981D, 2006dd and 2006mr.&lt;ref name=&quot;Feldmeier06&quot; /&gt;&lt;ref name=&quot;Rochester Supernova&quot; /&gt;<br /> <br /> ==Distance estimates==<br /> <br /> At least two methods have been used to estimate the distance to NGC 1316: [[surface brightness fluctuation]] (SBF) in 2003&lt;ref name=&quot;jensenetal2003&quot; /&gt; and [[planetary nebula luminosity function]] (PNLF) in 2006.&lt;ref name=&quot;Feldmeier06&quot; /&gt; Because it is a lenticular galaxy, the [[cepheid variable]] method is not suitable for it{{why|date=December 2022}}. Using SBF, a distance estimate of 20.0 ± 1.6 [[parsec|Mpc]]&lt;ref name=&quot;jensenetal2003&quot; /&gt; was computed. 
Using PNLF, 45 [[planetary nebula]] candidates were located and a distance estimate of 17.9 {{±|0.8|0.9}} Mpc was computed.&lt;ref name=&quot;Feldmeier06&quot; /&gt; Averaged together, these two distance measurements give a combined distance estimate of 62.0 ± 2.9 Mly (19.0 ± 0.9 Mpc).{{Ref_label|A|a|none}}<br /> <br /> ==See also==<br /> * [[Centaurus A]] <br /> * [[Messier 87]]<br /> * [[NGC 1097]]<br /> <br /> ==Notes==<br /> {{Refbegin}}<br /> &lt;ol type=&quot;a&quot;&gt;<br /> &lt;li&gt;{{Note_label|A|a|none}}average(20.0 ± 1.6, 17.9 {{±|0.8|0.9}}) = ((20.0 + 17.9) / 2) ± ((1.6&lt;sup&gt;2&lt;/sup&gt; + 0.8&lt;sup&gt;2&lt;/sup&gt;)&lt;sup&gt;0.5&lt;/sup&gt; / 2) = 19.0 ± 0.9&lt;/li&gt;<br /> &lt;/ol&gt;<br /> {{Refend}}<br /> <br /> == Citations ==<br /> <br /> {{reflist|30em|refs=<br /> &lt;ref name=&quot;ned&quot;&gt;{{cite web<br /> | title=NASA/IPAC Extragalactic Database<br /> | work=Results for NGC 1316<br /> | url=http://nedwww.ipac.caltech.edu/<br /> | access-date=2006-07-10 }}&lt;/ref&gt;<br /> &lt;ref name=&quot;schweizer80&quot;&gt;{{cite journal<br /> | author=F. Schweizer<br /> | title=An Optical Study of the Giant Radio Galaxy NGC 1316 (Fornax A)<br /> | journal=Astrophysical Journal<br /> | date=1980<br /> | volume=237<br /> | pages=303–318<br /> | bibcode=1980ApJ...237..303S<br /> | doi=10.1086/157870}}&lt;/ref&gt;<br /> &lt;ref name=&quot;Nowak2008&quot;&gt;{{cite journal<br /> | author=NOWAK N. |display-authors=4 | author2=SAGLIA R.P. | author3=THOMAS J. | author4=BENDER R. | author5=DAVIES R.I. | author6=GEBHARDT K.<br /> | title=The supermassive black hole of Fornax A.<br /> | journal=[[Monthly Notices of the Royal Astronomical Society]]<br /> | date=2008<br /> | volume=391<br /> | pages=1629–1649<br /> | bibcode = 2008MNRAS.391.1629N<br /> | doi=10.1111/j.1365-2966.2008.13960.x|arxiv = 0809.0696 |s2cid=16677289 }}&lt;/ref&gt;<br /> &lt;ref name=&quot;Goudfrooij2001&quot;&gt;{{cite journal<br /> | author=Goudfrooij, Paul | author2=Alonso, M. Victoria | author3=Maraston, Claudia | author4=Minniti, Dante<br /> | title=The star cluster system of the 3-Gyr-old merger remnant NGC 1316: clues from optical and near-infrared photometry<br /> | journal=Monthly Notices of the Royal Astronomical Society<br /> |date=November 2001<br /> | volume=328<br /> | issue=1<br /> | pages=237–256<br /> | bibcode=2001MNRAS.328..237G<br /> | doi=10.1046/j.1365-8711.2001.04860.x|arxiv = astro-ph/0107533 | s2cid=11981354 }}&lt;/ref&gt;<br /> &lt;ref name=&quot;Freeman2012&quot;&gt;{{cite journal<br /> | author= McNeil-Moylan, E. K. | author2=Freeman, K. C. | author3=Arnaboldi, M. | author4=Gerhard, O. E..<br /> | title= Planetary nebula kinematics in NGC 1316: a young Sombrero<br /> | journal= Astronomy &amp; Astrophysics<br /> | date=2012<br /> | volume=539<br /> | bibcode = 2012A&amp;A...539A..11M <br /> | doi = 10.1051/0004-6361/201117875 | pages=A11|arxiv = 1201.6010 | s2cid=54215156 }}&lt;/ref&gt;<br /> &lt;ref name=&quot;Rochester Supernova&quot;&gt;{{Cite web|url=http://www.rochesterastronomy.org/sn2006/sn2006dd.html|title=Supernova 2006dd and 2006mr in NGC 1316|website=www.rochesterastronomy.org|access-date=2020-02-03}}&lt;/ref&gt;<br /> &lt;ref name=&quot;ferguson89&quot;&gt;{{cite journal<br /> | author=H. C. Ferguson<br /> | title=Population studies in groups and clusters of galaxies. 
II - A catalog of galaxies in the central 3.5 deg of the Fornax Cluster<br /> | journal=Astronomical Journal<br /> | date=1989<br /> | volume=98<br /> | pages=367–418<br /> | bibcode=1989AJ.....98..367F<br /> | doi=10.1086/115152<br /> | doi-access=free<br /> }}&lt;/ref&gt;<br /> &lt;ref name=&quot;Feldmeier06&quot;&gt;{{cite journal<br /> | author=Feldmeier, John J. | author2=Jacoby, George H. | author3=Phillips, Mark M.<br /> | title=Calibrating Type Ia Supernovae using the Planetary Nebula Luminosity Function I. Initial Results<br /> | journal=The Astrophysical Journal<br /> |date=2007<br /> | bibcode=2007ApJ...657...76F<br /> | doi=10.1086/510897<br /> | volume=657<br /> | issue=1<br /> | pages=76–94<br /> |arxiv = astro-ph/0611231 | s2cid=17109095 }}&lt;/ref&gt;<br /> &lt;ref name=&quot;jensenetal2003&quot;&gt;{{cite journal<br /> | author=Jensen, Joseph B. |display-authors=4 | author2=Tonry, John L. | author3=Barris, Brian J. | author4=Thompson, Rodger I. | author5=Liu, Michael C. | author6=Rieke, Marcia J. | author7=Ajhar, Edward A. | author8=Blakeslee, John P.<br /> | title=Measuring Distances and Probing the Unresolved Stellar Populations of Galaxies Using Infrared Surface Brightness Fluctuations<br /> | journal=Astrophysical Journal<br /> |date=February 2003<br /> | volume=583<br /> | issue=2<br /> | pages=712–726<br /> | bibcode=2003ApJ...583..712J<br /> | doi=10.1086/345430<br /> |arxiv = astro-ph/0210129 |s2cid=551714 }}&lt;/ref&gt;<br /> <br /> &lt;!-- Not in use<br /> &lt;ref name=&quot;ESO Oct2017&quot;&gt;{{cite web|title=Revealing Galactic Secrets|url=http://www.eso.org/public/news/eso1734/|website=www.eso.org|access-date=25 October 2017}}&lt;/ref&gt;<br /> Not in use--&gt;<br /> <br /> &lt;!-- Not in use<br /> &lt;ref name=&quot;ESO April 2014&quot;&gt;{{cite news|title=Galactic Serial Killer|url=http://www.eso.org/public/news/eso1411/|access-date=7 April 2014|newspaper=ESO Press Release}}&lt;/ref&gt;<br /> Not in use--&gt;<br /> <br /> &lt;ref name=&quot;APOD Jan2021&quot;&gt;{{APOD|date=25 January 2021|title=Central NGC 1316: After Galaxies Collide}}&lt;/ref&gt;<br /> }}<br /> <br /> ==External links==<br /> {{commonscat}}<br /> * {{APOD |date=25 January 2021 |title=Central NGC 1316: After Galaxies Collide}}<br /> * {{APOD |date=22 February 1999 |title=After Galaxies Collide}}<br /> * [https://web.archive.org/web/20070927184654/http://www.spacetelescope.org/bin/images.pl?searchtype=freesearch&amp;string=+1316 NGC 1316 at ESA/Hubble]<br /> <br /> {{Sky|03|22|41.7|-|37|12|30|67000000}}<br /> {{ngc15}}<br /> {{Fornax}}<br /> <br /> {{DEFAULTSORT:NGC 1316}}<br /> [[Category:Lenticular galaxies]]<br /> [[Category:Shell galaxies]]<br /> [[Category:Fornax Cluster]]<br /> [[Category:Fornax]]<br /> [[Category:NGC objects|1316]]<br /> [[Category:Principal Galaxies Catalogue objects|12651]]<br /> [[Category:Arp objects|154]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Darren_Rumble&diff=1180788588 Darren Rumble 2023-10-18T21:04:23Z <p>205.189.94.9: /* Coaching career */ hired as assistant coach in July 2023.</p> <hr /> <div>{{Short description|Canadian ice hockey player and coach}}<br /> {{for|the Australian rules footballer|Darren Rumble (footballer)}}<br /> {{Infobox ice hockey player<br /> | name = Darren Rumble<br /> | played_for = [[Philadelphia Flyers]]&lt;br /&gt;[[Ottawa Senators]]&lt;br /&gt;[[St. 
Louis Blues]]&lt;br /&gt;[[Tampa Bay Lightning]]<br /> | position = [[Defenceman (ice hockey)|Defence]]<br /> | shoots = Left<br /> | height_ft = 6<br /> | height_in = 1<br /> | weight_lb = 210<br /> | image = Darren Rumble.jpg<br /> | image_size = 230 px<br /> | caption = Rumble with the [[Springfield Falcons]] in 2004<br /> | birth_date = {{birth date and age|1969|1|23|mf=y}}<br /> | birth_place = [[Barrie]], [[Ontario]], Canada<br /> | draft = 20th overall<br /> | draft_year = 1987<br /> | draft_team = [[Philadelphia Flyers]]<br /> | career_start = 1989<br /> | career_end = 2005<br /> }}<br /> <br /> '''Darren William Rumble''' (born January 23, 1969) is a [[Canadians|Canadian]] professional [[ice hockey]] coach and former professional ice hockey player, presently the interim head coach of the [[Ontario Hockey League]]'s [[Owen Sound Attack]]. <br /> Rumble played for the [[Philadelphia Flyers]], [[Ottawa Senators]], [[St. Louis Blues]] and [[Tampa Bay Lightning]] of the [[National Hockey League]], but played most of his career with various minor league teams. In the 2003–04 season, Rumble spent the majority of the season in the NHL but played only 5 games for Tampa Bay, spending most of the season as a healthy reserve. The [[Tampa Bay Lightning]] still had his name inscribed on the [[Stanley Cup]] even though he did not officially qualify. The following year he played a handful of games for the Lightning's AHL affiliate, the [[Springfield Falcons]], before [[Retirement|retiring]] and becoming assistant coach of the team. Rumble later became head coach of the [[Norfolk Admirals (AHL)|Norfolk Admirals]] of the [[American Hockey League]] (AHL), holding the position until January 2010.<br /> In 2013, he was an assistant coach for the [[Iceland men's national ice hockey team|Icelandic national hockey team]] at the IIHF World Championship Division II in Croatia.<br /> <br /> ==Playing career==<br /> As a youth, Rumble played in the 1982 [[Quebec International Pee-Wee Hockey Tournament]] with a [[minor ice hockey]] team from [[Barrie]].&lt;ref&gt;{{cite web|url=https://www.publicationsports.com/ressources/files/439/Joueurs_Pro.pdf|title=Pee-Wee players who have reached NHL or WHA|year=2018|website=Quebec International Pee-Wee Hockey Tournament|access-date=2019-01-18|archive-date=2019-03-06|archive-url=https://web.archive.org/web/20190306085544/https://www.publicationsports.com/ressources/files/439/Joueurs_Pro.pdf|url-status=dead}}&lt;/ref&gt;<br /> <br /> Rumble was selected 20th overall by the [[Philadelphia Flyers]] in the [[1987 NHL Entry Draft]]. Rumble turned professional with the [[Hershey Bears]] in 1989–90. He played three seasons with the Bears, managing three games with the Flyers. He was selected in the [[1992 NHL Expansion Draft]] by the [[Ottawa Senators]]. He played two seasons with Ottawa, before returning to the AHL with the [[Prince Edward Island Senators]]. In 1995, he became the property of the Flyers for the second time, and mostly played for their affiliates the Hershey Bears and the Philadelphia Phantoms. He did manage 15 games in the NHL.{{cn|date=January 2019}}<br /> <br /> In 1997, he left North America to play one season for the [[Adler Mannheim]] in the [[Deutsche Eishockey Liga]]. After that season, Rumble would spend the following seven seasons with various AHL and [[International Hockey League (1945-2001)|IHL]] teams, with occasional callups to NHL clubs [[St. 
Louis Blues]] and [[Tampa Bay Lightning]], including five games with the Lightning in the 2003–04 season for which the club won the [[Stanley Cup]]. His final playing season was with Springfield in 2004–05, becoming their assistant coach as a mid-season replacement. Over his career, Rumble played 193 NHL games, scoring 10 goals and 26 assists for 36 points.{{cn|date=January 2019}}<br /> <br /> ==Coaching career==<br /> <br /> In 2007, he joined the [[Norfolk Admirals (AHL)|Norfolk Admirals]] of the AHL as an assistant coach. In July 2008, the [[Tampa Bay Lightning]] named Rumble head coach of their [[American Hockey League|AHL]] affiliate, the [[Norfolk Admirals (AHL)|Norfolk Admirals]], replacing [[Steve Stirling]]. On January 12, 2010, he was fired by the Lightning. He subsequently became an assistant coach for the [[Seattle Thunderbirds]]. In July 2013, he was named head coach of the [[Moncton Wildcats]] (QMJHL).&lt;ref&gt;{{Cite web|url=http://moncton-wildcats.com/article/rumble-new-head-coach|title=Rumble new head coach – Moncton Wildcats}}&lt;/ref&gt;<br /> On October 18, 2023, he was named the interim head coach of the Owen Sound Attack, where he had been serving as an assistant coach since July, following the replacement of previous head coach [[Greg Walters]] on October 16. <br /> &lt;ref&gt; {{Cite web|url=http://chl.ca/ohl-attack/article/darren-rumble-named-interim-head-coach/}} &lt;/ref&gt;<br /> <br /> ==Career statistics==<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; style=&quot;text-align:center; width:60em&quot;<br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! colspan=&quot;3&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! colspan=&quot;5&quot; | [[Regular season]] <br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! colspan=&quot;5&quot; | [[Playoffs]] <br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! [[Season (sports)|Season]] <br /> ! Team <br /> ! League <br /> ! GP <br /> ! [[Goal (ice hockey)|G]] <br /> ! [[Assist (ice hockey)|A]] <br /> ! [[Point (ice hockey)|Pts]] <br /> ! [[Penalty (ice hockey)|PIM]] <br /> ! GP <br /> ! G <br /> ! A <br /> ! Pts <br /> ! 
PIM <br /> |-<br /> | 1985–86<br /> | [[Barrie Colts]]<br /> | [[Central Junior B Hockey League|CJHL]]<br /> | 46<br /> | 14<br /> | 32<br /> | 46<br /> | 91<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1986–87 OHL season|1986–87]]<br /> | [[Kitchener Rangers]]<br /> | [[Ontario Hockey League|OHL]]<br /> | 64<br /> | 11<br /> | 32<br /> | 43<br /> | 44<br /> | 4<br /> | 0<br /> | 1<br /> | 1<br /> | 9<br /> |- <br /> | [[1987–88 OHL season|1987–88]]<br /> | Kitchener Rangers<br /> | OHL<br /> | 55<br /> | 15<br /> | 50<br /> | 65<br /> | 64<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1988–89 OHL season|1988–89]]<br /> | Kitchener Rangers<br /> | OHL<br /> | 46<br /> | 11<br /> | 29<br /> | 40<br /> | 25<br /> | 5<br /> | 1<br /> | 0<br /> | 1<br /> | 2<br /> |- <br /> | [[1989–90 AHL season|1989–90]]<br /> | [[Hershey Bears]]<br /> | [[American Hockey League|AHL]]<br /> | 57<br /> | 2<br /> | 13<br /> | 15<br /> | 31<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1990–91 AHL season|1990–91]]<br /> | Hershey Bears <br /> | AHL<br /> | 73<br /> | 6<br /> | 35<br /> | 41<br /> | 48<br /> | 3<br /> | 0<br /> | 5<br /> | 5<br /> | 2<br /> |- <br /> | [[1990–91 AHL season|1990–91]]<br /> | [[Philadelphia Flyers]] <br /> | [[National Hockey League|NHL]]<br /> | 3<br /> | 1<br /> | 0<br /> | 1<br /> | 0<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1991–92 AHL season|1991–92]]<br /> | Hershey Bears <br /> | AHL<br /> | 79<br /> | 12<br /> | 54<br /> | 66<br /> | 118<br /> | 6<br /> | 0<br /> | 3<br /> | 3<br /> | 2<br /> |- <br /> | [[1992–93 AHL season|1992–93]]<br /> | [[New Haven Senators]] <br /> | AHL<br /> | 2<br /> | 1<br /> | 0<br /> | 1<br /> | 0<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1992–93 NHL season|1992–93]]<br /> | [[Ottawa Senators]]<br /> | NHL<br /> | 69<br /> | 3<br /> | 13<br /> | 16<br /> | 61<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[1993–94 AHL season|1993–94]]<br /> | [[PEI Senators]] <br /> | AHL<br /> | 3<br /> | 2<br /> | 0<br /> | 2<br /> | 0<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1993–94 NHL season|1993–94]]<br /> | Ottawa Senators<br /> | NHL<br /> | 70<br /> | 6<br /> | 9<br /> | 15<br /> | 116<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[1994–95 AHL season|1994–95]]<br /> | PEI Senators<br /> | AHL<br /> | 70<br /> | 7<br /> | 46<br /> | 53<br /> | 77<br /> | 11<br /> | 0<br /> | 6<br /> | 6<br /> | 4<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1995–96 AHL season|1995–96]]<br /> | Hershey Bears<br /> | AHL<br /> | 58 <br /> | 13 <br /> | 37 <br /> | 50<br /> | 83<br /> | 5<br /> | 0<br /> | 0<br /> | 0<br /> | 6<br /> |- <br /> | [[1995–96 NHL season|1995–96]]<br /> | Philadelphia Flyers<br /> | NHL<br /> | 5<br /> | 0<br /> | 0<br /> | 0<br /> | 4<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1996–97 AHL season|1996–97]]<br /> | [[Philadelphia Phantoms]]<br /> | AHL<br /> | 72<br /> | 18<br /> | 44<br /> | 62<br /> | 83<br /> | 7<br /> | 0<br /> | 3<br /> | 3<br /> | 19<br /> |- <br /> | [[1996–97 NHL season|1996–97]]<br /> | Philadelphia Flyers<br /> | NHL<br /> | 10<br /> | 0<br /> | 0<br /> | 0<br /> | 0<br /> | —<br /> | —<br /> | 
—<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1997–98 DEL season|1997–98]]<br /> | [[Adler Mannheim]]<br /> | [[Deutsche Eishockey Liga|DEL]]<br /> | 21<br /> | 2<br /> | 7<br /> | 9<br /> | 18<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[1997–98 IHL season|1997–98]]<br /> | [[San Antonio Dragons]]<br /> | [[International Hockey League (1945–2001)|IHL]]<br /> | 46<br /> | 7<br /> | 22<br /> | 29<br /> | 47<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1998–99 IHL season|1998–99]]<br /> | [[Grand Rapids Griffins]]<br /> | IHL<br /> | 53<br /> | 6<br /> | 22<br /> | 28<br /> | 44<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | 1998–99<br /> | [[Utah Grizzlies (1995–2005)|Utah Grizzlies]]<br /> | IHL<br /> | 10<br /> | 1<br /> | 4<br /> | 5<br /> | 10<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1999–2000 IHL season|1999–2000]]<br /> | Grand Rapids Griffins<br /> | IHL<br /> | 29<br /> | 3<br /> | 10<br /> | 13<br /> | 20<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[1999–2000 AHL season|1999–2000]]<br /> | [[Worcester IceCats]] <br /> | AHL<br /> | 39<br /> | 0<br /> | 17<br /> | 17<br /> | 31<br /> | 9<br /> | 0<br /> | 2<br /> | 2<br /> | 6<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[2000–01 AHL season|2000–01]]<br /> | Worcester IceCats <br /> | AHL<br /> | 53<br /> | 6<br /> | 24<br /> | 30<br /> | 65<br /> | 8<br /> | 0<br /> | 1<br /> | 1<br /> | 10<br /> |- <br /> | [[2000–01 NHL season|2000–01]]<br /> | [[St. Louis Blues]] <br /> | NHL<br /> | 12<br /> | 0<br /> | 4<br /> | 4<br /> | 27<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[2001–02 AHL season|2001–02]]<br /> | Worcester IceCats <br /> | AHL<br /> | 60<br /> | 3<br /> | 29<br /> | 32<br /> | 48<br /> | 3<br /> | 0<br /> | 4<br /> | 4<br /> | 2<br /> |- <br /> | [[2002–03 AHL season|2002–03]]<br /> | [[Springfield Falcons]] <br /> | AHL<br /> | 33<br /> | 5<br /> | 17<br /> | 22<br /> | 18<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[2002–03 NHL season|2002–03]]<br /> | [[Tampa Bay Lightning]] <br /> | NHL<br /> | 19<br /> | 0<br /> | 0<br /> | 0<br /> | 6<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[2003–04 AHL season|2003–04]]<br /> | Hershey Bears<br /> | AHL<br /> | 5<br /> | 2<br /> | 0<br /> | 2<br /> | 6<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[2003–04 NHL season|2003–04]]<br /> | Tampa Bay Lightning<br /> | NHL<br /> | 5<br /> | 0<br /> | 0<br /> | 0<br /> | 2<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[2004–05 AHL season|2004–05]]<br /> | Springfield Falcons<br /> | AHL<br /> | 10<br /> | 0<br /> | 1<br /> | 1<br /> | 4<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! colspan=&quot;3&quot; | AHL totals<br /> ! 614<br /> ! 77<br /> ! 317<br /> ! 394<br /> ! 612<br /> ! 52<br /> ! 0<br /> ! 24<br /> ! 24<br /> ! 51<br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! colspan=&quot;3&quot; | NHL totals<br /> ! 193<br /> ! 10<br /> ! 26<br /> ! 36<br /> ! 216<br /> ! —<br /> ! —<br /> ! —<br /> ! —<br /> ! —<br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! colspan=&quot;3&quot; | IHL totals<br /> ! 138<br /> ! 17<br /> ! 58<br /> ! 75<br /> ! 121<br /> ! —<br /> ! —<br /> ! 
—<br /> ! —<br /> ! —<br /> |}<br /> <br /> ==Awards==<br /> * 1997 - [[Eddie Shore Award]] - AHL top defenceman<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==External links==<br /> * {{Ice hockey stats}}<br /> <br /> {{s-start}}<br /> {{succession box | before = [[Kerry Huffman]] | title = [[List of Philadelphia Flyers draft picks|Philadelphia Flyers' first round draft pick]] | years = [[1987 NHL Entry Draft|1987]] | after = [[Claude Boivin]]}}<br /> {{s-end}}<br /> <br /> {{DEFAULTSORT:Rumble, Darren}}<br /> [[Category:1969 births]]<br /> [[Category:Living people]]<br /> [[Category:Adler Mannheim players]]<br /> [[Category:Barrie Colts players]]<br /> [[Category:Canadian ice hockey defencemen]]<br /> [[Category:Grand Rapids Griffins (IHL) players]]<br /> [[Category:Hershey Bears players]]<br /> [[Category:Kitchener Rangers players]]<br /> [[Category:Moncton Wildcats coaches]]<br /> [[Category:National Hockey League first-round draft picks]]<br /> [[Category:New Haven Senators players]]<br /> [[Category:Ottawa Senators players]]<br /> [[Category:Philadelphia Flyers draft picks]]<br /> [[Category:Philadelphia Flyers players]]<br /> [[Category:Philadelphia Phantoms players]]<br /> [[Category:Prince Edward Island Senators players]]<br /> [[Category:St. Louis Blues players]]<br /> [[Category:San Antonio Dragons players]]<br /> [[Category:Ice hockey people from Barrie]]<br /> [[Category:Springfield Falcons players]]<br /> [[Category:Stanley Cup champions]]<br /> [[Category:Tampa Bay Lightning players]]<br /> [[Category:Utah Grizzlies (IHL) players]]<br /> [[Category:Worcester IceCats players]]<br /> [[Category:Canadian expatriate ice hockey players in Germany]]<br /> [[Category:Canadian ice hockey coaches]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Darren_Rumble&diff=1180786911 Darren Rumble 2023-10-18T20:54:30Z <p>205.189.94.9: named Interim Head Coach Owen Sound Attack OHL. 2013-2023??</p> <hr /> <div>{{Short description|Canadian ice hockey player and coach}}<br /> {{for|the Australian rules footballer|Darren Rumble (footballer)}}<br /> {{Infobox ice hockey player<br /> | name = Darren Rumble<br /> | played_for = [[Philadelphia Flyers]]&lt;br /&gt;[[Ottawa Senators]]&lt;br /&gt;[[St. Louis Blues]]&lt;br /&gt;[[Tampa Bay Lightning]]<br /> | position = [[Defenceman (ice hockey)|Defence]]<br /> | shoots = Left<br /> | height_ft = 6<br /> | height_in = 1<br /> | weight_lb = 210<br /> | image = Darren Rumble.jpg<br /> | image_size = 230 px<br /> | caption = Rumble with the [[Springfield Falcons]] in 2004<br /> | birth_date = {{birth date and age|1969|1|23|mf=y}}<br /> | birth_place = [[Barrie]], [[Ontario]], Canada<br /> | draft = 20th overall<br /> | draft_year = 1987<br /> | draft_team = [[Philadelphia Flyers]]<br /> | career_start = 1989<br /> | career_end = 2005<br /> }}<br /> <br /> '''Darren William Rumble''' (born January 23, 1969) is a [[Canadians|Canadian]] professional [[ice hockey]] coach and former professional ice hockey player, presently the interim head coach of the [[Ontario Hockey League]]'s [[Owen Sound Attack]]. <br /> Rumble played for the [[Philadelphia Flyers]], [[Ottawa Senators]], [[St. Louis Blues]] and [[Tampa Bay Lightning]] of the [[National Hockey League]], but played most of his career with various minor league teams. In the 2003–04 season, Rumble spent the majority of the season in the NHL but played only 5 games for Tampa Bay, spending most of the season as a healthy reserve. 
The [[Tampa Bay Lightning]] still had his name inscribed on the [[Stanley Cup]] even though he did not officially qualify. The following year he played a handful of games for the Lightning's AHL affiliate, the [[Springfield Falcons]], before [[Retirement|retiring]] and becoming assistant coach of the team. Rumble later became head coach of the [[Norfolk Admirals (AHL)|Norfolk Admirals]] of the [[American Hockey League]] (AHL), holding the position until January 2010.<br /> In 2013, he was an assistant coach for the [[Iceland men's national ice hockey team|Icelandic national hockey team]] at the IIHF World Championship Division II in Croatia.<br /> <br /> ==Playing career==<br /> As a youth, Rumble played in the 1982 [[Quebec International Pee-Wee Hockey Tournament]] with a [[minor ice hockey]] team from [[Barrie]].&lt;ref&gt;{{cite web|url=https://www.publicationsports.com/ressources/files/439/Joueurs_Pro.pdf|title=Pee-Wee players who have reached NHL or WHA|year=2018|website=Quebec International Pee-Wee Hockey Tournament|access-date=2019-01-18|archive-date=2019-03-06|archive-url=https://web.archive.org/web/20190306085544/https://www.publicationsports.com/ressources/files/439/Joueurs_Pro.pdf|url-status=dead}}&lt;/ref&gt;<br /> <br /> Rumble was selected 20th overall by the [[Philadelphia Flyers]] in the [[1987 NHL Entry Draft]]. Rumble turned professional with the [[Hershey Bears]] in 1989–90. He played three seasons with the Bears, managing three games with the Flyers. He was selected in the [[1992 NHL Expansion Draft]] by the [[Ottawa Senators]]. He played two seasons with Ottawa, before returning to the AHL with the [[Prince Edward Island Senators]]. In 1995, he became the property of the Flyers for the second time, and mostly played for their affiliates the Hershey Bears and the Philadelphia Phantoms. He did manage 15 games in the NHL.{{cn|date=January 2019}}<br /> <br /> In 1997, he left North America to play one season for the [[Adler Mannheim]] in the [[Deutsche Eishockey Liga]]. After that season, Rumble would spend the following seven seasons with various AHL and [[International Hockey League (1945-2001)|IHL]] teams, with occasional callups to NHL clubs [[St. Louis Blues]] and [[Tampa Bay Lightning]], including five games with the Lightning in the 2003–04 season for which the club won the [[Stanley Cup]]. His final playing season was with Springfield in 2004–05, becoming their assistant coach as a mid-season replacement. Over his career, Rumble played 193 NHL games, scoring 10 goals and 26 assists for 36 points.{{cn|date=January 2019}}<br /> <br /> ==Coaching career==<br /> <br /> In 2007, he joined the [[Norfolk Admirals (AHL)|Norfolk Admirals]] of the AHL as an assistant coach. In July 2008, the [[Tampa Bay Lightning]] named Rumble head coach of their [[American Hockey League|AHL]] affiliate, the [[Norfolk Admirals (AHL)|Norfolk Admirals]], replacing [[Steve Stirling]]. On January 12, 2010, he was fired by the Lightning. He subsequently became an assistant coach for the [[Seattle Thunderbirds]]. In July 2013, he was named head coach of the [[Moncton Wildcats]] (QMJHL).&lt;ref&gt;{{Cite web|url=http://moncton-wildcats.com/article/rumble-new-head-coach|title=Rumble new head coach – Moncton Wildcats}}&lt;/ref&gt;<br /> On October 18, 2023, he was named the interim head coach of the Owen Sound Attack, where he had been serving as an assistant coach, following the replacement of previous head coach [[Greg Walters]] on October 16. 
<br /> &lt;ref&gt; {{Cite web|url=http://chl.ca/ohl-attack/article/darren-rumble-named-interim-head-coach/}} &lt;/ref&gt; <br /> ==Career statistics==<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; style=&quot;text-align:center; width:60em&quot;<br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! colspan=&quot;3&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! colspan=&quot;5&quot; | [[Regular season]] <br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! colspan=&quot;5&quot; | [[Playoffs]] <br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! [[Season (sports)|Season]] <br /> ! Team <br /> ! League <br /> ! GP <br /> ! [[Goal (ice hockey)|G]] <br /> ! [[Assist (ice hockey)|A]] <br /> ! [[Point (ice hockey)|Pts]] <br /> ! [[Penalty (ice hockey)|PIM]] <br /> ! GP <br /> ! G <br /> ! A <br /> ! Pts <br /> ! PIM <br /> |-<br /> | 1985–86<br /> | [[Barrie Colts]]<br /> | [[Central Junior B Hockey League|CJHL]]<br /> | 46<br /> | 14<br /> | 32<br /> | 46<br /> | 91<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1986–87 OHL season|1986–87]]<br /> | [[Kitchener Rangers]]<br /> | [[Ontario Hockey League|OHL]]<br /> | 64<br /> | 11<br /> | 32<br /> | 43<br /> | 44<br /> | 4<br /> | 0<br /> | 1<br /> | 1<br /> | 9<br /> |- <br /> | [[1987–88 OHL season|1987–88]]<br /> | Kitchener Rangers<br /> | OHL<br /> | 55<br /> | 15<br /> | 50<br /> | 65<br /> | 64<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1988–89 OHL season|1988–89]]<br /> | Kitchener Rangers<br /> | OHL<br /> | 46<br /> | 11<br /> | 29<br /> | 40<br /> | 25<br /> | 5<br /> | 1<br /> | 0<br /> | 1<br /> | 2<br /> |- <br /> | [[1989–90 AHL season|1989–90]]<br /> | [[Hershey Bears]]<br /> | [[American Hockey League|AHL]]<br /> | 57<br /> | 2<br /> | 13<br /> | 15<br /> | 31<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1990–91 AHL season|1990–91]]<br /> | Hershey Bears <br /> | AHL<br /> | 73<br /> | 6<br /> | 35<br /> | 41<br /> | 48<br /> | 3<br /> | 0<br /> | 5<br /> | 5<br /> | 2<br /> |- <br /> | [[1990–91 AHL season|1990–91]]<br /> | [[Philadelphia Flyers]] <br /> | [[National Hockey League|NHL]]<br /> | 3<br /> | 1<br /> | 0<br /> | 1<br /> | 0<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1991–92 AHL season|1991–92]]<br /> | Hershey Bears <br /> | AHL<br /> | 79<br /> | 12<br /> | 54<br /> | 66<br /> | 118<br /> | 6<br /> | 0<br /> | 3<br /> | 3<br /> | 2<br /> |- <br /> | [[1992–93 AHL season|1992–93]]<br /> | [[New Haven Senators]] <br /> | AHL<br /> | 2<br /> | 1<br /> | 0<br /> | 1<br /> | 0<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1992–93 NHL season|1992–93]]<br /> | [[Ottawa Senators]]<br /> | NHL<br /> | 69<br /> | 3<br /> | 13<br /> | 16<br /> | 61<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[1993–94 AHL season|1993–94]]<br /> | [[PEI Senators]] <br /> | AHL<br /> | 3<br /> | 2<br /> | 0<br /> | 2<br /> | 0<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1993–94 NHL season|1993–94]]<br /> | Ottawa Senators<br /> | NHL<br /> | 70<br /> | 6<br /> | 9<br /> | 15<br /> | 116<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[1994–95 AHL season|1994–95]]<br /> | PEI Senators<br /> | 
AHL<br /> | 70<br /> | 7<br /> | 46<br /> | 53<br /> | 77<br /> | 11<br /> | 0<br /> | 6<br /> | 6<br /> | 4<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1995–96 AHL season|1995–96]]<br /> | Hershey Bears<br /> | AHL<br /> | 58 <br /> | 13 <br /> | 37 <br /> | 50<br /> | 83<br /> | 5<br /> | 0<br /> | 0<br /> | 0<br /> | 6<br /> |- <br /> | [[1995–96 NHL season|1995–96]]<br /> | Philadelphia Flyers<br /> | NHL<br /> | 5<br /> | 0<br /> | 0<br /> | 0<br /> | 4<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1996–97 AHL season|1996–97]]<br /> | [[Philadelphia Phantoms]]<br /> | AHL<br /> | 72<br /> | 18<br /> | 44<br /> | 62<br /> | 83<br /> | 7<br /> | 0<br /> | 3<br /> | 3<br /> | 19<br /> |- <br /> | [[1996–97 NHL season|1996–97]]<br /> | Philadelphia Flyers<br /> | NHL<br /> | 10<br /> | 0<br /> | 0<br /> | 0<br /> | 0<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1997–98 DEL season|1997–98]]<br /> | [[Adler Mannheim]]<br /> | [[Deutsche Eishockey Liga|DEL]]<br /> | 21<br /> | 2<br /> | 7<br /> | 9<br /> | 18<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[1997–98 IHL season|1997–98]]<br /> | [[San Antonio Dragons]]<br /> | [[International Hockey League (1945–2001)|IHL]]<br /> | 46<br /> | 7<br /> | 22<br /> | 29<br /> | 47<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1998–99 IHL season|1998–99]]<br /> | [[Grand Rapids Griffins]]<br /> | IHL<br /> | 53<br /> | 6<br /> | 22<br /> | 28<br /> | 44<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | 1998–99<br /> | [[Utah Grizzlies (1995–2005)|Utah Grizzlies]]<br /> | IHL<br /> | 10<br /> | 1<br /> | 4<br /> | 5<br /> | 10<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[1999–2000 IHL season|1999–2000]]<br /> | Grand Rapids Griffins<br /> | IHL<br /> | 29<br /> | 3<br /> | 10<br /> | 13<br /> | 20<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[1999–2000 AHL season|1999–2000]]<br /> | [[Worcester IceCats]] <br /> | AHL<br /> | 39<br /> | 0<br /> | 17<br /> | 17<br /> | 31<br /> | 9<br /> | 0<br /> | 2<br /> | 2<br /> | 6<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[2000–01 AHL season|2000–01]]<br /> | Worcester IceCats <br /> | AHL<br /> | 53<br /> | 6<br /> | 24<br /> | 30<br /> | 65<br /> | 8<br /> | 0<br /> | 1<br /> | 1<br /> | 10<br /> |- <br /> | [[2000–01 NHL season|2000–01]]<br /> | [[St. 
Louis Blues]] <br /> | NHL<br /> | 12<br /> | 0<br /> | 4<br /> | 4<br /> | 27<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[2001–02 AHL season|2001–02]]<br /> | Worcester IceCats <br /> | AHL<br /> | 60<br /> | 3<br /> | 29<br /> | 32<br /> | 48<br /> | 3<br /> | 0<br /> | 4<br /> | 4<br /> | 2<br /> |- <br /> | [[2002–03 AHL season|2002–03]]<br /> | [[Springfield Falcons]] <br /> | AHL<br /> | 33<br /> | 5<br /> | 17<br /> | 22<br /> | 18<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[2002–03 NHL season|2002–03]]<br /> | [[Tampa Bay Lightning]] <br /> | NHL<br /> | 19<br /> | 0<br /> | 0<br /> | 0<br /> | 6<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[2003–04 AHL season|2003–04]]<br /> | Hershey Bears<br /> | AHL<br /> | 5<br /> | 2<br /> | 0<br /> | 2<br /> | 6<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot; <br /> | [[2003–04 NHL season|2003–04]]<br /> | Tampa Bay Lightning<br /> | NHL<br /> | 5<br /> | 0<br /> | 0<br /> | 0<br /> | 2<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- <br /> | [[2004–05 AHL season|2004–05]]<br /> | Springfield Falcons<br /> | AHL<br /> | 10<br /> | 0<br /> | 1<br /> | 1<br /> | 4<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! colspan=&quot;3&quot; | AHL totals<br /> ! 614<br /> ! 77<br /> ! 317<br /> ! 394<br /> ! 612<br /> ! 52<br /> ! 0<br /> ! 24<br /> ! 24<br /> ! 51<br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! colspan=&quot;3&quot; | NHL totals<br /> ! 193<br /> ! 10<br /> ! 26<br /> ! 36<br /> ! 216<br /> ! —<br /> ! —<br /> ! —<br /> ! —<br /> ! —<br /> |- bgcolor=&quot;#e0e0e0&quot; <br /> ! colspan=&quot;3&quot; | IHL totals<br /> ! 138<br /> ! 17<br /> ! 58<br /> ! 75<br /> ! 121<br /> ! —<br /> ! —<br /> ! —<br /> ! —<br /> ! —<br /> |}<br /> <br /> ==Awards==<br /> * 1997 - [[Eddie Shore Award]] - AHL top defenceman<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==External links==<br /> * {{Ice hockey stats}}<br /> <br /> {{s-start}}<br /> {{succession box | before = [[Kerry Huffman]] | title = [[List of Philadelphia Flyers draft picks|Philadelphia Flyers' first round draft pick]] | years = [[1987 NHL Entry Draft|1987]] | after = [[Claude Boivin]]}}<br /> {{s-end}}<br /> <br /> {{DEFAULTSORT:Rumble, Darren}}<br /> [[Category:1969 births]]<br /> [[Category:Living people]]<br /> [[Category:Adler Mannheim players]]<br /> [[Category:Barrie Colts players]]<br /> [[Category:Canadian ice hockey defencemen]]<br /> [[Category:Grand Rapids Griffins (IHL) players]]<br /> [[Category:Hershey Bears players]]<br /> [[Category:Kitchener Rangers players]]<br /> [[Category:Moncton Wildcats coaches]]<br /> [[Category:National Hockey League first-round draft picks]]<br /> [[Category:New Haven Senators players]]<br /> [[Category:Ottawa Senators players]]<br /> [[Category:Philadelphia Flyers draft picks]]<br /> [[Category:Philadelphia Flyers players]]<br /> [[Category:Philadelphia Phantoms players]]<br /> [[Category:Prince Edward Island Senators players]]<br /> [[Category:St. 
Louis Blues players]]<br /> [[Category:San Antonio Dragons players]]<br /> [[Category:Ice hockey people from Barrie]]<br /> [[Category:Springfield Falcons players]]<br /> [[Category:Stanley Cup champions]]<br /> [[Category:Tampa Bay Lightning players]]<br /> [[Category:Utah Grizzlies (IHL) players]]<br /> [[Category:Worcester IceCats players]]<br /> [[Category:Canadian expatriate ice hockey players in Germany]]<br /> [[Category:Canadian ice hockey coaches]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Talk:Darren_Rumble&diff=1180784989 Talk:Darren Rumble 2023-10-18T20:41:59Z <p>205.189.94.9: /* 2013-2023 Coaching background.... */ new section</p> <hr /> <div>{{WikiProject banner shell|1=<br /> {{WikiProject Biography<br /> |living=yes<br /> |class=Start<br /> |listas=Rumble, Darren<br /> |sports-work-group=yes<br /> |sports-priority=low<br /> }}<br /> {{WikiProject Ice Hockey|bio=yes |class=Start|phi=yes|needs-photo=yes}}<br /> {{WikiProject Canada|sport=yes|on=yes |class=Start|importance=low}}<br /> | blp=yes<br /> }}<br /> <br /> == 2013-2023 Coaching background.... ==<br /> <br /> Just announced as Interim Head Coach of the Owen Sound Attack (OHL) where he was serving as an Assistant....<br /> SO.... [[Special:Contributions/205.189.94.9|205.189.94.9]] ([[User talk:205.189.94.9|talk]]) 20:41, 18 October 2023 (UTC)</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Rasashastra&diff=1179960952 Rasashastra 2023-10-13T15:48:52Z <p>205.189.94.9: /* See also */</p> <hr /> <div>{{Short description|Compilation of traditional ancient Indian medicine practice}}<br /> {{italic title}}<br /> {{Use dmy dates|date=December 2019}}<br /> {{POV|date=December 2020}}<br /> {{Alternative medicine sidebar}}<br /> In [[Ayurvedic medicine]], the compilation of traditional ancient Indian medicine practice is called '''''rasaśāstra''''' '''(रसशासत्र )''', which details processes by which various [[metal]]s, [[minerals]] and other substances, including [[mercury (element)|mercury]], are purified and combined with herbs in an attempt to treat illnesses.&lt;ref&gt;{{cite web|url=http://www.shrifreedom.org/alchemy.html |archive-url=https://web.archive.org/web/20121114192941/http://www.shrifreedom.org/alchemy.html |url-status=dead |archive-date=2012-11-14 |title=Rasa Shastra – Freedom Vidya |website=Shrifreedom.org}}&lt;/ref&gt;&lt;ref&gt;{{cite book|url=https://books.google.com/books?id=qo0VPGr0RF4C&amp;pg=PA26|title=Scientific Basis for Ayurvedic Therapies |date=2003 |isbn=9780203498583 |last1=Mishra |first1=Lakshmi C. 
}}&lt;/ref&gt; Rasashastra is a pharmaceutical branch of the Indian system of medicine that deals mainly with metals, minerals, products of animal origin and toxic herbs, and with their use in therapeutics.&lt;ref&gt;{{Cite journal|last1=Savrikar|first1=S|last2=Ravishankar|first2=B|date=2011-07-15|title=Introduction to 'Rasashaastra'- The Iatrochemistry of Ayurveda|url=http://www.ajol.info/index.php/ajtcam/article/view/67955|journal=African Journal of Traditional, Complementary and Alternative Medicines|volume=8|issue=5S|pages=66–82|doi=10.4314/ajtcam.v8i5S.1|pmid=22754059|issn=0189-6016|pmc=3252715}}&lt;/ref&gt;<br /> <br /> == History of development ==<br /> <br /> The credit for developing ''rasashastra'' as a stream of classical Ayurveda, especially in fulfilling its healthcare-related goals, goes to [[Nāgārjuna]] (5th century CE).&lt;ref&gt;{{cite journal|last1=Savrikar|first1=SS|last2=Ravishankar|first2=B|title=Introduction to 'Rasashaastra' the Iatrochemistry of Ayurveda.|journal=African Journal of Traditional, Complementary and Alternative Medicines|date=2011|volume=8|issue=5 Suppl|pages=66–82|pmid=22754059|doi=10.4314/ajtcam.v8i5S.1|pmc=3252715}}&lt;/ref&gt;<br /> <br /> == Classical textbooks ==<br /> <br /> === ''Rasendra Mangala'' ===<br /> It was composed by Nāgārjuna Siddha in the Sanskrit language. P. C. Rây considered this work (which he erroneously called ''Rasaratnākara''&lt;ref&gt;{{cite journal|last=Wujastyk|first=Dominik|date=July 1984|title=An Alchemical Ghost: The Rasaratnâkara by Nâgârjuna|journal=Ambix|volume=31|issue=2|pages=70–83|doi=10.1179/amb.1984.31.2.70|pmid=11615977}}&lt;/ref&gt;) to be amongst the earliest surviving alchemical works, perhaps from as early as the 7th or 8th century.&lt;ref&gt;{{cite book|url=https://archive.org/details/b24871734_0002|title=A history of Hindu chemistry from the earliest times to the middle of the sixteenth century, A.D|last=Ray|first=Prafulla Chandra|date=1907|publisher=London : Williams and Norgate|others=Wellcome Library}}&lt;/ref&gt; The text actually dates to the 12th century.&lt;ref&gt;{{Cite book|last=White|first=David Gordon|title=The Alchemical Body|publisher=Chicago|pages=166}}&lt;/ref&gt; Rasendramangala originally comprised eight chapters, only four of which are found in the manuscripts available today. Manuscripts of the work are found at Gujarat Ayurveda University, Jamnagar, at Rajasthan Prachya Vidya Pratishthan, Govt. office Bikaner and elsewhere. An edition and translation was published in 2003.&lt;ref&gt;{{cite book|title=Nāgārjunaviracitam Rasendra Maṅgalam ... Anuvādaka Kavirāja Hari {Ś}aṅkara {Ś}armā = Rasendramaṅgalam of Nāgārjuna Edited with Aihore Hindiī Vimarśa, Bhāvānuvāva and English Translation and Notes (chapters 1–4)|publisher=Chaukhambha Orientalia|year=2003|editor-last=Śarmā|editor-first=Hari Śaṅkar|location=Vārāṇasī}}&lt;/ref&gt;<br /> <br /> === ''Rasa Hridaya Tantra'' ===<br /> It was created by Shrimad Govind Bhagvatapad, guru of Shankaracharya, around the 10th century.&lt;ref&gt;{{Cite book|last=Meulenbeld|first=Jan|title=History of Indian Medical Literature|pages=616–621}}&lt;/ref&gt; It contains an elaborate description of ''dhatuvada'' (metallurgical processes intended to transform mercury into higher metals such as gold or silver). 
A Sanskrit commentary on this text was contributed by Shri Chaturbhuj Mishra under the name of ''Mugdhavabodhini.''<br /> <br /> === ''Rasarnava'' ===<br /> Written by Bhairava, it was edited and published in 1908–1910.&lt;ref&gt;{{cite book|title=The Ras̄ārnavam: or the ocean of mercury and other metals and minerals|last1=Kaviratna|first1=Hariścandra|last2=Ray|first2=Praphulla Chandra|date=1910|publisher=Asiatic Society|location=Calcutta|language=en|oclc = 700001604}}&lt;/ref&gt;<br /> <br /> === ''Rasa Prakasha Sudhakara'' ===<br /> [[File:Yadav Ji Trikam Ji Acharya 2019 stamp of India.jpg|thumb|Acharya Yadavji Trikam on a 2019 stamp of India]]<br /> It is a 13th-century text by Acharya Yashodhar Bhatt.&lt;ref&gt;{{Cite journal|title=A CRITICAL BOOK REVIEW ON RASA PRAKASHA SUDHAKARA {{!}} Journal of Ayurveda and Holistic Medicine (JAHM)|date=16 April 2021 |url=http://jahm.co.in/index.php/jahm/article/view/289|language=en-US}}&lt;/ref&gt; It was first published by Acharya Yadavji Trikam in 1910. Its second edition was published in 1912 under the guidance of Shri Jivaram Kalidas Vyas. It contains 13 chapters describing both Lauhavad (the use of metallurgical processes to convert lower metals into higher metals) and Chikitsavad (the use of metals and minerals for therapeutic purposes). It also describes the origin of mercury, its properties, and the 18 samskaras (processing techniques) of mercury, as well as the tools and techniques involved in them.<br /> <br /> === ''Rasendra Chudamani'' ===<br /> It was created by Acharya Somadeva in approximately the 12th–13th century. It contains 13 chapters which give an elaborate description of mercury and its processing for medicinal use, together with definitions and descriptions of equipment and ''Puta'' (temperature requirements and regulation during processing). It can be inferred that the Gurukula system of education was prevalent at the time of its creation, as it contains a description of the ''Shishyopnayana samskara'' (a ritual performed at the time of admission). Specific contributions of this text include the high degree of organization followed during compilation and the first description of 64 ''divya-aushadis'' (drugs with miraculous effects).<br /> <br /> === ''Rasa Ratna Samuccaya'' ===<br /> It was created by Vāgbhaṭa, son of Vaidyapati Siṃhagupta, around the end of the 13th century or the beginning of the 14th century. It contains around 30 chapters and is considered one of the best treatises written in the field of ''rasashastra''. It contains a vivid description of Yantras (tools and equipment) and Puta (temperature-related processing details), along with the classification of metals and minerals into Rasa, Uprasa, Lauha, Dhatu, Updhatu, etc., and their processing details. It also describes clinical aspects of Rasa aushadhis. However, it is not considered an original text; rather, it is regarded as a compilation of the works of other Acharyas. Though it may contain some original work by Vagbhatta, this is difficult to distinguish. It derives much of its content from the Rasendrachudamani of Somadeva. Rasaprabha and Vijnanbodhini are two of the available Hindi commentaries; Saralarthprakashini, by Sriyut Shastri Khare, is one of the available Sanskrit commentaries.<br /> <br /> ==Methods==<br /> The methods of ''rasashastra'' are contained in a number of Ayurvedic texts, including the ''[[Charaka Samhita]]'' and ''[[Susruta Samhita]]''. An important feature is the use of metals, including several that are considered to be [[toxicity|toxic]] in [[evidence-based medicine]].
In addition to mercury, [[gold]], [[silver]], [[iron]], [[copper]], [[tin]], [[lead]], [[zinc]] and [[bell metal]] are used. In addition to these metals, [[salt]]s and other substances such as [[coral]], [[seashell]]s, and [[feather]]s are also used.&lt;ref&gt;Mishra, p. 86&lt;/ref&gt;<br /> <br /> The usual means used to administer these substances is by preparations called ''bhasma'', [[Sanskrit]] for &quot;ash&quot;. [[Calcination]], which is described in the literature of the art as ''shodhana'', &quot;purification&quot;, is the process used to prepare these ''bhasma'' for administration. [[Sublimation (phase transition)|Sublimation]] and the preparation of a mercury [[sulfide]] are also in use in the preparation of its ''materia medica''. A variety of methods are used to achieve this. One involves the heating of thin sheets of metal and then immersing them in oil (''taila''), extract (''takra''), cow urine (''gomutra'') and other substances.&lt;ref&gt;Mishra, pp. 86–88&lt;/ref&gt; Others are calcined in [[crucible]]s heated with fires of cow dung (''puttam'').&lt;ref&gt;Mishra, pp. 87–88&lt;/ref&gt; Ayurvedic practitioners believe that this process of purification removes undesirable qualities and enhances their therapeutic power.&lt;ref&gt;Mishra, pp. 88&lt;/ref&gt;<br /> <br /> ==Toxicity==<br /> Modern medicine finds that [[mercury poisoning|mercury is inherently toxic]], and that its toxicity is not due to the presence of impurities. While mercury does have anti-microbial properties, and used to be [[Mercury (element)#Medicine|widely used]] in Western medicine, its toxicity does not warrant the risk of using it as a health product in most circumstances.&lt;ref&gt;[https://web.archive.org/web/20090507131958/http://www.atsdr.cdc.gov/toxprofiles/tp46.html Toxicological Profile for Mercury]. U.S. Agency for Toxic Substances and Disease Registry. March 1999&lt;/ref&gt; The [[Centers for Disease Control and Prevention]] have also reported a number of cases of lead poisoning associated with Ayurvedic medicine.&lt;ref&gt;{{cite web|url=https://www.cdc.gov/mmwr/preview/mmwrhtml/mm5326a3.htm |title=Lead Poisoning Associated with Ayurvedic Medications – Five States, 2000–2003 |website=Cdc.gov |access-date=25 February 2015}}&lt;/ref&gt; Other incidents of [[heavy metal poisoning]] have been attributed to the use of ''rasashastra'' compounds in the United States, and [[arsenic]] has also been found in some of the preparations, which have been marketed in the [[United States]] under trade names such as &quot;AyurRelief&quot;, &quot;GlucoRite&quot;, &quot;Acnenil&quot;, &quot;Energize&quot;, &quot;Cold Aid&quot;, and &quot;Lean Plus&quot;.&lt;ref&gt;Hammett-Stabler, Catherine A. (2011) ''Herbal Supplements: Efficacy, Toxicity, Interactions with Western Drugs, and Effects on Clinical Laboratory Tests''. John Wiley and Sons. pp. 202–205. {{ISBN|0-470-43350-7}}&lt;/ref&gt;<br /> <br /> Ayurvedic practitioners claim that these reports of toxicity are due to failure to follow traditional practices in the mass production of these preparations for sale,&lt;ref&gt;Hammett-Stabler, Catherine A. (2011) ''Herbal Supplements: Efficacy, Toxicity, Interactions with Western Drugs, and Effects on Clinical Laboratory Tests''. John Wiley and Sons. pp. 205–206. {{ISBN|0-470-43350-7}}&lt;/ref&gt; but modern science finds that not only mercury, but also [[lead poisoning|lead is inherently toxic]]. 
The [[government of India]] has ordered that Ayurvedic products must specify their metallic content directly on the labels of the product;&lt;ref name=&quot;Valiathan06&quot; /&gt; however, M. S. Valiathan noted that &quot;the absence of post-market surveillance and the paucity of test laboratory facilities [in India] make the quality control of Ayurvedic medicines exceedingly difficult at this time.&quot;&lt;ref name=&quot;Valiathan06&quot;&gt;{{cite journal|last=Valiathan|first=M. S.|title=Ayurveda: Putting the House in Order|journal=Current Science|volume=90|issue=1|pages=5–6|year=2006|url=http://www.currentscience.ac.in/php/toc.php?vol=090&amp;issue=01}}&lt;/ref&gt;<br /> <br /> ==See also==<br /> <br /> * [[History of metallurgy in the Indian subcontinent]]<br /> * [[Mercury poisoning]]<br /> * [[Rasayana]]<br /> * [[Raseśvara]]<br /> * [[Siddha medicine]]<br /> <br /> ==References==<br /> {{Reflist}}<br /> <br /> {{Ayurveda}}<br /> <br /> {{DEFAULTSORT:Rasashastra}}<br /> [[Category:Ayurveda]]<br /> [[Category:Alchemical traditions]]<br /> [[Category:Alternative medicine]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Betty_Davis&diff=1179554266 Betty Davis 2023-10-10T22:01:37Z <p>205.189.94.9: /* Retirement */</p> <hr /> <div>{{Short description|American singer, songwriter, and model (1944–2022)}}<br /> {{About|the singer|her 1973 album|Betty Davis (album)|the actress|Bette Davis|other people}}<br /> {{Use mdy dates|date=February 2022}}<br /> {{Infobox musical artist<br /> | name = Betty Davis <br /> | image = Photo of Betty Davis.jpg<br /> | caption =<br /> | image_size = &lt;!-- Only for images narrower than 220 pixels. Set the value as a number without &quot;px&quot;. --&gt;<br /> | birth_name = Betty Gray Mabry<br /> | alias =<br /> | birth_date = {{birth date|1944|7|26}}<br /> | birth_place = [[Durham, North Carolina]], U.S.<br /> | death_date = {{death date and age|2022|2|9|1944|7|26}}<br /> | death_place = [[Homestead, Pennsylvania]], U.S.<br /> | origin = [[New York City]], NY, U.S.<br /> | instrument = Vocals<br /> | label = {{hlist|[[Columbia Records|Columbia]]|[[Just Sunshine Records|Just Sunshine]]|[[Island Records|Island]]|[[Light in the Attic Records|Light in the Attic]]}}<br /> | genre = {{flatlist|<br /> * [[Funk]]<br /> * [[Rhythm and blues|R&amp;B]]<br /> * [[soul music|soul]]<br /> * rock<br /> }}<br /> | occupation = {{hlist|Singer|songwriter|model}}<br /> | associated_acts = {{hlist|[[Miles Davis]]|[[The Chambers Brothers]]}}<br /> | years_active = {{hlist|1964–1979|2019}}<br /> }}<br /> <br /> '''Betty Davis''' (born '''Betty Gray Mabry'''; July 26, 1944 &amp;ndash; February 9, 2022) was an American singer, songwriter, and model. She was known for her controversial sexually oriented lyrics and performance style and was the second wife of trumpeter [[Miles Davis]].&lt;ref name=nytobit&gt;{{Cite news|url=https://www.nytimes.com/2022/02/10/arts/music/betty-davis-dead.html|title=Betty Davis, Raw Funk Innovator, Is Dead at 77|last=Pareles|first=Jon|newspaper=The New York Times|date=February 10, 2022|access-date=February 10, 2022}}&lt;/ref&gt; Her [[AllMusic]] profile describes her as &quot;a wildly flamboyant funk diva with few equals ...
[who] combined the gritty emotional realism of [[Tina Turner]], the futurist fashion sense of [[David Bowie]], and the trendsetting flair of Miles Davis&quot;.&lt;ref name=allmusic&gt;{{AllMusic|class=artist|id=p16873}}&lt;/ref&gt;<br /> <br /> == Early life ==<br /> Betty Gray Mabry was born in [[Durham, North Carolina]], on July 26, 1944.&lt;ref&gt;{{cite news |title=Betty Davis obituary |url=https://www.thetimes.co.uk/article/betty-davis-obituary-hvmrnr66j |access-date=February 11, 2022 |work=The Times |date=February 10, 2022}}&lt;/ref&gt;&lt;ref&gt;{{cite news|url=http://www.post-gazette.com/ae/music/2019/07/22/Betty-Davis-interview-funk-legend-Homestead-Miles-A-Little-Bit-Hot-Tonight/stories/201907220086|title=Betty Davis, a funk icon living in Homestead, releases first song in nearly 40 years |last=Mervis|first=Scott|work=[[Pittsburgh Post-Gazette]]|date=July 22, 2019|access-date=July 22, 2019|url-access=subscription}}&lt;/ref&gt; She developed an interest in music when she was about ten, and was introduced to various blues musicians by her grandmother, Beulah Blackwell, while staying at her farm in [[Reidsville, North Carolina|Reidsville]].&lt;ref name=MOJO05/&gt; At 12, she wrote one of her first songs, &quot;I'm Going to Bake That Cake of Love&quot;.&lt;ref&gt;{{Cite book|url=https://books.google.com/books?id=F91KDwAAQBAJ&amp;q=%22I%E2%80%99m+Going+to+Bake+That+Cake+of+Love.%22&amp;pg=PT296|title=Women Who Rock: Bessie to Beyonce. Girl Groups to Riot Grrrl|last=McDonnell|first=Evelyn|publisher=Black Dog &amp; Leventhal|year=2018|isbn=978-0316558877|location=New York|quote=She penned her first song 'I'm going to bake that Cake of Love' when she was 12 years old.}}&lt;/ref&gt; The family relocated to [[Homestead, Pennsylvania]], so her father, Henry Mabry, could work at a [[Pennsylvania]] steel mill. Davis attended and graduated Homestead High School.&lt;ref&gt;{{cite web|url=https://www.thisisdig.com/betty-davis-funk-pioneer-dies-aged-77/ |first=Joe |last=Tiller|title=Betty Davis, Funk Pioneer, Dies Aged 77|work=Dig!|date=February 10, 2022|access-date= February 10, 2022}}&lt;/ref&gt; She decided to pursue a career in showbusiness after watching her father dance like [[Elvis Presley]].&lt;ref name=MOJO05&gt;{{cite web|url=https://www.rocksbackpages.com/Library/Article/betty-davis-shes-gotta-have-it|title=Betty Davis: She's Gotta Have It|first=James|last=Maycock|date=February 2005|work=MOJO|via=[[Rock's Backpages]]|access-date=February 10, 2022}}&lt;/ref&gt;<br /> <br /> ==Career ==<br /> When she was 16, Betty left Homestead for New York City, enrolling at the [[Fashion Institute of Technology]] (FIT) while living with her aunt. She soaked up the [[Greenwich Village]] culture and [[folk music]] of the early 1960s. She associated herself with frequenters of the Cellar, a hip uptown club where young and stylish people congregated. It was a multiracial, artsy crowd of models, design students, actors, and singers. At the Cellar she played records and chatted people up. 
She was a friend and early muse to fashion designer [[Stephen Burrows (designer)|Stephen Burrows]], who also studied at the FIT at the time.&lt;ref&gt;{{Cite book|last=Mahon|first=Maureen|url=https://www.worldcat.org/oclc/1141516276|title=Black diamond queens : African American women and rock and roll|date=2020|isbn=978-1-4780-1019-7|location=Durham|oclc=1141516276}}&lt;/ref&gt; She also worked as a model, appearing in photo spreads in ''[[Seventeen (American magazine)|Seventeen]]'', ''[[Ebony (magazine)|Ebony]]'' and ''[[Glamour (magazine)|Glamour]]''.&lt;ref&gt;{{cite web|url=http://www.soulwalking.co.uk/Betty%20Davis.html |title=Betty Davis|publisher=Soulwalking.co.uk|access-date=June 10, 2012}}&lt;/ref&gt;<br /> <br /> In New York, she met musicians including [[Jimi Hendrix]] and [[Sly Stone]].&lt;ref name=stylist&gt; Crowhurst, Anna-Marie (March 7, 2018) [https://www.stylist.co.uk/visible-women/betty-davis-funk-soul-singer-feminist-history/194313 &quot;Forgotten Women: The taboo-smashing queen of funk&quot;], ''Stylist.co.uk''. Retrieved February 10, 2022&lt;/ref&gt; The seeds of her musical career were planted through her friendship with soul singer [[Lou Courtney]], who reputedly produced her first single, &quot;The Cellar&quot;, though the existence of that record has been questioned.&lt;ref name=ontherecord&gt;[http://ontherecordshow.blogspot.com/2011/11/betty-mabry-get-ready-for-betty.html &quot;Betty Mabry: 'Get Ready for Betty'&quot;, ''On the Record'', November 8, 2011]. Retrieved February 10, 2022&lt;/ref&gt; She secured a contract with [[Don Costa]], who had written arrangements for [[Frank Sinatra]].&lt;ref name=MOJO05/&gt; As Betty Mabry, she recorded &quot;Get Ready For Betty&quot; b/w &quot;I'm Gonna Get My Baby Back&quot; in 1964 for Costa's DCP International label.&lt;ref name=ontherecord/&gt; Around the same time, she recorded a single, &quot;I'll Be There&quot;, with Roy Arlington for Safice Records, under the joint name &quot;Roy and Betty&quot;.&lt;ref&gt;[https://www.soulandjazzandfunk.com/news/betty-davis-rip/ Charles Waring, &quot;Betty Davis RIP&quot;, ''Soul&amp;Jazz&amp;Funk'', February 10, 2022]. Retrieved February 10, 2022&lt;/ref&gt;<br /> <br /> Her first professional gig came after she wrote &quot;Uptown (to Harlem)&quot; for the [[The Chambers Brothers]]. Their 1967 album was a major success, but Mabry focused on her modeling career. She was successful as a model but felt bored by the work—&quot;I didn't like modeling because you didn't need brains to do it. It's only going to last as long as you look good.&quot;&lt;ref&gt;Wang, O., liner notes to ''They Say I’m Different'', Betty Davis, [[Michael Lang (producer)#Just Sunshine Records|Just Sunshine]] – JSS-3500, LP, 1974.&lt;/ref&gt;&lt;ref&gt;Wang, O., [https://magazine.vinylmeplease.com/magazine/betty-davis-liner-notes/ &quot;The Music and Mystique of Betty Davis—Read the Liner Notes to the Funk Singer’s Debut LP&quot;], ''vinylmeplease.com'', June 22, 2017.&lt;/ref&gt;<br /> <br /> In 1968, when she was in a relationship with [[Hugh Masekela]], she recorded several songs for [[Columbia Records]], with Masekela doing the arrangements.&lt;ref name=&quot;pitchfork Nate Patrin 2016&quot; /&gt; Two of them were released as a single: &quot;Live, Love, Learn&quot; b/w &quot;It's My Life&quot;. Her relationship with Miles Davis began soon after her breakup from Masekela. 
She featured on the cover of Miles Davis' album ''[[Filles de Kilimanjaro]]'', which included his tribute to her, &quot;Mademoiselle Mabry&quot;, and she introduced him to [[psychedelic rock]] and the flamboyant clothing styles of the era.&lt;ref name=allmusic/&gt;&lt;ref name=stylist/&gt; In the spring of 1969, Betty returned to Columbia's 52nd St. Studios to record a series of demo tracks, with Miles and [[Teo Macero]] producing. At least five songs were taped during those sessions, three of which were Mabry originals, two of which were covers of [[Cream (band)|Cream]] and [[Creedence Clearwater Revival]]. Miles attempted to use these demo songs to secure an album deal for Betty, but neither Columbia nor Atlantic were interested and they were archived until 2016, when they were released in the compilation ''The Columbia Years, 1968–1969'' by Seattle's [[Light in the Attic Records]].&lt;ref name=&quot;pitchfork Evan Minsker 2016&quot;&gt;{{Cite web|first1=Evan |last1=Minsker |date=June 28, 2016|title=Lost Betty Davis 1969 Sessions With Miles Davis Released|url=https://pitchfork.com/news/66427-lost-betty-davis-1969-sessions-with-miles-davis-released/|access-date=February 10, 2022|website=Pitchfork|language=en-US}}&lt;/ref&gt;<br /> <br /> After the end of her marriage with Miles, Betty moved to London, probably around 1971, to pursue her modeling career. She wrote music while in the UK and, after about a year, returned to the US with the intention of recording songs with [[Santana (band)|Santana]]. Instead, she recorded her own songs with a group of West Coast funk musicians including [[Larry Graham]], [[Greg Errico]], the [[Pointer Sisters]], and members of [[Tower of Power]].&lt;ref name=allmusic/&gt; Davis wrote and arranged all her songs.&lt;ref name=&quot;:1&quot; /&gt; Her first record, ''[[Betty Davis (album)|Betty Davis]]'', was released in 1973. She released two more studio albums, ''[[They Say I'm Different]]'' (1974)&lt;ref&gt;{{Cite web|url=https://pitchfork.com/reviews/albums/11903-betty-davis-they-say-im-different/|title=Betty Davis: Betty Davis / They Say I'm Different|website=Pitchfork|date=May 22, 2007 |access-date=December 23, 2018}}&lt;/ref&gt; and her major label debut on Island Records ''[[Nasty Gal (album)|Nasty Gal]]'' (1975). None of the three albums were a commercial success,&lt;ref name=allmusic/&gt; but she had two minor hits on the ''[[Billboard (magazine)|Billboard]]'' [[R&amp;B chart]]: &quot;If I'm in Luck I Might Get Picked Up&quot;, which reached no. 66 in 1973, and &quot;Shut Off the Lights&quot;, which reached no. 97 in 1975.&lt;ref name=&quot;:4&quot;&gt;{{Cite magazine|url=https://www.billboard.com/music/betty-davis|title=Betty Davis Chart History|magazine=Billboard}}&lt;/ref&gt;&lt;ref name=&quot;whitburnr&amp;b&quot;&gt;{{cite book |title= Top R&amp;B/Hip-Hop Singles: 1942–1995|last=Whitburn |first=Joel |author-link=Joel Whitburn |year=1996 |publisher=Record Research |page=104}}&lt;/ref&gt;<br /> <br /> Davis remained a cult figure as a singer, due in part to her unabashedly sexual lyrics and performance style, which were both controversial for the time. She had success in Europe, but in the U.S. she was barred from performing on television because of her sexually aggressive stage persona.&lt;ref&gt;{{Cite journal|date=April 15, 1976|title=Her Act Too Spicy For U.S. 
Tastes; Betty Davis Finds Success In Europe|url=https://books.google.com/books?id=dcADAAAAMBAJ&amp;q=betty+davis+miles+davis+jet&amp;pg=PA57|journal=Jet|volume= 50| issue = 4|pages=57}}&lt;/ref&gt; Some of her shows were boycotted, and her songs were not played on the radio due to pressure by religious groups and the [[NAACP]].&lt;ref&gt;{{cite journal |last=Mahon|first=Maureen|title=They Say She's Different: Race, Gender, Genre, and the Liberated Black Femininity of Betty Davis|journal=Journal of Popular Music Studies|volume=23|issue=2|pages=146–165|date=June 15, 2011|doi=10.1111/j.1533-1598.2011.01277.x}}&lt;/ref&gt; [[Carlos Santana]] recalled Betty as &quot;indomitable – she couldn't be tamed. Musically, philosophically and physically, she was extreme and attractive.&quot;&lt;ref name=&quot;:0&quot; /&gt;<br /> <br /> ==Retirement==<br /> In 1976, Davis completed another album for Island Records (which was shelved and unreleased for 33 years), before being dropped by the label. She spent a year in Japan, spending time with silent monks.&lt;ref name=RS22/&gt; <br /> <br /> In 1980, Davis' father died, which prompted her return to the US to live with her mother in Homestead, Pennsylvania. Davis struggled to overcome her father's death, and subsequent mental illness. She acknowledged that she suffered a setback at the time, but stayed in Homestead, accepted the end of her career, and lived a quiet life.&lt;ref name=WP18&gt;{{cite news| url = http://www.washingtonpost.com/entertainment/music/after-40-years-betty-davis-70s-queen-of-funk-and-wife-of-miles-is-rediscovered/2018/09/11/a878104a-b486-11e8-a2c5-3187f427e253_story.html| title = 'I didn't just fade off the planet.' Reconnecting with '70s funk queen Betty Davis| newspaper = [[The Washington Post]]|date=September 11, 2018}}&lt;/ref&gt;<br /> <br /> The tracks from Davis' final recording sessions in 1979 were released on two bootleg albums, ''Crashin' From Passion'' (1995) and ''Hangin' Out in Hollywood'' (1996).&lt;ref name=&quot;:5&quot;&gt;{{cite book|url=https://www.worldcat.org/oclc/52312236|title=All Music Guide to Soul: The Definitive Guide to R&amp;B and Soul|date=2003|publisher=Hal Leonard Corporation|isbn=0879307447|editor1-last=Bogdanov|editor1-first=Vladimir|location=Ann Arbor, MI|page=178|oclc=52312236|access-date=February 11, 2022}}&lt;/ref&gt; A [[greatest hits]] album, ''Anti Love: The Best of Betty Davis,'' was also released in 1995.&lt;ref&gt;{{Citation|title=Betty Davis - Anti Love - The Best Of Betty Davis|date=September 6, 1995 |url=https://www.discogs.com/release/7812247-Betty-Davis-Anti-Love-The-Best-Of-Betty-Davis|language=en}}&lt;/ref&gt;<br /> <br /> In 2007, ''Betty Davis'' (1973) and ''They Say I'm Different'' (1974) were reissued by [[Light in the Attic Records]].&lt;ref name=&quot;1979a&quot;&gt;{{Cite web|last=Matos|first=Michaelangelo|date=June 14, 2007|title=Why lost funk queen Betty Davis doesn't live up to the hype.|url=https://slate.com/culture/2007/06/why-lost-funk-queen-betty-davis-doesn-t-live-up-to-the-hype.html|access-date=February 10, 2022|website=Slate Magazine|language=en}}&lt;/ref&gt; In 2009, the label reissued ''[[Nasty Gal (album)|Nasty Gal]]'' and her unreleased fourth studio album recorded in 1976, re-titled ''Is It Love or Desire?'' Both reissues contained extensive liner notes and shed some light on the mystery of why her fourth album, considered possibly to be her best work by members of her last band ([[Herbie Hancock]], [[Chuck Rainey]], and [[Alphonse Mouzon]]), was shelved and 
remained unreleased for 33 years.&lt;ref&gt;{{cite news |last1=Condon |first1=Dan |title=Betty Davis, the controversial queen of raw funk, has died at 76 |url=https://www.abc.net.au/doublej/music-reads/music-news/betty-davis-died-funk-soul-76-miles-davis-chambers-brothers-hug/13748796 |access-date=February 11, 2022 |work=ABC.au |date=February 10, 2022}}&lt;/ref&gt;<br /> <br /> In 2017, an independent documentary directed by Phil Cox entitled ''Betty: They Say I'm Different'' was released, which renewed interest in her life and music career &lt;ref&gt;https://www.imdb.com/title/tt8267204/&lt;/ref&gt;&lt;ref&gt;{{Cite web|url=https://www.youtube.com/watch?v=NebguvtS_Y0 |archive-url=https://ghostarchive.org/varchive/youtube/20211222/NebguvtS_Y0 |archive-date=December 22, 2021 |url-status=live|title=Betty Davis – They Say I'm Different Symposium at NYU Tandon May 25, 2018|via=YouTube}}{{cbignore}}&lt;/ref&gt;<br /> When Cox tracked Davis down, he found her living in the basement of a house with no internet, cell phone, or car. He said: &quot;This wasn't a woman with riches or luxury. She was living on the bare essentials.&quot;&lt;ref name=WP18/&gt;<br /> <br /> In 2019, Davis released &quot;A Little Bit Hot Tonight&quot;, her first new song in over 40 years, which was performed and sung by Danielle Maggio, an [[ethnomusicologist]] who was a close friend and associate producer on ''Betty: They Say I'm Different''.&lt;ref&gt;{{Cite web|url=http://www.thewire.co.uk/news/55837/betty-davis-releases-first-song-in-40-years|title=Betty Davis releases first new song since 1979|website=Thewire.co.uk|date=July 24, 2019|access-date=February 18, 2020}}&lt;/ref&gt;<br /> <br /> In 2023, to mark the 50th anniversary of Betty Davis’ self-titled debut, [[Light in the Attic Records]] reissued three of her albums: ''Betty Davis'', ''They Say I’m Different'', ''Is It Love Or Desire?'', as well as the first official release of Davis’ final 1979 sessions, ''Crashin’ From Passion''. &lt;ref&gt;https://lightintheattic.net/shelves/betty%20davis%20forever&lt;/ref&gt; &lt;ref&gt;https://www.rollingstone.com/music/music-news/betty-davis-four-essential-albums-reissued-1234776098/&lt;/ref&gt;<br /> <br /> == Personal life and death ==<br /> As a model in 1966, Betty met jazz musician [[Miles Davis]], who was 19 years her senior.&lt;ref name=&quot;:3&quot;&gt;{{Cite journal|date=October 17, 1968|title=One Of Sexiest Men Alive|url=https://books.google.com/books?id=hTgDAAAAMBAJ&amp;q=betty+davis+miles+davis+jet+1968&amp;pg=PA48|journal=Jet|volume= 35| issue = 2|pages=48}}&lt;/ref&gt; He was separated from his first wife, dancer [[Frances Taylor Davis|Frances Davis]], and was dating actress [[Cicely Tyson]]. Betty began dating Miles in early 1968, and they were married that September.&lt;ref name=&quot;:3&quot; /&gt; During their year of marriage, she introduced him to the fashions and popular music trends of the era that influenced his music. 
In his autobiography, Miles credited Betty with helping to plant the seeds of his further musical explorations by introducing the trumpeter to [[psychedelic rock]] guitarist [[Jimi Hendrix]] and funk innovator [[Sly Stone]].&lt;ref name=&quot;pitchfork Nate Patrin 2016&quot; /&gt; The Miles Davis album ''[[Filles de Kilimanjaro]]'' (1968) features Betty on the cover and includes a song named after her.&lt;ref&gt;{{cite news |last1=Kenny |first1=Jack |title=MILES DAVIS - Filles De Kilimanjaro: A Re-evaluation |url=https://www.jazzviews.net/miles-davis---filles-de-kilimanjaro-a-re-evaluation.html |access-date=February 11, 2022 |work=Jazz Views}}&lt;/ref&gt;<br /> <br /> In his autobiography, Miles said Betty was &quot;too young and wild&quot;, and accused her of having an affair with Jimi Hendrix, which hastened the end of their marriage.&lt;ref&gt;{{Cite book|title=Miles: The Autobiography|author1=Davis, Miles|author2=Troupe, Quincy|publisher=Simon &amp; Schuster|year=1990|isbn=978-0-671-72582-2|url-access=registration|url=https://archive.org/details/milesautobiograp0000davi}}&lt;/ref&gt; Betty denied the affair stating, &quot;I was so angry with Miles when he wrote that. It was disrespectful to Jimi and to me. Miles and I broke up because of his violent temper.&quot;&lt;ref name=&quot;:0&quot;&gt;{{Cite web|url=https://www.theguardian.com/music/2010/sep/05/miles-davis-bitches-brew-reissue|title=Miles Davis: The muse who changed him, and the heady Brew that rewrote jazz|last=Spencer|first=Neil|author-link=Neil Spencer|date=September 4, 2010|website=The Obserrver}}&lt;/ref&gt; After accusing her of adultery, he filed for divorce in 1969.&lt;ref&gt;{{Cite book|title=Miles : The Autobiography|last=Miles Davis|first=Quincy Troupe|publisher=Macmillan|year=2012|isbn=9781447218371}}&lt;/ref&gt; Miles told ''[[Jet (magazine)|Jet]]'' magazine that the divorce was obtained on a &quot;temperament&quot; charge. He added, &quot;I'm just not the kind of cat to be married.&quot;&lt;ref&gt;{{Cite journal|date=March 12, 1970|title=Miles Davis Signs $300,000 Record Pact; Sheds Wife|url=https://books.google.com/books?id=TjkDAAAAMBAJ&amp;q=betty+davis+miles+davis+divorce&amp;pg=PA53|journal=Jet|volume=37|issue=24|pages=53}}&lt;/ref&gt; Hendrix and Miles remained close, planning to record, until Hendrix's death. The influence of Hendrix and especially Sly Stone on Miles Davis was obvious on the album ''[[Bitches Brew]]'' (1970), which ushered in the era of [[jazz fusion]]. 
It has been said that he wanted to call the album ''Witches Brew'' but Betty convinced him to change it.&lt;ref&gt;{{cite web|url=http://www.thedailymaverick.co.za/article/2010-09-07-madonna-before-madonna-the-woman-who-introduced-miles-to-hendrix-finally-speaks|title=Madonna before Madonna: The woman who introduced Miles to Hendrix finally speaks|publisher=Thedailymaverick.co.za|access-date=June 10, 2012|archive-url=https://web.archive.org/web/20100908075148/http://www.thedailymaverick.co.za/article/2010-09-07-madonna-before-madonna-the-woman-who-introduced-miles-to-hendrix-finally-speaks|archive-date=September 8, 2010|url-status=dead}}&lt;/ref&gt;<br /> <br /> Davis briefly dated musician [[Eric Clapton]], but she refused to collaborate with him.&lt;ref name=&quot;:1&quot;&gt;{{Cite web|url=https://www.esquire.com/entertainment/music/a3047/betty-davis-053107/|title=The Soul Singer in the Shadows|last=Dremousis|first=Lisa|date=May 31, 2007|website=[[Esquire (magazine)|Esquire]]}}&lt;/ref&gt;&lt;ref name=&quot;:2&quot;&gt;{{Cite web|url=http://www.dazeddigital.com/music/article/20269/1/nasty-gal-betty-davis|title=The singer, whose sexually potent 70s funk blueprint virtually created its own genre, talks about her personal soul revolution|last=Hundley|first=Jessica|date=June 15, 2014|website=[[Dazed]]}}&lt;/ref&gt;<br /> <br /> In 1975 Davis' lover [[Robert Palmer (singer)|Robert Palmer]] helped her secure a deal with [[Island Records]]. Shortly thereafter she released her album ''[[Nasty Gal (album)|Nasty Gal]]''.&lt;ref name=&quot;:2&quot; /&gt;<br /> <br /> Davis died from cancer at her home in [[Homestead, Pennsylvania]], on February 9, 2022, at the age of 77.&lt;ref name=nytobit/&gt;&lt;ref name=RS22&gt;{{cite magazine|last=Kreps|first=Daniel|title=Betty Davis, trailblazing queen of funk, dead at 77|url=https://www.rollingstone.com/music/music-news/betty-davis-dead-obit-1297372/|magazine=Rolling Stone|date=February 9, 2022}}&lt;/ref&gt;&lt;ref&gt;{{cite news|url = https://www.npr.org/2022/02/09/1079644021/betty-davis-funk-pioneer-dies|title = Betty Davis, funk pioneer and fashion icon, dies at 77|last = Limbong|first = Andrew|work = [[NPR]]|date = February 9, 2022|accessdate = February 11, 2022}}&lt;/ref&gt;<br /> <br /> ==Legacy==<br /> The live action/animated TV series ''[[Mike Judge Presents: Tales from the Tour Bus]]'' ended its 2018 season with an episode focusing on Davis' controversial career.&lt;ref&gt;{{cite web |url=http://www.cinemax.com/mike-judge-presents-tales-from-the-tour-bus/season-2/episode-8.html |url-status=dead |archive-url=https://web.archive.org/web/20200922053106/https://www.cinemax.com/mike-judge-presents-tales-from-the-tour-bus/season-2/episode-8.html |archive-date=September 22, 2020 |title=Mike Judge Presents: Tales From The Tour Bus Season 2 Episode 8 Betty Davis}}&lt;/ref&gt;<br /> <br /> Davis' music has been featured in television series including ''[[Orange Is the New Black]]'',&lt;ref&gt;{{Cite web|url=https://open.spotify.com/playlist/0Xh9GxO4HlVYewDTWcvPZK|title=Orange is the New Black Soundtrack, a playlist by 22sbbj4p6nqxnurgbrpg555oy on Spotify|website=Spotify}}&lt;/ref&gt; ''[[Girlboss (TV series)|Girlboss]]'',&lt;ref&gt;{{cite web |url=http://tidal.com/browse/playlist/3c5a2782-ad42-4fcb-a7bb-5b20d9b8cd12 |url-status=dead |archive-url=https://web.archive.org/web/20210521020328/https://tidal.com/browse/playlist/3c5a2782-ad42-4fcb-a7bb-5b20d9b8cd12 |archive-date=May 21, 2021 |title=Girlboss: The Soundtrack (Season One) on TIDAL}}&lt;/ref&gt; 
''[[Mixed-ish]]'',&lt;ref&gt;{{Cite web|url=https://open.spotify.com/playlist/5b5ad8oS24H58uF5oNHqBO|title=Mixed'ish (ABC) TV Soundtrack, a playlist by GrooveScene on Spotify|website=Spotify}}&lt;/ref&gt; ''[[High Fidelity (TV series)|High Fidelity]]''&lt;ref&gt;{{Cite web|url=https://www.popsugar.com/node/47232772|title=&quot;They Say I'm Different&quot; by Betty Davis|first=Amanda|last=Prahl|date=February 23, 2020|website=POPSUGAR Entertainment}}&lt;/ref&gt; and ''[[Pistol (miniseries)|Pistol]]''.&lt;ref&gt;{{cite web | url=https://open.spotify.com/playlist/4vBMNuf998v7QFebN8J5mZ | title=Pistol soundtrack | website=[[Spotify]] }}&lt;/ref&gt;<br /> <br /> == Discography ==<br /> === Studio albums ===<br /> {| class=&quot;wikitable&quot;<br /> |-<br /> !Year !! Album !! Label <br /> ![[Top R&amp;B/Hip-Hop Albums|US R&amp;B]]&lt;ref name=&quot;:4&quot; /&gt;<br /> ![[Kent Music Report|AUS]]&lt;ref name=aus&gt;{{cite book|last=Kent|first=David|author-link=David Kent (historian)|title=Australian Chart Book 1970–1992|edition=illustrated|publisher=Australian Chart Book|location=St Ives, N.S.W.|year=1993|isbn=0-646-11917-6|page=83}}&lt;/ref&gt;<br /> ! Notes<br /> |-<br /> |align=&quot;center&quot;| 1973 ||align=&quot;center&quot;| ''[[Betty Davis (album)|Betty Davis]]'' ||align=&quot;center&quot;| Just Sunshine &lt;br /&gt; Light in the Attic (2007 re-release) <br /> | ||align=&quot;center&quot;| - || align=&quot;center&quot; |Produced by [[Greg Errico]]<br /> |-<br /> |align=&quot;center&quot;| 1974 ||align=&quot;center&quot;| ''[[They Say I'm Different]]'' ||align=&quot;center&quot;| Just Sunshine &lt;br /&gt; Light in the Attic (2007 re-release) <br /> |align=&quot;center&quot;| 46 ||align=&quot;center&quot; | - || align=&quot;center&quot; |Produced by Betty Davis<br /> |-<br /> |align=&quot;center&quot;| 1975 ||align=&quot;center&quot;| ''[[Nasty Gal (album)|Nasty Gal]]'' ||align=&quot;center&quot;| Island &lt;br /&gt; Light in the Attic (2009 re-release) <br /> |align=&quot;center&quot;|54 || align=&quot;center&quot;|96|| align=&quot;center&quot; |Produced by Betty Davis<br /> |-<br /> |align=&quot;center&quot;| 2009 ||align=&quot;center&quot;| ''Is It Love or Desire?'' ||align=&quot;center&quot;| Light in the Attic <br /> | ||align=&quot;center&quot;| - || align=&quot;center&quot; |Recorded in 1976 and released in 2009<br /> |}<br /> <br /> === Singles ===<br /> {| class=&quot;wikitable&quot;<br /> |-<br /> !Year !! Single !! Label <br /> ![[Hot R&amp;B/Hip-Hop Songs|US R&amp;B]]&lt;ref name=&quot;:4&quot; /&gt;<br /> ! Notes<br /> |-<br /> |align=&quot;center&quot;| 1963? ||align=&quot;center&quot;| &quot;The Cellar&quot;/&quot;???&quot; ||align=&quot;center&quot;| Independent Release <br /> | || align=&quot;center&quot; |Produced by Lou Courtney<br /> |-<br /> |align=&quot;center&quot;| 1964 ||align=&quot;center&quot;| &quot;Get Ready for Betty&quot; / &quot;I'm Gonna Get My Baby Back&quot; ||align=&quot;center&quot;| DCP <br /> | || align=&quot;center&quot; |Produced by Don Costa<br /> |-<br /> |align=&quot;center&quot;| 1968 ||align=&quot;center&quot;| &quot;It's My Life&quot; / &quot;Live, Love, Learn&quot; ||align=&quot;center&quot;| Columbia <br /> | || align=&quot;center&quot; |Produced by Jerry Fuller<br /> |-<br /> |align=&quot;center&quot;| 1973 ||align=&quot;center&quot;| &quot;If I'm in Luck I Might Get Picked Up&quot; / &quot;Steppin in Her I. 
Miller Shoes&quot;||align=&quot;center&quot;| Just Sunshine <br /> |align=&quot;center&quot;|66|| align=&quot;center&quot; |Produced by Gregg Errico<br /> |-<br /> |align=&quot;center&quot;| 1973 ||align=&quot;center&quot;| &quot;Ooh Yea&quot; / &quot;In the Meantime&quot; ||align=&quot;center&quot;| Just Sunshine <br /> | || align=&quot;center&quot; | <br /> |-<br /> |align=&quot;center&quot;| 1974 ||align=&quot;center&quot;| &quot;Shoo-B-Doop and Cop Him&quot; / &quot;He Was a Big Freak&quot; ||align=&quot;center&quot;| Just Sunshine <br /> | || align=&quot;center&quot; | <br /> |-<br /> |align=&quot;center&quot;| 1974 ||align=&quot;center&quot;| &quot;Git in There&quot; /&quot;They Say I'm Different&quot; ||align=&quot;center&quot;| Just Sunshine <br /> | || align=&quot;center&quot; | <br /> |-<br /> |align=&quot;center&quot;| 1975 ||align=&quot;center&quot;| &quot;Shut Off the Light&quot; / &quot;He Was a Big Freak&quot; ||align=&quot;center&quot;| Island <br /> |align=&quot;center&quot; |97|| align=&quot;center&quot; | <br /> |}<br /> <br /> === Compilation ===<br /> {| class=&quot;wikitable&quot;<br /> |-<br /> !Year !! Album !! Label !! Notes<br /> |-<br /> |align=&quot;center&quot;| 2016 ||align=&quot;center&quot;| ''The Columbia Years 1968-69'' ||align=&quot;center&quot;| Light in the Attic ||align=&quot;center&quot;| Tracks recorded in 1968 – 1969 and released in 2016; &lt;br&gt;produced by Miles Davis &amp; Teo Macero&lt;ref name=&quot;pitchfork Nate Patrin 2016&quot;&gt;{{cite news |last1=Patrin |first1=Nate |title=The Columbia Years 1968-69 |url=https://pitchfork.com/reviews/albums/22111-the-columbia-years-1968-69/ |access-date=February 11, 2022 |work=Pitchfork |date=July 15, 2016}}&lt;/ref&gt;&lt;ref name=&quot;pitchfork Evan Minsker 2016&quot; /&gt;<br /> |}<br /> <br /> === Unofficial releases ===<br /> *''Crashin' from Passion'' (1995) (Razor &amp; Tie)&lt;ref&gt;{{Cite web|title=Crashin' from Passion - Betty Davis|url=https://www.allmusic.com/album/crashin-from-passion-mw0000181395|publisher=AllMusic|language=en|access-date=February 11, 2022}}&lt;/ref&gt; / ''Hangin' Out in Hollywood'' (1995) (Charly Records)&lt;ref&gt;{{Cite web|title=Hangin' Out in Hollywood - Betty Davis|url=https://www.allmusic.com/album/hangin-out-in-hollywood-mw0000083218|publisher=AllMusic|language=en|access-date=February 11, 2022}}&lt;/ref&gt; – Compilation of material recorded in 1979&lt;ref name=&quot;:5&quot; /&gt;<br /> &lt;!-- Albums below need to be checked if they were officially released or not --&gt;<br /> * ''Anti Love: The Best of Betty Davis'' (2001) (MPC limited) – Compilation&lt;ref&gt;{{Cite web|title=Anti Love: The Best of Betty Davis|url=https://www.allmusic.com/album/anti-love-the-best-of-betty-davis-mw0000459996|publisher=AllMusic|language=en|access-date=February 11, 2022}}&lt;/ref&gt;<br /> * ''This Is It! Anthology'' (2005) (Vampisoul) – Compilation&lt;ref&gt;{{Cite web|title=This Is It! 
- Betty Davis|url=https://www.allmusic.com/album/this-is-it!-mw0000387128|publisher=AllMusic|language=en|access-date=February 11, 2022}}&lt;/ref&gt;<br /> <br /> == References ==<br /> {{Reflist}}<br /> <br /> == Literature ==<br /> * Liner notes to [[Light in the Attic Records]]' 2007 re-issue of Betty Davis' self-titled 1973 debut album.<br /> <br /> ==External links==<br /> * {{AllMusic|class=artist|id=p16873|label=Betty Davis}}<br /> * [https://musicians.allaboutjazz.com/bettydavis Betty Davis] at [[AllAboutJazz]]<br /> * {{Discogs artist|Betty Davis}}<br /> * {{IMDb name|10140175}}<br /> * [http://www.maximumfun.org/blog/2007/06/podcast-tsoya-betty-davis.html ''The Sound of Young America''] – interview on Maximum Fun June 21, 2007 (her first radio interview in 30 years)<br /> * [https://www.nodepression.com/the-beautiful-dichotomy-of-betty-davis-a-rare-conversation-with-the-elusive-mistress-of-funk/ ''The Beautiful Dichotomy of Betty Davis: A Rare Conversation with the Elusive Mistress of Funk''] – interview on No Depression, February 2010, by J. Hayes<br /> * Neil Spencer, [https://www.theguardian.com/music/2010/sep/05/miles-davis-bitches-brew-reissue &quot;Miles Davis: The muse who changed him, and the heady Brew that rewrote jazz&quot;], ''[[The Guardian]]'', September 5, 2010 (including 2010 interview).<br /> *[https://www.newyorker.com/culture/culture-desk/the-artful-erotic-and-still-misunderstood-funk-of-betty-davis Emily Lordi, &quot;The Artful, Erotic and Still Misunderstood Funk of Betty Davis&quot;], ''New Yorker'', May 2, 2018<br /> <br /> {{Authority control}}<br /> <br /> {{DEFAULTSORT:Davis, Betty}}<br /> [[Category:1944 births]]<br /> [[Category:2022 deaths]]<br /> [[Category:20th-century African-American women singers]]<br /> [[Category:African-American rock musicians]]<br /> [[Category:African-American women singer-songwriters]]<br /> [[Category:American expatriates in the United Kingdom]]<br /> [[Category:American funk singers]]<br /> [[Category:American rhythm and blues singer-songwriters]]<br /> [[Category:American rock singers]]<br /> [[Category:American rock songwriters]]<br /> [[Category:American soul singers]]<br /> [[Category:Deaths from cancer in Pennsylvania]]<br /> [[Category:Fashion Institute of Technology alumni]]<br /> [[Category:Female models from North Carolina]]<br /> [[Category:Female models from Pittsburgh]]<br /> [[Category:Island Records artists]]<br /> [[Category:Miles Davis]]<br /> [[Category:Musicians from Durham, North Carolina]]<br /> [[Category:Singers from Pittsburgh]]<br /> [[Category:People from Homestead, Pennsylvania]]<br /> [[Category:Singer-songwriters from North Carolina]]<br /> [[Category:Singer-songwriters from Pennsylvania]]<br /> [[Category:Light in the Attic Records artists]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Talk:Quantum_algorithm&diff=1177056582 Talk:Quantum algorithm 2023-09-25T17:37:10Z <p>205.189.94.9: /* The use of the adverb &quot;probably&quot; */ new section</p> <hr /> <div>{{WikiProject Physics|class=C|importance=mid}}<br /> {{WikiProject Computer science|class=C|importance=high}}<br /> <br /> == Merging with the Quantum Algorithm Zoo? ==<br /> <br /> I've been in contact with Stephen Jordan of the [http://quantumalgorithmzoo.org/ Quantum Algorithm Zoo], and he is open to the possibility of merging the very extensive information available there into Wikipedia.<br /> <br /> What do you think? 
&lt;!-- Template:Unsigned --&gt;&lt;small class=&quot;autosigned&quot;&gt;—&amp;nbsp;Preceding [[Wikipedia:Signatures|unsigned]] comment added by [[User:Shai mach|Shai mach]] ([[User talk:Shai mach#top|talk]] • [[Special:Contributions/Shai mach|contribs]]) 00:35, 4 November 2019 (UTC)&lt;/small&gt; &lt;!--Autosigned by SineBot--&gt;<br /> <br /> : Typically one adds a section at the *bottom* of the talk page; adding it at the top may cause it to be lost.<br /> : Regarding the Quantum Algorithm Zoo, it looks like a great resource, yet it looks to me to give much more detail than is appropriate for a Wikipedia article. [[User:Sanpitch|Sanpitch]] ([[User talk:Sanpitch|talk]]) 23:56, 5 November 2019 (UTC)<br /> <br /> == Experimental Quantum Computing to Solve Systems of Linear Equations ==<br /> <br /> Maybe we should add something about this:<br /> [http://prl.aps.org/abstract/PRL/v110/i23/e230501 Experimental Quantum Computing to Solve Systems of Linear Equations]<br /> <br /> --[[User:Vitalij zad|Vitalij zad]] ([[User talk:Vitalij zad|talk]]) 14:39, 29 August 2013 (UTC)<br /> <br /> : It took six years, yet I just added something on this. [[Special:Contributions/107.0.94.194|107.0.94.194]] ([[User talk:107.0.94.194|talk]]) 21:45, 14 February 2019 (UTC)<br /> <br /> == Quantum / Classical 'equivalence'? ==<br /> <br /> &quot;All problems which can be solved on a quantum computer can be solved on a classical computer.&quot;<br /> <br /> Although the above may be true in some abstract mathematical sense, I don't think it is true in reality.<br /> <br /> David Deutsch describes, in 'The Fabric of Reality' some problems which could be rapidly solved by a quantum computer that could not be solved by any classical computer that could ever conceivably be constructed. (Put another way: classical computers could solve any problem a quantum computer could solve, it the universe didn't impose the constraints that it actually *does* impose.) At the moment, I think the lead into this article might suggest that quantum computers are a bit (or a *lot*) faster than classical computers - but really, it is more than that: quantum computers can actually solve, in the physical universe, problems that will never, ever be solved (in our actual, real universe - as opposed to an abstract, hypothetical one) by a classical computer. [[Special:Contributions/62.232.250.50|62.232.250.50]] ([[User talk:62.232.250.50|talk]]) 18:37, 10 January 2014 (UTC)<br /> <br /> == BQP-complete ==<br /> Perhaps the article should explain the BQP acronym (I think I can guess, but...) [[Special:Contributions/62.232.250.50|62.232.250.50]] ([[User talk:62.232.250.50|talk]]) 19:00, 10 January 2014 (UTC)<br /> :Yep, just added a description. [[User:Jaydavidmartin|Jaydavidmartin]] ([[User talk:Jaydavidmartin|talk]]) 22:58, 7 April 2020 (UTC)<br /> <br /> == &quot;Exponential speedup&quot; ==<br /> <br /> It is extremely common to use &quot;exponential speedup&quot; very loosely, and say that Shor's algorithm provides an exponential speedup over GNFS. However, given that GNFS already provides sub-exponential factoring (that is, faster than &lt;math&gt;c^n&lt;/math&gt; for any &lt;math&gt;c &gt; 0&lt;/math&gt;), we should probably change the last sentence in the leading paragraph which claims that Shor's algorithm provides an &quot;exponential speedup&quot;. Thoughts?<br /> <br /> : I changed the lede to say &quot;much (almost exponentially) faster&quot; rather than &quot;exponentially faster&quot;. 
[[User:Sanpitch|Sanpitch]] ([[User talk:Sanpitch|talk]]) 20:31, 19 March 2019 (UTC)<br /> <br /> == Quadratically faster than linear? ==<br /> <br /> The introduction says: ''Grover's algorithm runs quadratically faster than the best possible classical algorithm for the same task, a linear search.''<br /> <br /> What does that even mean? Surely it doesn't want to imply Grover's algorithm runs in &lt;math&gt;O \left(\frac{1}{N}\right)&lt;/math&gt;, making it faster the more there is to search in.<br /> [[Special:Contributions/62.216.5.216|62.216.5.216]] ([[User talk:62.216.5.216|talk]]) 15:28, 21 April 2021 (UTC)<br /> <br /> == Quanta Magazine article ==<br /> <br /> I just saw this: https://www.quantamagazine.org/quantum-algorithms-conquer-a-new-kind-of-problem-20220711/ [[User:Billymac00|Billymac00]] ([[User talk:Billymac00|talk]]) 21:51, 14 July 2022 (UTC)<br /> <br /> == The use of the adverb &quot;probably&quot; ==<br /> <br /> There are numerous cases of the use of the adverb &quot;probably&quot; in the article that are imprecise, inappropriate, and take away from the professionalism and encyclopediac tone of the article. I'd consider removing them. [[Special:Contributions/205.189.94.9|205.189.94.9]] ([[User talk:205.189.94.9|talk]]) 17:37, 25 September 2023 (UTC)</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Rob_Pearson&diff=1176606352 Rob Pearson 2023-09-22T20:35:05Z <p>205.189.94.9: GM Head Coach of Pickering Panthers; OJHL.</p> <hr /> <div>{{short description|Canadian ice hockey player}}<br /> {{Infobox ice hockey player<br /> | image = <br /> | image_size = <br /> | position = [[Winger (ice hockey)|Right Wing]]<br /> | played_for = [[Toronto Maple Leafs]] &lt;br&gt; [[Washington Capitals]] &lt;br&gt; [[St. Louis Blues]]<br /> | shoots = Right<br /> | height_ft = 6<br /> | height_in = 1<br /> | weight_lb = 198<br /> | birth_date = {{birth date and age|1971|8|3|mf=y}}<br /> | birth_place = [[Oshawa]], [[Ontario]], Canada<br /> | career_start = 1990<br /> | career_end = 2002<br /> | draft = 12th overall<br /> | draft_year = 1989<br /> | draft_team = [[Toronto Maple Leafs]]<br /> }}<br /> <br /> '''Robert Gordon Pearson''' (born August 3, 1971) is a [[Canadian]] former professional [[ice hockey]] [[winger (ice hockey)|right winger]] who played in the [[National Hockey League]] (NHL).&lt;ref&gt;{{cite web |url=http://www.hockeydb.com/ihdb/stats/pdisplay.php?pid=4204 |title = Rob Pearson (b.1971) Hockey Stats and Profile at hockeydb.com}}&lt;/ref&gt;, and is now [[general manager]] and head [[coach]] of [[Pickering Panthers]] of the [[Ontario Junior Hockey League]]. <br /> <br /> ==Biography==<br /> Pearson was born in [[Oshawa]], [[Ontario]]. As a youth, he played in the 1984 [[Quebec International Pee-Wee Hockey Tournament]] with a [[minor ice hockey]] team from Oshawa.&lt;ref&gt;{{cite web|url=https://www.publicationsports.com/ressources/files/439/Joueurs_Pro.pdf|title=Pee-Wee players who have reached NHL or WHA|year=2018|website=Quebec International Pee-Wee Hockey Tournament|access-date=2019-01-20}}&lt;/ref&gt;<br /> <br /> Pearson was drafted 12th overall by the [[Toronto Maple Leafs]] in the [[1989 NHL Entry Draft]]. He played in 269 career NHL games, scoring 56 goals and 54 assists for 110 points. Pearson also played for the [[St. 
Louis Blues]] and the [[Washington Capitals]].{{cn|date=January 2019}}<br /> <br /> On April 24, 2006, Pearson signed on as an assistant coach for the [[University of Ontario Institute of Technology Ridgebacks]] in Oshawa, Ontario,&lt;ref&gt;{{cite journal |title=FORMER NHLER ROB PEARSON JOINS UOIT HOCKEY STAFF |url=http://oua.ca/sports/mice/2006-07/releases/180.html |website=oua.ca |accessdate=July 2, 2018 |date=April 24, 2006}}&lt;/ref&gt; but later resigned from that position. In 2017, he was named the new head coach of [[Whitby Fury]].&lt;ref&gt;{{cite web |title=Rob Pearson new head coach of Whitby Fury |url=https://www.durhamregion.com/sports-story/7201727-rob-pearson-new-head-coach-of-whitby-fury/ |website=durhamregion.com |accessdate=July 2, 2018 |date=March 21, 2017}}&lt;/ref&gt;<br /> <br /> ==Career statistics==<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; style=&quot;text-align:center; width:60em&quot;<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; bgcolor=&quot;#ffffff&quot; | &amp;nbsp;<br /> ! rowspan=&quot;100&quot; bgcolor=&quot;#ffffff&quot; | &amp;nbsp;<br /> ! colspan=&quot;5&quot; | [[Regular season|Regular&amp;nbsp;season]]<br /> ! rowspan=&quot;100&quot; bgcolor=&quot;#ffffff&quot; | &amp;nbsp;<br /> ! colspan=&quot;5&quot; | [[Playoffs]]<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! [[Season (sports)|Season]]<br /> ! Team<br /> ! League<br /> ! GP<br /> ! [[Goal (ice hockey)|G]]<br /> ! [[Assist (ice hockey)|A]]<br /> ! [[Point (ice hockey)|Pts]]<br /> ! [[Penalty (ice hockey)|PIM]]<br /> ! GP<br /> ! G<br /> ! A<br /> ! Pts<br /> ! PIM<br /> |-<br /> | 1987–88<br /> | Oshawa Kiwanis AAA<br /> | [[Eastern AAA Hockey League|Midget]]<br /> | 72<br /> | 68<br /> | 65<br /> | 133<br /> | 188<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1988–89 OHL season|1988–89]]<br /> | [[Belleville Bulls]]<br /> | [[Ontario Hockey League|OHL]]<br /> | 26<br /> | 8<br /> | 12<br /> | 20<br /> | 51<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |-<br /> | [[1989–90 OHL season|1989–90]]<br /> | Belleville Bulls<br /> | OHL<br /> | 58<br /> | 48<br /> | 40<br /> | 88<br /> | 174<br /> | 11<br /> | 5<br /> | 5<br /> | 10<br /> | 26<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1990–91 OHL season|1990–91]]<br /> | Belleville Bulls<br /> | OHL<br /> | 10<br /> | 6<br /> | 3<br /> | 9<br /> | 27<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |-<br /> | 1990–91 <br /> | [[Oshawa Generals]]<br /> | OHL<br /> | 41<br /> | 57<br /> | 52<br /> | 109<br /> | 76<br /> | 16<br /> | 16<br /> | 17<br /> | 33<br /> | 39<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1990–91 AHL season|1990–91]]<br /> | [[Newmarket Saints]]<br /> | [[American Hockey League|AHL]]<br /> | 3<br /> | 0<br /> | 0<br /> | 0<br /> | 29<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |-<br /> | [[1991–92 NHL season|1991–92]]<br /> | [[Toronto Maple Leafs]]<br /> | [[National Hockey League|NHL]]<br /> | 47<br /> | 14<br /> | 10<br /> | 24<br /> | 58<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1991–92 AHL season|1991–92]]<br /> | [[St. 
John's Maple Leafs]]<br /> | AHL<br /> | 27<br /> | 15<br /> | 14<br /> | 29<br /> | 107<br /> | 13<br /> | 5<br /> | 4<br /> | 9<br /> | 40<br /> |-<br /> | [[1992–93 NHL season|1992–93]]<br /> | Toronto Maple Leafs<br /> | NHL<br /> | 78<br /> | 23<br /> | 14<br /> | 37<br /> | 211<br /> | 14<br /> | 2<br /> | 2<br /> | 4<br /> | 31<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1993–94 NHL season|1993–94]]<br /> | Toronto Maple Leafs<br /> | NHL<br /> | 67<br /> | 12<br /> | 18<br /> | 30<br /> | 189<br /> | 14<br /> | 1<br /> | 0<br /> | 1<br /> | 32<br /> |-<br /> | [[1994–95 NHL season|1994–95]]<br /> | [[Washington Capitals]]<br /> | NHL<br /> | 32<br /> | 0<br /> | 6<br /> | 6<br /> | 96<br /> | 3<br /> | 1<br /> | 0<br /> | 1<br /> | 17<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1995–96 AHL season|1995–96]]<br /> | [[Portland Pirates]]<br /> | AHL<br /> | 44<br /> | 18<br /> | 24<br /> | 42<br /> | 143<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |-<br /> | [[1995–96 NHL season|1995–96]]<br /> | [[St. Louis Blues]]<br /> | NHL<br /> | 27<br /> | 6<br /> | 4<br /> | 10<br /> | 54<br /> | 2<br /> | 0<br /> | 0<br /> | 0<br /> | 14<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1996–97 NHL season|1996–97]]<br /> | St. Louis Blues<br /> | NHL<br /> | 18<br /> | 1<br /> | 2<br /> | 3<br /> | 37<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |-<br /> | [[1996–97 AHL season|1996–97]]<br /> | [[Worcester IceCats]]<br /> | AHL<br /> | 46<br /> | 11<br /> | 16<br /> | 27<br /> | 199<br /> | 5<br /> | 3<br /> | 0<br /> | 3<br /> | 16<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1997–98 IHL season|1997–98]]<br /> | [[Cleveland Lumberjacks]]<br /> | [[International Hockey League (1945–2001)|IHL]]<br /> | 46<br /> | 17<br /> | 14<br /> | 31<br /> | 118<br /> | 10<br /> | 6<br /> | 4<br /> | 10<br /> | 43<br /> |-<br /> | [[1998–99 IHL season|1998–99]]<br /> | Cleveland Lumberjacks<br /> | IHL<br /> | 20<br /> | 3<br /> | 10<br /> | 13<br /> | 27<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | 1998–99<br /> | [[Orlando Solar Bears (IHL)|Orlando Solar Bears]]<br /> | IHL<br /> | 11<br /> | 6<br /> | 2<br /> | 8<br /> | 41<br /> | 17<br /> | 8<br /> | 6<br /> | 14<br /> | 24<br /> |-<br /> | [[1999–2000 IHL season|1999–2000]]<br /> | [[Long Beach Ice Dogs]]<br /> | IHL<br /> | 60<br /> | 17<br /> | 23<br /> | 40<br /> | 145<br /> | 4<br /> | 0<br /> | 0<br /> | 0<br /> | 8<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[2001–02 DEL season|2001–02]]<br /> | [[Frankfurt Lions]]<br /> | [[Deutsche Eishockey Liga|DEL]]<br /> | 33<br /> | 5<br /> | 16<br /> | 21<br /> | 125<br /> | —<br /> | —<br /> | —<br /> | —<br /> | —<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; | AHL totals<br /> ! 120<br /> ! 44<br /> ! 54<br /> ! 98<br /> ! 478<br /> ! 18<br /> ! 8<br /> ! 4<br /> ! 12<br /> ! 56<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; | NHL totals<br /> ! 269<br /> ! 56<br /> ! 54<br /> ! 110<br /> ! 645<br /> ! 33<br /> ! 4<br /> ! 2<br /> ! 6<br /> ! 
94<br /> |}<br /> <br /> ==References==<br /> {{Reflist}}<br /> <br /> ==External links==<br /> *[http://www.uoitridgebacks.com/news/april21-06.htm UOIT Athletics Media Release]<br /> * {{Ice hockey stats}}<br /> <br /> {{s-start}}<br /> {{succession box | before = [[Scott Thornton (ice hockey)|Scott Thornton]] | title = [[List of Toronto Maple Leafs draft picks|Toronto Maple Leafs first round draft pick]] | years = [[1989 NHL Entry Draft|1989]] | after = [[Steve Bancroft]]}}<br /> {{s-end}}<br /> <br /> {{DEFAULTSORT:Pearson, Rob}}<br /> [[Category:1971 births]]<br /> [[Category:Living people]]<br /> [[Category:Belleville Bulls players]]<br /> [[Category:Canadian ice hockey right wingers]]<br /> [[Category:Cleveland Lumberjacks players]]<br /> [[Category:Frankfurt Lions players]]<br /> [[Category:Ice hockey people from Oshawa]]<br /> [[Category:Long Beach Ice Dogs (IHL) players]]<br /> [[Category:National Hockey League first-round draft picks]]<br /> [[Category:Newmarket Saints players]]<br /> [[Category:Orlando Solar Bears (IHL) players]]<br /> [[Category:Oshawa Generals players]]<br /> [[Category:Portland Pirates players]]<br /> [[Category:St. John's Maple Leafs players]]<br /> [[Category:St. Louis Blues players]]<br /> [[Category:Toronto Maple Leafs players]]<br /> [[Category:Toronto Maple Leafs draft picks]]<br /> [[Category:Washington Capitals players]]<br /> [[Category:Worcester IceCats players]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Relativity_of_simultaneity&diff=1174315684 Relativity of simultaneity 2023-09-07T17:50:53Z <p>205.189.94.9: 1=1+1</p> <hr /> <div>{{short description|Concept that distant simultaneity is not absolute, but depends on the observer's reference frame}}<br /> {{Lead too short|date=October 2022}}<br /> [[File:RoundTripToVega.gif|right|thumb|300px|On spaceships, map-clocks may look unsynchronized.]]<br /> [[File:Relativity of Simultaneity.svg|thumb|Event B is simultaneous with A in the green reference frame, but it occurred before in the blue frame, and will occur later in the red frame.]]<br /> [[File:Relativity of Simultaneity Animation.gif|thumb|Events A, B, and C occur in different order depending on the motion of the observer. The white line represents a plane of simultaneity being moved from the past to the future.]]<br /> In [[physics]], the '''relativity of simultaneity''' is the concept that ''distant [[wikt:simultaneity|simultaneity]]''&amp;nbsp;– whether two spatially separated events occur at the same [[Time in physics|time]]&amp;nbsp;– is not [[absolute time and space|absolute]], but depends on the [[observer's reference frame]]. This possibility was raised by mathematician [[Henri Poincaré]] in 1900, and thereafter became a central idea in the [[special theory of relativity]].<br /> <br /> ==Description==<br /> According to the [[Special relativity|special theory of relativity introduced by Albert Einstein]], it is impossible to say in an ''absolute'' sense that two distinct [[event (relativity)|events]] occur at the same time if those events are separated in space.
If one reference frame assigns precisely the same time to two events that are at different points in space, a reference frame that is moving relative to the first will generally assign different times to the two events (the only exception being when motion is exactly perpendicular to the line connecting the locations of both events).<br /> <br /> For example, a car crash in London and another in New York that appear to happen at the same time to an observer on Earth will appear to have occurred at slightly different times to an observer on an airplane flying between London and New York. Furthermore, if the two events cannot be causally connected, then depending on the state of motion, the crash in London may appear to occur first in a given frame, and the New York crash may appear to occur first in another. However, if the events are causally connected, precedence order is preserved in all frames of reference.&lt;ref&gt;{{Citation|author=Mamone-Capria, Marco|title=Simultaneity as an invariant equivalence relation|year=2012|journal=[[Foundations of Physics]]|volume=42|issue=11 |pages=1365–1383|url=https://link.springer.com/article/10.1007/s10701-012-9674-4|doi=10.1007/s10701-012-9674-4|arxiv=1202.6578 |bibcode=2012FoPh...42.1365M |s2cid=254513121 }}&lt;/ref&gt;<br /> <br /> ==History==<br /> {{Main|History of special relativity|History of Lorentz transformations|Lorentz ether theory}}<br /> <br /> In 1892 and 1895, [[Hendrik Lorentz]] used a mathematical method called &quot;local time&quot; ''t' = t – v x/c''&lt;sup&gt;2&lt;/sup&gt; to explain the negative [[luminiferous aether|aether]] drift experiments.&lt;ref&gt;{{Citation|author=Lorentz, Hendrik Antoon|year=1895|title=Versuch einer Theorie der electrischen und optischen Erscheinungen in bewegten Körpern|place=Leiden|publisher=E.J. Brill|title-link=s:de:Versuch einer Theorie der electrischen und optischen Erscheinungen in bewegten Körpern}}&lt;/ref&gt; However, Lorentz gave no physical explanation of this effect. This was done by [[Henri Poincaré]], who had already emphasized in 1898 the conventional nature of simultaneity and who argued that it is convenient to postulate the constancy of the speed of light in all directions. However, that paper did not contain any discussion of Lorentz's theory or the possible difference in defining simultaneity for observers in different states of motion.&lt;ref&gt;{{Citation|author=Poincaré, Henri|year=1898–1913|title=The foundations of science|chapter=[[s:The Measure of Time|The Measure of Time]]|place=New York|publisher=Science Press|pages=222–234}}&lt;/ref&gt;&lt;ref&gt;{{Citation|author=Galison, Peter|year=2003|title= Einstein's Clocks, Poincaré's Maps: Empires of Time|place=New York|publisher=W.W. Norton|isbn=0-393-32604-7}}&lt;/ref&gt;<br /> This was done in 1900, when Poincaré derived local time by assuming that the speed of light is invariant within the aether. Due to the &quot;principle of relative motion&quot;, moving observers within the aether also assume that they are at rest and that the speed of light is constant in all directions (only to first order in ''v/c''). Therefore, if they synchronize their clocks by using light signals, they will only consider the transit time for the signals, but not their motion with respect to the aether. So the moving clocks are not synchronous and do not indicate the &quot;true&quot; time.
Poincaré calculated that this synchronization error corresponds to Lorentz's local time.&lt;ref&gt;{{Citation|author=Poincaré, Henri|year=1900|title=La théorie de Lorentz et le principe de réaction|journal=Archives Néerlandaises des Sciences Exactes et Naturelles|volume=5|pages=252–278|title-link=s:fr:La théorie de Lorentz et le principe de réaction}}. See also the [http://www.physicsinsights.org/poincare-1900.pdf English translation].&lt;/ref&gt;&lt;ref&gt;{{Citation|author=Darrigol, Olivier|title=The Genesis of the theory of relativity|year=2005|journal=Séminaire Poincaré|volume=1|pages=1–22|url=http://www.bourbaphy.fr/darrigol2.pdf|doi=10.1007/3-7643-7436-5_1|isbn=978-3-7643-7435-8|bibcode=2006eins.book....1D}}&lt;/ref&gt;<br /> In 1904, Poincaré emphasized the connection between the principle of relativity, &quot;local time&quot;, and light speed invariance; however, the reasoning in that paper was presented in a qualitative and conjectural manner.&lt;ref&gt;{{Citation|author=Poincaré, Henri|year=1904–1906|chapter=[[s:The Principles of Mathematical Physics|The Principles of Mathematical Physics]]|title=Congress of arts and science, universal exposition, St. Louis, 1904|volume=1|pages=604–622|publisher=Houghton, Mifflin and Company|place=Boston and New York}}&lt;/ref&gt;&lt;ref&gt;{{Citation|author=Holton, Gerald|year=1988|title= [[Thematic Origins of Scientific Thought: Kepler to Einstein]]|publisher=Harvard University Press|isbn=0-674-87747-0}}&lt;/ref&gt;<br /> <br /> [[Albert Einstein]] used a similar method in 1905 to derive the time transformation for all orders in ''v/c'', i.e., the complete Lorentz transformation. Poincaré obtained the full transformation earlier in 1905 but in the papers of that year he did not mention his synchronization procedure. This derivation was completely based on light speed invariance and the relativity principle, so Einstein noted that for the electrodynamics of moving bodies the aether is superfluous. Thus, the separation into &quot;true&quot; and &quot;local&quot; times of Lorentz and Poincaré vanishes&amp;nbsp;– all times are equally valid and therefore the relativity of length and time is a natural consequence.&lt;ref&gt;{{Citation|author=Einstein, Albert|year=1905|title=Zur Elektrodynamik bewegter Körper|journal=Annalen der Physik|volume=322|issue=10|pages=891–921|url=http://www.physik.uni-augsburg.de/annalen/history/einstein-papers/1905_17_891-921.pdf<br /> |doi=10.1002/andp.19053221004|bibcode = 1905AnP...322..891E |doi-access=free}}. See also: [http://www.fourmilab.ch/etexts/einstein/specrel/ English translation].&lt;/ref&gt;&lt;ref&gt;{{Citation|author=Miller, Arthur I.|year=1981|title=Albert Einstein's special theory of relativity. 
Emergence (1905) and early interpretation (1905–1911)|place=Reading|publisher=Addison–Wesley|isbn=0-201-04679-2|url-access=registration|url=https://archive.org/details/alberteinsteinss0000mill}}&lt;/ref&gt;&lt;ref&gt;{{Citation|author=Pais, Abraham|year=1982|title= [[Subtle is the Lord: The Science and the Life of Albert Einstein]]|place = New York|publisher=Oxford University Press|isbn=0-19-520438-7}}&lt;/ref&gt;<br /> <br /> In 1908, [[Hermann Minkowski]] introduced the concept of a [[world line]] of a particle&lt;ref&gt;{{Citation|author=Minkowski, Hermann|year=1909|title=Raum und Zeit|journal=Physikalische Zeitschrift|volume=10|pages=75–88|title-link=s:de:Raum und Zeit (Minkowski)}}<br /> *Various English translations on Wikisource: [[s:Space and Time|Space and Time]]&lt;/ref&gt; in his model of the cosmos called [[Minkowski space]]. In Minkowski's view, the naïve notion of [[velocity]] is replaced with [[rapidity]], and the ordinary sense of simultaneity becomes dependent on [[hyperbolic orthogonality]] of spatial directions to the worldline associated with the rapidity. Then every [[inertial frame of reference]] has a rapidity and a [[World line#Simultaneous hyperplane|simultaneous hyperplane]].<br /> <br /> In 1990 [[Robert Goldblatt]] wrote ''Orthogonality and Spacetime Geometry'', directly addressing the structure Minkowski had put in place for simultaneity.&lt;ref&gt;A.D. Taimanov (1989) &quot;Review of ''Orthogonality and Spacetime Geometry''&quot;, [[Bulletin of the American Mathematical Society]] 21(1)&lt;/ref&gt; In 2006 [[Max Jammer]], through [[Project MUSE]], published ''Concepts of Simultaneity: from antiquity to Einstein and beyond''. The book culminates in chapter 6, &quot;The transition to the relativistic conception of simultaneity&quot;. Jammer indicates that [[Ernst Mach]] demythologized the absolute time of Newtonian physics.<br /> <br /> Naturally, the mathematical notions preceded their physical interpretation. For instance, [[conjugate diameters]] of a hyperbola are related as space and time. The [[principle of relativity]] can be expressed as the arbitrariness of which pair are taken to represent space and time in a plane.&lt;ref&gt;{{Cite book | author=Whittaker, E.T. | authorlink=E. T. Whittaker | year=1910 | edition=1 | title=[[A History of the Theories of Aether and Electricity]] | page=[https://archive.org/details/historyoftheorie00whitrich/page/441 441] | location=Dublin | publisher=Longman, Green and Co.}}&lt;/ref&gt;<br /> <br /> ==Thought experiments==<br /> {{see also|Einstein's thought experiments}}<br /> <br /> ===Einstein's train===<br /> [[File:Einstein train relativity of simultaneity.png|right|thumb|250px|Einstein imagined a stationary observer who witnessed two lightning bolts simultaneously striking both ends of a moving train. He concluded that an observer standing on the train would measure the bolts to strike at different times.]]<br /> <br /> Einstein's version of the experiment&lt;ref name=&quot;Einsteins_train&quot; /&gt; presumed that one observer was sitting midway inside a speeding traincar and another was standing on a platform as the train moved past. As measured by the standing observer, the train is struck by two bolts of lightning simultaneously, but at different positions along the axis of train movement (back and front of the train car).
In the inertial frame of the standing observer, there are three events which are spatially separated but simultaneous: the standing observer facing the moving observer (i.e., the center of the train), lightning striking the front of the train car, and lightning striking the back of the car.<br /> <br /> Since the events are placed along the axis of train movement, their time coordinates become projected to different time coordinates in the moving train's inertial frame. Events located at space coordinates in the direction of train movement happen ''earlier'' than events at coordinates opposite to the direction of train movement. In the moving train's inertial frame, this means that lightning will strike the front of the train car ''before'' the two observers align (face each other).<br /> <br /> ===The train-and-platform===<br /> [[File:Traincar Relativity1.svg|right|thumb|The train-and-platform experiment from the reference frame of an observer on board the train]]<br /> [[File:Traincar Relativity2.svg|right|thumb|Reference frame of an observer standing on the platform (length contraction not depicted)]]<br /> <br /> A popular picture for understanding this idea is provided by a thought experiment similar to those suggested by [[Daniel Frost Comstock]] in 1910&lt;ref&gt;The thought experiment by Comstock described two platforms in relative motion. See: {{Citation | author=Comstock, D.F. | year=1910 | title= The principle of relativity | journal=Science | volume =31 | pages =767–772 | doi=10.1126/science.31.803.767 | pmid=17758464 | issue=803|bibcode = 1910Sci....31..767C | title-link=s:The Principle of Relativity (Comstock) | s2cid=33246058 }}.&lt;/ref&gt; and Einstein in 1917.&lt;ref&gt;Einstein's thought experiment used two light rays starting at both ends of the platform. See: {{Citation<br /> |author=Einstein A.|year=1917|title=Relativity: The Special and General Theory|publisher=Springer|title-link=s:Relativity: The Special and General Theory}}&lt;/ref&gt;&lt;ref name=&quot;Einsteins_train&quot;&gt;{{Citation<br /> |title=Relativity - The Special and General Theory<br /> |first1=Albert<br /> |last1=Einstein<br /> |publisher=Samaira Book Publishers<br /> |year=2017<br /> |isbn=978-81-935401-7-6<br /> |pages=30–33<br /> |url=https://books.google.com/books?id=DrA8DwAAQBAJ}}, [https://books.google.com/books?id=DrA8DwAAQBAJ&amp;pg=PT46 Chapter IX]<br /> &lt;/ref&gt; It also consists of one observer midway inside a speeding traincar and another observer standing on a platform as the train moves past. <br /> <br /> A flash of light is given off at the center of the traincar just as the two observers pass each other. For the observer on board the train, the front and back of the traincar are at fixed distances from the light source and as such, according to this observer, the light will reach the front and back of the traincar at the same time.<br /> <br /> For the observer standing on the platform, on the other hand, the rear of the traincar is moving (catching up) toward the point at which the flash was given off, and the front of the traincar is moving away from it. As the speed of light is finite and the same in all directions for all observers, the light headed for the back of the train will have less distance to cover than the light headed for the front.
Thus, the flashes of light will strike the ends of the traincar at different times.<br /> <br /> [[File:TrainAndPlatformDiagram1.svg|thumb|250px|right|The spacetime diagram in the frame of the observer on the train.]]<br /> [[File:TrainAndPlatformDiagram2.svg|thumb|250px|right|The same diagram in the frame of an observer who sees the train moving to the right.]]<br /> <br /> ====Spacetime diagrams====<br /> It may be helpful to visualize this situation using [[spacetime diagram]]s. For a given observer, the ''t''-axis is defined to be a point traced out in time by the origin of the spatial coordinate ''x'', and is drawn vertically. The ''x''-axis is defined as the set of all points in space at the time ''t'' = 0, and is drawn horizontally. The statement that the speed of light is the same for all observers is represented by drawing a light ray as a 45° line, regardless of the speed of the source relative to the speed of the observer.<br /> <br /> In the first diagram, the two ends of the train are drawn as grey lines. Because the ends of the train are stationary with respect to the observer on the train, these lines are just vertical lines, showing their motion through time but not space. The flash of light is shown as the 45° red lines. The points at which the two light flashes hit the ends of the train are at the same level in the diagram. This means that the events are simultaneous.<br /> <br /> In the second diagram, the two ends of the train, now moving to the right, are shown by parallel lines. The flash of light is given off at a point exactly halfway between the two ends of the train, and its rays again form two 45° lines, expressing the constancy of the speed of light. In this picture, however, the points at which the light flashes hit the ends of the train are ''not'' at the same level; they are ''not'' simultaneous.<br /> <br /> ==Lorentz transformation==<br /> The relativity of simultaneity can be demonstrated using the [[Lorentz transformation]], which relates the coordinates used by one observer to coordinates used by another in uniform relative motion with respect to the first.<br /> <br /> Assume that the first observer uses coordinates labeled ''t'', ''x'', ''y'', and ''z'', while the second observer uses coordinates labeled ''t&amp;prime;'', ''x&amp;prime;'', ''y&amp;prime;'', and ''z&amp;prime;''. Now suppose that the first observer sees the second observer moving in the ''x''-direction at a velocity ''v'', and that the observers' coordinate axes are parallel and have the same origin. Then the Lorentz transformation expresses how the coordinates are related:<br /> &lt;math display=&quot;block&quot;&gt; t' = \frac{t - {v\,x/c^2}}{\sqrt{1-v^2/c^2}}\, ,&lt;/math&gt;<br /> &lt;math display=&quot;block&quot;&gt; x' = \frac{x - v \, t }{\sqrt{1-v^2/c^2}}\, ,&lt;/math&gt;<br /> &lt;math display=&quot;block&quot;&gt; y' = y\, ,&lt;/math&gt;<br /> &lt;math display=&quot;block&quot;&gt; z' = z\, ,&lt;/math&gt;<br /> where ''c'' is the [[speed of light]]. If two events happen at the same time in the frame of the first observer, they will have identical values of the ''t''-coordinate. However, if they have different values of the ''x''-coordinate (different positions in the ''x''-direction), they will have different values of the ''t&amp;prime;'' coordinate, so they will happen at different times in that frame.
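To see the size of the effect numerically, the following short sketch (a minimal, illustrative Python example; the helper name ''lorentz_time'', the relative velocity ''v'' = 0.25''c'', and the one-light-second separation are assumptions chosen purely for illustration, not values taken from a source) evaluates the time transformation above for two events that share ''t'' = 0 but lie at different ''x''-coordinates:<br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;
import math

C = 299_792_458.0  # speed of light in metres per second

def lorentz_time(t, x, v):
    # Time coordinate of the event (t, x) in a frame moving at velocity v along +x
    # (illustrative helper, not a standard-library function).
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return gamma * (t - v * x / C ** 2)

v = 0.25 * C                      # assumed relative velocity of the second observer
x_a, x_b = 0.0, C                 # event A at the origin, event B one light-second away in x
t_a = lorentz_time(0.0, x_a, v)   # 0.0 s
t_b = lorentz_time(0.0, x_b, v)   # about -0.258 s

print(t_a, t_b)
&lt;/syntaxhighlight&gt;<br /> Although both events have ''t'' = 0 for the first observer, the moving observer assigns the distant event a time of roughly −0.26 s, so for that observer the two events are not simultaneous.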
The ''vx''/''c''&lt;sup&gt;2&lt;/sup&gt; term in the transformation for ''t&amp;prime;'' is what accounts for this failure of absolute simultaneity.<br /> [[File:Simultaneity Lines.svg|thumb|250px|right|A spacetime diagram showing the set of points regarded as simultaneous by a stationary observer (horizontal dotted line) and the set of points regarded as simultaneous by an observer moving at v = 0.25c (dashed line)]]<br /> <br /> The equation ''t&amp;prime;'' = constant defines a &quot;line of simultaneity&quot; in the (''x&amp;prime;'', ''t&amp;prime;'') coordinate system for the second (moving) observer, just as the equation ''t'' = constant defines the &quot;line of simultaneity&quot; for the first (stationary) observer in the (''x'', ''t'') coordinate system. From the above equations for the Lorentz transform it can be seen that ''t&amp;prime;'' is constant if and only if ''t'' − ''vx''/''c''&lt;sup&gt;2&lt;/sup&gt; = constant. Thus the set of points that makes ''t'' constant is different from the set of points that makes ''t&amp;prime;'' constant. That is, the set of events which are regarded as simultaneous depends on the frame of reference used to make the comparison.<br /> <br /> Graphically, this can be represented on a spacetime diagram by the fact that a plot of the set of points regarded as simultaneous generates a line which depends on the observer. In the spacetime diagram, the dashed line represents a set of points considered to be simultaneous with the origin by an observer moving with a velocity ''v'' of one-quarter of the speed of light. The dotted horizontal line represents the set of points regarded as simultaneous with the origin by a stationary observer. This diagram is drawn using the (''x'', ''t'') coordinates of the stationary observer, and is scaled so that the speed of light is one, i.e., so that a ray of light would be represented by a line with a 45° angle from the ''x'' axis. From our previous analysis, given that ''v'' = 0.25 and ''c'' = 1, the equation of the dashed line of simultaneity is ''t'' − 0.25''x'' = 0, and with ''v'' = 0, the equation of the dotted line of simultaneity is ''t'' = 0.<br /> <br /> In general, the second observer traces out a [[worldline]] in the spacetime of the first observer described by ''t'' = ''x''/''v'', and the set of simultaneous events for the second observer (at the origin) is described by the line ''t'' = ''vx''. Note the [[multiplicative inverse]] relation of the [[slope]]s of the worldline and simultaneous events, in accord with the principle of [[hyperbolic orthogonality]].<br /> <br /> == Accelerated observers ==<br /> <br /> [[File:TwentyFiveZones.png|right|thumb|250px|Roundtrip radar-time isocontours.]]<br /> <br /> The Lorentz-transform calculation above uses a definition of extended-simultaneity (i.e. of when and where events occur ''at which you were not present'') that might be referred to as the co-moving or &quot;tangent free-float-frame&quot; definition.
This definition is naturally extrapolated to events in gravitationally-curved spacetimes, and to accelerated observers, through use of a radar-time/distance definition that (unlike the tangent free-float-frame definition for accelerated frames) assigns a unique time and position to any event.&lt;ref name=&quot;Dolby2001&quot;&gt;{{cite journal|last1=Dolby|first1=Carl E.|last2=Gull|first2=Stephen F.|title=On radar time and the twin &quot;paradox&quot;|journal=American Journal of Physics|date=December 2001|volume=69|issue=12|pages=1257–1261|doi=10.1119/1.1407254|arxiv=gr-qc/0104077|bibcode=2001AmJPh..69.1257D|s2cid=119067219}}&lt;/ref&gt;<br /> <br /> The radar-time definition of extended-simultaneity further facilitates visualization of the way that acceleration curves spacetime for travelers in the absence of any gravitating objects. This is illustrated in the figure at right, which shows radar time/position isocontours for events in flat spacetime as experienced by a traveler (red trajectory) taking a constant [[proper acceleration|proper-acceleration]] roundtrip. One caveat of this approach is that the time and place of remote events are not fully defined until light from such an event is able to reach our traveler.<br /> <br /> ==See also==<br /> *[[Andromeda paradox]]<br /> *[[Causal structure]]<br /> *[[Einstein's thought experiments]]<br /> *[[Ehrenfest's paradox]]<br /> *[[Einstein synchronisation]]<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==External links==<br /> * {{Wikibooks-inline|Special relativity}}<br /> <br /> {{Relativity}}<br /> <br /> [[Category:Special relativity]]<br /> [[Category:History of physics]]<br /> [[Category:Thought experiments in physics]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Wikipedia_talk:Example_of_a_broken_redirect&diff=1174162664 Wikipedia talk:Example of a broken redirect 2023-09-06T18:37:56Z <p>205.189.94.9: /* Protected edit request on 6 September 2023 */ new section</p> <hr /> <div>{{talk page header}}<br /> <br /> == Protected edit request on 6 September 2023 ==<br /> <br /> {{edit fully-protected|Wikipedia:Example of a broken redirect|answered=no}}<br /> The entry under the list of corporate entities for shell corporation should read &quot;shell corporation&quot;, not &quot;shelf corporation&quot;.
[[Special:Contributions/205.189.94.9|205.189.94.9]] ([[User talk:205.189.94.9|talk]]) 18:37, 6 September 2023 (UTC)</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Mongol_(film)&diff=1173482406 Mongol (film) 2023-09-02T18:29:22Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|2007 historical epic film}}<br /> {{Other uses|Genghis Khan (disambiguation)}}<br /> {{Use dmy dates|date=April 2015}}<br /> {{Infobox film<br /> | name = Mongol<br /> | image = Mongol poster.jpg<br /> | caption = American theatrical release poster<br /> | director = [[Sergei Bodrov]]<br /> | writer = {{Plainlist|<br /> * {{ill|Arif Aliyev|ru|Алиев, Ариф Тагиевич}}<br /> * Sergei Bodrov<br /> }}<br /> | producer = {{Plainlist|<br /> * {{ill|Sergey Selyanov|ru|Сельянов, Сергей Михайлович}}<br /> * Sergei Bodrov<br /> * [[Anton Melnik]]<br /> }}<br /> | starring = {{Plainlist|<br /> * [[Tadanobu Asano]]<br /> * [[Sun Honglei]]<br /> * [[Chuluuny Khulan]]<br /> }}<br /> | cinematography = {{Plainlist|<br /> * [[Sergei Trofimov (cinematographer)|Sergei Trofimov]]<br /> * [[Rogier Stoffers]]<br /> }}<br /> | editing = {{Plainlist|<br /> * [[Zach Staenberg]]<br /> * [[Valdís Óskarsdóttir]]<br /> }}<br /> | music = [[Tuomas Kantelinen]]<br /> | studio = {{Plainlist|<br /> * {{ill|Kinokompaniya CTB|ru|СТВ (кинокомпания)}}<br /> * {{ill|Andreevski Flag|ru|Андреевский флаг (кинокомпания)}}<br /> * {{ill|X Filme Creative Pool|de}}<br /> * Kinofabrika<br /> * Eurasia Film<br /> }}<br /> | distributor = {{Plainlist|<br /> * Nashe Kino (Russia)<br /> * {{ill|X Verleih|de}} (Germany)<br /> }}<br /> | released = {{Film date|df=yes|2007|08|10|Vyborg|2007|09|20|Russia|2008|08|07|Germany}}<br /> | runtime = 125 minutes&lt;!--Theatrical runtime: 125:17--&gt;&lt;ref&gt;{{cite web | url=https://bbfc.co.uk/releases/mongol-2008-0 | title=''MONGOL'' (15) | work=[[British Board of Film Classification]] | date=31 March 2009 | access-date=21 April 2015}}&lt;/ref&gt;<br /> | country = {{Plainlist|<br /> * Kazakhstan<br /> * Russia<br /> * Mongolia<br /> * Germany&lt;ref&gt;{{cite web | url=https://lumiere.obs.coe.int/movie/30131# | title=Mongol | work=[[Lumiere (database)|Lumiere]] | access-date=15 January 2022}}&lt;/ref&gt;<br /> }}<br /> | language = {{Plainlist|<br /> * [[Mongolian language|Mongolian]]<br /> * [[Standard Chinese|Mandarin]]<br /> }}<br /> | budget = $18 million&lt;ref name=BoxOfficeMojo&gt;{{cite web |url=https://boxofficemojo.com/movies/?id=mongol.htm |title=Mongol |publisher=[[Box Office Mojo]] |access-date=2011-02-21}}&lt;/ref&gt;<br /> | gross = $26.5 million&lt;ref name=BoxOfficeMojo /&gt;<br /> }}<br /> '''''Mongol''''' ({{lang|ru|Монгол}}), also known as '''''Mongol: The Rise of Genghis Khan''''' in the United States and '''''Mongol: The Rise to Power of Genghis Khan''''' in the United Kingdom, is a 2007 [[Historical film|period]] [[epic film]] directed by [[Sergei Bodrov]], about the early life of [[Temüjin]], who later came to be known as [[Genghis Khan]]. The storyline was conceived from a [[screenplay]] written by Bodrov and Arif Aliev. It was produced by Bodrov, Sergei Selyanov, and Anton Melnik and stars [[Tadanobu Asano]], [[Sun Honglei]], and [[Chuluuny Khulan]] in the main roles. ''Mongol'' explores abduction, [[kinship]], and the repercussions of war.&lt;ref name=&quot;film&quot;/&gt;
<br /> <br /> The film was a co-production between companies in Russia, Germany and Kazakhstan. Filming took place mainly in the [[China|People's Republic of China]], principally in [[Inner Mongolia]] (the [[Mongols|Mongol]] [[Autonomous regions of China|autonomous region]]), and in [[Kazakhstan]]. Shooting began in September 2005, and was completed in November 2006. After an initial screening at the Russian Film Festival in [[Vyborg]] on 10 August 2007, ''Mongol'' was released in Russia on [[20 September 2007]]. It saw a limited release in the United States on 6 June 2008 grossing $5.7 million in domestic ticket sales. It additionally earned nearly $21 million in sales through international release for a combined $26.5 million in gross revenue. The film was a minor financial success after its theatrical run, and was generally met with positive critical reviews. The film was nominated for the 2007 [[Academy Award for Best Foreign Language Film]] as a submission from Kazakhstan.&lt;ref&gt;{{cite press release | title = 80th Academy Awards Nominations Announced | publisher = [[Academy of Motion Picture Arts and Sciences]] | date = 2008-01-22 | url = http://www.oscars.org/press/pressreleases/2008/08.01.22.html | access-date = 2008-01-22}}&lt;/ref&gt;<br /> <br /> The film is intended to be the first part of a [[trilogy]] about Genghis Khan, and initial work on the second part began in 2008.&lt;ref&gt;{{cite news |title=Bodrov kicks off production unit |first=Tom |last=Birchenough |url=http://www.varietyasiaonline.com/content/view/6083/53/ |newspaper=[[Variety (magazine)|Variety Asia]] |publisher=[[Reed Business Information]] |date=14 May 2008 |access-date=25 January 2010 |archive-url=https://web.archive.org/web/20080515195418/http://www.varietyasiaonline.com/content/view/6083/53/ |archive-date=15 May 2008 }}&lt;/ref&gt; The trilogy project was eventually put on the shelf, but in July 2013, during a visit to the annual [[Naadam|Naadam Festival]] in [[Ulan Bator]], Bodrov told the press that the production of the sequel had started, and that it may be shot in [[Mongolia]],&lt;ref name=InfoMongolia&gt;[http://www.infomongolia.com/ct/ci/6420 InfoMongolia, 6 August 2013: &quot;Russian Producer Announces the Sequel to 'Mongol'&quot;] {{Webarchive|url=https://web.archive.org/web/20150709061616/http://www.infomongolia.com/ct/ci/6420 |date=9 July 2015 }} Linked 2013-08-29&lt;/ref&gt; as had been the intention for ''Mongol'', before local protests, fearing that the film would not correctly portray the Mongolian people and their national hero, Genghis Khan, caused the shooting to move to Inner Mongolia and Kazakhstan.&lt;ref name=Protests&gt;[https://www.variety.com/article/VR1117920791.html?categoryid=1019&amp;cs=1 ''Variety'', 10 April 2005: &quot;Mongols protest Khan project&quot;].
Retrieved 2011-02-15.&lt;/ref&gt;<br /> <br /> ==Plot==<br /> In 1192, [[Genghis Khan|Temüjin]], a prisoner in the [[Western Xia|Tangut kingdom]], recounts his story through a series of [[Flashback (narrative)|flashback]]s.<br /> <br /> Embarking on an expedition 20 years earlier (1172), nine-year-old Temüjin is accompanied by his father [[Yesugei|Yesügei]] to select a girl as his future wife. He meets and chooses [[Börte]], against his father's wishes. On their way home, Yesügei is poisoned by an enemy tribe; with his dying breath, he tells his son that he is now [[Khan (title)|Khan]]. However, Targutai, Yesügei's lieutenant, proclaims himself Khan and is about to kill his young rival. Prevented from doing so by [[Hoelun|the boy's mother]], Targutai lets him go and vows to kill him as soon as he becomes an adult.<br /> <br /> After falling through a frozen lake, Temüjin is rescued by [[Jamukha]]. The two quickly become friends and take an oath as [[blood brother]]s. Targutai later captures him, but he escapes under the cover of night and roams the countryside.<br /> <br /> Years later (1186), Temüjin is once again apprehended by Targutai. He escapes a second time, finding Börte and presenting her to his family. Later that night, they are attacked by the [[Merkit]] tribe. While being chased on horseback, Temüjin is shot with an arrow but survives. Börte, however, is kidnapped and taken to the Merkit camp.&lt;ref name=&quot;film&quot;&gt;[[Sergei Bodrov]]. (2007). ''Mongol'' [Motion picture]. Russia: [[Picturehouse (company)|Picturehouse Entertainment]].&lt;/ref&gt;<br /> <br /> Temüjin goes to Jamukha—who is now his tribe's Khan—and seeks his help in rescuing his wife. Jamukha agrees, and after a year, they launch an attack on the Merkits and are successful. One night, while celebrating their victory, Temüjin demonstrates his generosity by allowing his troops to take an equal share of the [[Looting|plunder]]. Two of Jamukha's men see this as a stark contrast to their Khan's behavior and desert him the next morning to follow their new master. Jamukha chases him down and demands that he give his men back, but Temüjin refuses. This refusal, aggravated by the inadvertent killing of Jamukha's biological brother by one of Temüjin's men, leaves Jamukha (with Targutai as an ally) no choice but to declare war on him. Outnumbered, Temüjin's army is quickly defeated. Sparing his blood brother, Jamukha decides to sell him into slavery.&lt;ref name=&quot;film&quot;/&gt;<br /> <br /> Temüjin is sold to a Tangut nobleman despite the dire warning given to him by a [[Buddhist]] monk acting as his adviser, who senses the great potential the warrior carries and his future role in subjugating the Tangut State. While Temüjin is imprisoned, the monk pleads with him to spare his monastery when he eventually destroys the kingdom. In exchange for delivering a bone fragment to Börte indicating that he is still alive, Temüjin agrees. The monk succeeds in delivering the bone and the message at the cost of his life. Börte infiltrates the Tangut border town disguised as a merchant's [[concubinage|concubine]] and the two escape.<br /> <br /> Temüjin pledges to unify all of the Mongol tribes and imposes three basic laws for them to abide by: never kill women and children, always honor your promises and repay your debts, and ''never'' betray your Khan. Subsequently (1196), he gathers an army and engages Jamukha, who has an even larger force.
During the battle, a thunderstorm arises on the steppe, terrifying both Jamukha's and Temüjin's armies. Temüjin, however, does not cower, and when his army sees him riding unafraid, his men are inspired to charge Jamukha's helpless, cowering army, which surrenders immediately. Temüjin allows Jamukha to live and brings the latter's army under his banner. Targutai is killed by his own soldiers and his body is presented to the Khan as a way of appeasing him, but the soldiers are executed for disobeying the law.<br /> <br /> A postscript indicates that by 1206, Temüjin was designated the Khan of all the [[Mongols]]—''[[Genghis Khan]] of the Great Steppe''. He would later go on to invade and conquer the Tangut kingdom by 1227, fulfilling the monk's prophecy, but he spared the monastery, honoring his debt to the monk.&lt;ref name=&quot;film&quot;/&gt;<br /> <br /> ==Cast==<br /> [[File:Tadanobu.jpg|thumb|190px|right|Actor Tadanobu Asano, who portrayed the elder Temüjin in the film.]]<br /> {{div col|colwidth=22em}}<br /> * [[Tadanobu Asano]] as [[Genghis Khan|Genghis Khan/Temüjin]]<br /> ** Odnyam Odsuren as young Temüjin<br /> * [[Sun Honglei]] as [[Jamukha]]<br /> ** Amarbold Tuvshinbayar as young Jamukha<br /> * [[Chuluuny Khulan]] as [[Börte]]<br /> ** Bayertsetseg Erdenebat as young Börte<br /> * [[Amadu Mamadakov]] as Targutai<br /> * [[Batdorj-in Baasanjab|Ba Sen]] as [[Yesugei|Yesügei]]<br /> * Sai Xing Ga as Chiledu<br /> * Bu Ren as Taichar<br /> * Aliya as Oelun<br /> * He Qi as Dai-Sechen<br /> * Deng Ba Te Er as Daritai<br /> * Zhang Jiong as Garrison Chief<br /> * Ben Hon Sun as Monk<br /> {{div col end}}<br /> <br /> ==Production==<br /> {{More citations needed section|date=May 2021}}<br /> <br /> ===Development===<br /> [[File:Sergei Vladimirovich Bodrov.jpg|170px|left|thumb|Director Sergei Bodrov at the [[66th Venice International Film Festival|66th Venice Film Festival]]]]<br /> The premise of ''Mongol'' is the story of Genghis Khan, the Mongol leader who founded the [[Mongol Empire]], which ruled expansive areas of [[Eurasia]]. The film depicts the early life of Temüjin, not as an evil war-mongering brute, but rather as an inspiring, visionary leader. Director Bodrov noted that &quot;Russians lived under Mongolian rule for around 200 years&quot; and that &quot;Genghis Khan was portrayed as a monster&quot;. During the 1990s, Bodrov read a book by Russian historian [[Lev Gumilev]] entitled ''The Legend of the Black Arrow'', which offered a more disciplined view of the Mongol leader and influenced Bodrov to create a film project about the warrior.<br /> <br /> Bodrov spent several years researching the aspects of his story, discovering that Temüjin was an orphan, a slave and a combatant whom everyone tried to kill. He found it difficult to prepare the screenplay for the film because no contemporary Mongol biography existed. The only Mongol history from the era is ''[[The Secret History of Mongols]]'', written for the Mongol royal family some time after Genghis Khan's death in AD 1227. Author Gumilev had used the work as a historical reference and a work of significant literature. Casting for the film took place worldwide, including in Mongolia, China, Russia, and Los Angeles. Speaking on the choice of Tadanobu Asano to portray Temüjin, Bodrov explained that although it might have seemed odd to cast a Japanese actor in the role, the Mongol ruler was seen by many Japanese as one of their own.
Bodrov said, &quot;The Japanese had a very famous ancient warrior who disappeared {{bracket|[[Minamoto no Yoshitsune]]}}, and they think he went to Mongolia and became Genghis Khan. He's a national hero, Genghis Khan. Mongolians can claim he's Mongolian, but the Japanese, they think they know who he is.&quot; Bodrov felt casting actor Sun Honglei as Jamukha was a perfect mix of &quot;gravity and humor&quot; for the role. Describing the character interaction between Asano and Honglei, he noted &quot;They're completely different people, Temüjin and Jamukha, but they have a strong relationship, strong feelings between them.&quot; Aside from the Chinese and Japanese actors for those roles, the rest of the cast were Mongolian. It marked the first time a tale of Genghis Khan would be acted by Asians, this in contrast to such Hollywood and European attempts like the 1956 movie flop ''[[The Conqueror (1956 film)|The Conqueror]]'' and the 1965 film ''[[Genghis Khan]]'' with [[Omar Sharif]].<br /> <br /> The film was initially intended to be shot in [[Mongolia]], but the plans caused much protest in the country, as many Mongolians feared that it would not correctly portray their people and their national hero.&lt;ref name=Protests/&gt; As a consequence, shooting was moved to the Chinese autonomous region [[Inner Mongolia]] and to [[Kazakhstan]].<br /> <br /> ===Filming===<br /> [[File:MongolAsano.jpg|thumb|right|The character Temüjin, dressed in Mongolian warrior garb]]<br /> [[Principal photography|Filming]] began in 2005, lasting 25 weeks and taking place in China, Mongolia, and Kazakhstan. Production designer Dashi Namdakov helped to recreate the pastoral lifestyle of the nomadic tribesmen. Namdakov is originally from a Russian region which borders Mongolia and is home to many ethnic Mongols. Bodrov remarked, &quot;Dashi has the Mongol culture in his bones and knows how to approach this material.&quot; To help create some of the horse-mounted stunt sequences, Bodrov called upon seasoned stuntmen from Kazakhstan and Kyrgyzstan, whom he was familiar with from the production of ''[[Nomad (2005 film)|Nomad]]''. Describing some of the stunt work, Bodrov claimed: &quot;Not a single horse was hurt on this film. There's a line in the movie, when young Jamukha tells Temüjin, 'For Mongol, horse is more important than woman.' And that's how it is with the Kazakh and Kyrgyz stunt people. They took very good care of the horses and were very conscientious.&quot; Bodrov collaborated on the film with editors [[Zach Staenberg]] and [[Valdís Óskarsdóttir]].<br /> <br /> ==Release==<br /> ''Mongol'' was first released in Russia and Ukraine on 20 September 2007.&lt;ref name=BoxOfficeRelease&gt;{{cite web |url=https://boxofficemojo.com/movies/?page=intl&amp;id=mongol.htm |title=International Box Office Results |publisher=[[Box Office Mojo]] |access-date=2011-02-21}}&lt;/ref&gt; The film then premiered in cinemas in Turkey on 14 March 2008. Between April and December 2008, ''Mongol'' was released in various countries throughout the Middle East, Europe and Africa.&lt;ref name=BoxOfficeRelease/&gt; France, Algeria, Monaco, Morocco and Tunisia shared a release date of 9 April 2008. In the United States and the United Kingdom, the film was released on 6 June 2008. In 2009, certain Asian Pacific countries such as Singapore and Malaysia saw release dates for the film.&lt;ref name=BoxOfficeRelease/&gt; Within Latin America, Argentina saw a release for the film on 11 March, while Colombia began screenings on 9 April. 
The film grossed $20,821,749 in non-US box office totals.&lt;ref name=BoxOfficeRelease/&gt;<br /> <br /> ===US box office===<br /> In the United States, the film premiered in cinemas on 6 June 2008. During its opening weekend, the film opened in 22nd place, grossing $135,326 at five locations.&lt;ref name=&quot;BoxOfficeMojo&quot;/&gt; The film's revenue dropped by 17% in its second week of release, earning $112,212. For that particular weekend, the film fell to 25th place while screening in five theaters. During the film's final week in theaters, ''Mongol'' placed a distant 80th with $11,503 in revenue.&lt;ref&gt;{{cite web |url=https://boxofficemojo.com/weekend/chart/?yr=2008&amp;wknd=36&amp;p=.htm |title=September 5–7, 2008 Weekend |publisher=[[Box Office Mojo]] |access-date=2011-02-21}}&lt;/ref&gt; The film went on to top out domestically at $5,705,761 in total ticket sales through a 14-week theatrical run. Internationally, the film took in an additional $20,821,749 in box office business for a combined worldwide total of $26,527,510.&lt;ref name=BoxOfficeMojo/&gt; For 2008 as a whole, the film ranked 167th in cumulative domestic box office performance.&lt;ref&gt;{{cite web |url=https://boxofficemojo.com/yearly/chart/?yr=2008&amp;p=.htm |title=2008 Domestic Grosses |publisher=[[Box Office Mojo]] |access-date=2011-02-21}}&lt;/ref&gt;<br /> <br /> ===Home media===<br /> Following its theatrical release, the [[DVD region code|Region 1 Code]] [[widescreen]] edition of the film was released on [[DVD]] in the United States on 14 October 2008. Special features for the DVD include scene selections, subtitles in English and Spanish, and subtitles in English for the hearing-impaired.&lt;ref&gt;{{cite web |url=http://video.barnesandnoble.com/DVD/Mongol/Tadanobu-Asano/e/883929028788/?itm=2&amp;USRI=mongol |title=Mongol DVD Widescreen |publisher=BarnesandNoble.com |access-date=2011-02-15 |archive-date=7 July 2011 |archive-url=https://web.archive.org/web/20110707212818/http://video.barnesandnoble.com/DVD/Mongol/Tadanobu-Asano/e/883929028788/?itm=2&amp;USRI=mongol |url-status=dead }}&lt;/ref&gt;<br /> <br /> The widescreen high-definition [[Blu-ray|Blu-ray Disc]] version of the film was also released on 14 October 2008. Special features include scene selections and subtitles in English and Spanish.&lt;ref&gt;{{cite web |url=http://video.barnesandnoble.com/DVD/Mongol/Tadanobu-Asano/e/883929028771/?itm=1&amp;USRI=mongol |title=Mongol Blu-ray Widescreen |publisher=BarnesandNoble.com |access-date=2011-02-15 |archive-date=7 July 2011 |archive-url=https://web.archive.org/web/20110707212813/http://video.barnesandnoble.com/DVD/Mongol/Tadanobu-Asano/e/883929028771/?itm=1&amp;USRI=mongol |url-status=dead }}&lt;/ref&gt; The film is also available via [[video on demand]].&lt;ref&gt;{{cite web |url=https://www.amazon.com/Mongol/dp/B001I9M7GM/ref=ed_oe_vdl |title=Mongol VOD Format |website=Amazon |access-date=2011-02-15}}&lt;/ref&gt;
The site's critics' consensus reads: &quot;The sweeping ''Mongol'' mixes romance, family drama, and enough flesh-ripping battle scenes to make sense of Genghis Khan's legendary stature.&quot;&lt;ref&gt;{{Cite web|url=https://www.rottentomatoes.com/m/mongol|title=Mongol (2008)|website=[[Rotten Tomatoes]]|publisher=[[Fandango Media]]|access-date=April 5, 2020}}&lt;/ref&gt; At [[Metacritic]], which assigns a [[weighted mean|weighted average]] out of 100 to critics' reviews, the film received a score of 74 based on 27 reviews, indicating &quot;Generally favorable reviews&quot;.&lt;ref&gt;{{Cite web|url=https://www.metacritic.com/movie/mongol-the-rise-of-genghis-khan|title=Mongol: The Rise of Genghis Khan Reviews|website=[[Metacritic]]|publisher=[[CBS Interactive]]|access-date=April 13, 2020}}&lt;/ref&gt; However, the film was criticized in Mongolia for factual errors and historical inaccuracies.&lt;ref&gt;[http://www.olloo.mn/modules.php?name=News&amp;file=print&amp;sid=76632 Г. Жигжидсvрэн: Сергей Бодровын &quot;Монгол&quot; кинонд бvтээсэн дvр байхгvй] {{webarchive|url=https://web.archive.org/web/20110722220001/http://www.olloo.mn/modules.php?name=News&amp;file=print&amp;sid=76632 |date=22 July 2011 }}. ''olloo.mn''. Retrieved 2011-02-17.&lt;/ref&gt;<br /> <br /> Claudia Puig of ''[[USA Today]]'' said the film &quot;has a visceral energy with powerful battle sequences and also scenes of striking and serene physical beauty.&quot; Noting a flaw, she did comment that ''Mongol'' might have included &quot;one battle too many.&quot; Overall, though, she concluded the film was &quot;an exotic saga that compels, moves and envelops us with its grand and captivating story.&quot;&lt;ref name=&quot;Puig&quot;&gt;Puig, Claudia (12 June 2008). [https://www.usatoday.com/life/movies/reviews/2008-06-12-mongol_N.htm Tepid 'Mongol' A sweeping historic tale]. ''[[USA Today]]''. Retrieved 2011-02-16.&lt;/ref&gt; <br /> {|class=&quot;toccolours&quot; style=&quot;float: left; margin-left: 1em; margin-right: 2em; font-size: 85%; background:#FFFFE0; color:black; width:40em; max-width: 35%;&quot; cellspacing=&quot;5&quot;<br /> |style=&quot;text-align: left;&quot;|&quot;Centered on the rise of Genghis Khan, the film is an enthralling tale, in the style of a David Lean saga, with similarly gorgeous cinematography. It combines a sprawling adventure saga with romance, family drama and riveting action sequences.&quot;<br /> |-<br /> |style=&quot;text-align: left;&quot;|—Claudia Puig, writing in ''USA Today''&lt;ref name=&quot;Puig&quot;/&gt;<br /> |}<br /> Jonathan Kiefer, writing in the ''[[Sacramento News &amp; Review]]'', said &quot;At once sweeping and intimately confidential, with durably magnetic performances by Japan's Asano Tadanobu as the adored warlord and China's Honglei Sun as Jamukha, his blood brother and eventual enemy, ''Mongol'', a 2007 Best Foreign Language Film Oscar nominee, has to be by far the best action epic of 12th- and 13th-century Asian nomads you'll see&quot;. He emphatically believed Bodrov's film was &quot;both ancient and authentic.&quot; He added that it was &quot;commendably unhurried, and the scope swells up in a way that feels organic to a character-driven story&quot;.&lt;ref name=&quot;kiefer&quot;&gt;Kiefer, Jonathan (26 June 2008). [http://www.newsreview.com/sacramento/content?oid=684906 I think I Khan ''Mongol'']. ''[[Sacramento News &amp; Review]]''.
Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Walter Addiego, writing for the ''[[San Francisco Chronicle]],'' said that the film offers &quot;everything you would want from an imposing historical drama: furious battles between mass armies, unquenchable love between husband and wife, blood brothers who become deadly enemies, and many episodes of betrayal and treachery&quot;. Concerning cinematography, he believed the film included &quot;plenty of haunting landscapes, gorgeously photographed by Sergei Trofimov on location in China, Kazakhstan and Mongolia, along with the sort of warfare scenes that define epics&quot;.&lt;ref&gt;Addiego, Walter (20 June 2008). [http://www.sfgate.com/cgi-bin/article.cgi?file=/c/a/2008/06/20/DDH7115QHE.DTL Review: 'Mongol' revisits Genghis Khan]. ''[[San Francisco Chronicle]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Writing for ''[[The Boston Globe]]'', [[Wesley Morris]] said that ''Mongol'' &quot;actually works as an old-fashioned production - one with breathtaking mohawks, a scary yoking, one daring escape, hottish sex, ice, snow, braying sheep, blood oaths, dehydrating dunes, throat singing, a nighttime urination, kidnapping, charged reunions, and relatively authentic entertainment values.&quot;&lt;ref&gt;Morris, Wesley (20 June 2008). [https://www.boston.com/movies/display?display=movie&amp;id=8626 When blood runs hot and cold]. ''[[The Boston Globe]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Writing for the ''[[Chicago Sun-Times]]'', film critic [[Roger Ebert]] called the film a &quot;visual spectacle, it is all but overwhelming, putting to shame some of the recent historical epics from Hollywood.&quot; Summing up, Ebert wrote &quot;The nuances of an ancient and ingeniously developed culture are passed over, and it cannot be denied that ''Mongol'' is relentlessly entertaining as an action picture.&quot;&lt;ref name=&quot;Ebert&quot;&gt;Ebert, Roger (20 June 2008). [http://rogerebert.suntimes.com/apps/pbcs.dll/article?AID=/20080619/REVIEWS/944262138/1023 Mongol] {{Webarchive|url=https://web.archive.org/web/20110716164958/http://rogerebert.suntimes.com/apps/pbcs.dll/article?AID=%2F20080619%2FREVIEWS%2F944262138%2F1023 |date=16 July 2011 }}. ''[[Chicago Sun-Times]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> {| class=&quot;toccolours&quot; style=&quot;float: right; margin-left: 1em; margin-right: 2em; font-size: 85%; background:#FFFFE0; color:black; width:30em; max-width: 40%;&quot; cellspacing=&quot;5&quot;<br /> | style=&quot;text-align: left;&quot; |&quot;''Mongol'' is a ferocious film, blood-soaked, pausing occasionally for passionate romance and more frequently for torture.&quot;<br /> |-<br /> | style=&quot;text-align: left;&quot; |—Roger Ebert, writing for the ''Chicago Sun-Times''&lt;ref name=&quot;Ebert&quot; /&gt;<br /> |}<br /> <br /> [[A. O. Scott]] of ''[[The New York Times]]'' stated that ''Mongol'' was a &quot;big, ponderous epic, its beautifully composed landscape shots punctuated by thundering hooves and bloody, slow-motion battle sequences.&quot;&lt;ref name=&quot;Scott&quot; /&gt; Scott approved of how the film encompassed &quot;rich ethnographic detail and enough dramatic intrigue to sustain a viewer's interest through the slower stretches.&quot;&lt;ref name=&quot;Scott&quot;&gt;Scott A.O., (6 June 2008). [https://movies.nytimes.com/2008/06/06/movies/06mong.html?ref=movies Forge a Unity of Purpose, Then Conquer the World]. ''[[The New York Times]]''. 
Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Similarly, [[Joe Morgenstern]] wrote in ''[[The Wall Street Journal]]'' that the film consisted of battle scenes which were as &quot;notable for their clarity as their intensity; we can follow the strategies, get a sense of who's losing and who's winning. The physical production is sumptuous.&quot; Morgenstern affirmed that ''Mongol'' was &quot;an austere epic that turns the stuff of pulp adventure into a persuasive take on ancient history.&quot;&lt;ref&gt;Morgenstern, Joe (6 June 2008). [https://www.wsj.com/articles/SB121271272126950681 'Mongol' Brings Style And Sumptuous Scale To Genghis Khan Saga]. ''[[The Wall Street Journal]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> [[Lisa Schwarzbaum]], writing for ''[[Entertainment Weekly]],'' lauded the visual qualities of the film, remarking how ''Mongol'' &quot;contrasts images of sweeping landscape and propulsive battle with potent scenes of emotional intimacy&quot;, while also referring to its &quot;quite grand, quite exotic, David Lean-style epic&quot; resemblance.&lt;ref&gt;Schwarzbaum, Lisa (6 June 2008). [https://www.ew.com/ew/article/0,,20204732,00.html Mongol (2008)] {{Webarchive|url=https://web.archive.org/web/20121020215606/http://www.ew.com/ew/article/0,,20204732,00.html |date=20 October 2012 }}. ''[[Entertainment Weekly]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> [[Kyle Smith (critic)|Kyle Smith]] of the ''[[New York Post]]'' commented that the film combined the &quot;intelligence of an action movie with the excitement of an art-house release&quot; making ''Mongol'' &quot;as dry as summer in the Gobi Desert.&quot; Smith did compliment director Bodrov on staging a &quot;couple of splattery yet artful battle scenes&quot;, but concluded that the film &quot;really isn't worth leaving your yurt for.&quot;&lt;ref&gt;Smith, Kyle (6 June 2008). [http://www.nypost.com/p/entertainment/movies/item_8rOS7t1uZ5GaL778YGJnuN;jsessionid=234EED98FA14F2F45E288D1931E248C8 Sweet Mongolia: How Genghis Got His Horde] {{Webarchive|url=https://web.archive.org/web/20121022042112/http://www.nypost.com/p/entertainment/movies/item_8rOS7t1uZ5GaL778YGJnuN;jsessionid=234EED98FA14F2F45E288D1931E248C8 |date=22 October 2012 }}. ''[[New York Post]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Author Tom Hoskyns of ''[[The Independent]]'' described the film as being &quot;very thin plot-wise.&quot; Hoskyns commended the &quot;desolate landscapes and seasonal variations&quot;, but he was not excited about the repetitious nature of the story showing the &quot;hero getting repeatedly captured and escaping.&quot;&lt;ref&gt;Hoskyns, Tom (26 September 2008). [https://www.independent.co.uk/arts-entertainment/films/reviews/dvd-mongol-15-943388.html DVD: Mongol]{{dead link|date=August 2021|bot=medic}}{{cbignore|bot=medic}}. ''[[The Independent]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Joshua Rothkopf of ''[[Time Out (company)|Time Out]]'' said that ''Mongol'' was a &quot;Russian-produced dud.&quot; He said that it included &quot;ridiculous dialogue and Neanderthal motivations&quot; as well as bearing &quot;little relation to the raw, immediate work of his countrymates—like Andrei Tarkovsky, whose epic ''[[Andrei Rublev (film)|Andrei Rublev]]'' really gives you a sense of the dirt and desperation.&quot;&lt;ref&gt;Rothkopf, Joshua (11 June 2008). 
[https://archive.today/20130204125544/http://www.timeout.com/film/newyork/reviews/85544/mongol-the_rise_to_power_of_genghis_khan.html Mongol: The Rise to Power of Genghis Khan]. ''[[Time Out (company)|Time Out]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> ===Accolades===<br /> The film was nominated and won several awards in 2007–09. Various critics included the film on their lists of the top 10 best films of 2008. Mike Russell of ''[[The Oregonian]]'' named it the fifth-best film of 2008,&lt;ref name=mctop08/&gt; Lawrence Toppman of ''[[The Charlotte Observer]]'' named it the eighth-best film of 2008,&lt;ref name=mctop08/&gt; and V.A. Musetto of the ''[[New York Post]]'' also named it the eighth-best film of 2008.&lt;ref name=mctop08&gt;{{cite web |url=http://apps.metacritic.com/film/awards/2008/toptens.shtml |title=Metacritic: 2008 Film Critic Top Ten Lists |publisher=[[Metacritic]] |access-date=11 January 2009 |url-status=dead |archive-url=https://web.archive.org/web/20110720094350/http://apps.metacritic.com/film/awards/2008/toptens.shtml |archive-date=20 July 2011 |df=dmy-all }}&lt;/ref&gt;<br /> <br /> {|class=&quot;wikitable&quot; border=&quot;1&quot;<br /> |-<br /> ! Award<br /> ! Category<br /> ! Nominee<br /> ! Result<br /> |-<br /> |[[80th Academy Awards]]&lt;ref&gt;{{cite web |url=http://www.oscars.org/awards/academyawards/oscarlegacy/2000-present/2008/winners.html |title=Nominees &amp; Winners for the 80th Academy Awards |access-date=2011-02-21 |publisher=Oscars.org |archive-date=12 October 2013 |archive-url=https://web.archive.org/web/20131012045551/http://www.oscars.org/awards/academyawards/oscarlegacy/2000-present/2008/winners.html |url-status=dead }}&lt;/ref&gt;<br /> |Best Foreign Language Film<br /> |<br /> |{{nom}}<br /> |-<br /> |2007 [[Asia Pacific Screen Awards]]&lt;ref&gt;{{cite web |url=http://www.asiapacificscreenawards.com/the_awards/past_winners_and_nominees/nominees/achievement_in_cinematography |title=The Awards |access-date=2011-02-21 |publisher=Asia Pacific Screen Awards |url-status=dead |archive-url=https://web.archive.org/web/20110218215728/http://www.asiapacificscreenawards.com/the_awards/past_winners_and_nominees/nominees/achievement_in_cinematography |archive-date=18 February 2011 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Achievement in Cinematography<br /> |Sergey Trofimov<br /> |{{nom}}<br /> |-<br /> |[[2nd Asian Film Awards]]&lt;ref&gt;{{cite web |url=http://www.asianfilmawards.asia/2008/eng/nominations.html#b5 |title=Nominations &amp; Winners |access-date=2011-02-21 |publisher=Asian Film Awards |url-status=dead |archive-url=https://web.archive.org/web/20120618163020/http://www.asianfilmawards.asia/2008/eng/nominations.html#b5 |archive-date=18 June 2012 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Supporting Actor<br /> |Sun Honglei<br /> |{{won}}<br /> |-<br /> ||[[14th Critics' Choice Awards|Broadcast Film Critics Association Awards 2008]]&lt;ref&gt;{{cite web |url=http://www.bfca.org/ccawards/2008.php |title=The 14th Critics' Choice Movie Awards Nominees |access-date=2011-02-21 |publisher=BFCA.org |url-status=dead |archive-url=https://web.archive.org/web/20101124011333/http://www.bfca.org/ccawards/2008.php |archive-date=24 November 2010 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Foreign Language Film<br /> |<br /> |{{nom}}<br /> |-<br /> |rowspan=2|[[European Film Awards 2008]]&lt;ref&gt;{{cite web |url=http://www.europeanfilmacademy.org/2008/11/08/nominations-pour-les-european-film-awards-2008/ |title=Nominations for the European Film Awards 2008 
|access-date=2011-02-21 |publisher=EuropeanFilmAcademy.org |archive-date=11 March 2012 |archive-url=https://web.archive.org/web/20120311025245/https://www.europeanfilmacademy.org/2008/11/08/nominations-pour-les-european-film-awards-2008/ |url-status=dead }}&lt;/ref&gt;&lt;ref&gt;{{cite web|url=http://www.europeanfilmacademy.org/2009/04/28/2008-4/ |title=The People's Choice Award 2008 |access-date=2011-02-21 |publisher=EuropeanFilmAcademy.org}}&lt;/ref&gt;<br /> |Best Cinematographer<br /> |Sergey Trofimov, Rogier Stoffers<br /> |{{nom}}<br /> |-<br /> |Best European Film<br /> |Sergey Bodrov<br /> |{{nom}}<br /> |-<br /> |rowspan=2|6th [[Golden Eagle Award (Russia)|Golden Eagle Award]]s&lt;ref&gt;{{cite web |url=http://www.kinoacademy.ru/main.php |title=Nominees &amp; Winners |access-date=2011-02-21 |publisher=KinoAcademy.ru |url-status=dead |archive-url=https://web.archive.org/web/20110717063252/http://www.kinoacademy.ru/main.php |archive-date=17 July 2011 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Costume Design<br /> |Karin Lohr<br /> |{{won}}<br /> |-<br /> |Best Sound Design<br /> |Stephan Konken<br /> |{{won}}<br /> |-<br /> |2009 40th [[NAACP Image Award]]s&lt;ref&gt;{{cite web |url=http://www.naacpimageawards.net/42/awards-show/40th/ |title=40th NAACP Image Awards |access-date=2010-06-04 |publisher=NAACP Image Awards |url-status=dead |archive-url=https://web.archive.org/web/20101215211104/http://www.naacpimageawards.net/42/awards-show/40th/ |archive-date=15 December 2010 |df=dmy-all }}&lt;/ref&gt;<br /> |Outstanding Foreign Motion Picture<br /> |<br /> |{{nom}}<br /> |-<br /> |Las Vegas Film Critics Society Awards 2008&lt;ref&gt;{{cite web |url=http://www.lvfcs.org/lvfcs/2008.html |title=2008 Sierra Award winners |access-date=2011-02-21 |publisher=lvfcs.org |url-status=dead |archive-url=https://web.archive.org/web/20120423011609/http://www.lvfcs.org/lvfcs/2008.html |archive-date=23 April 2012 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Foreign Language Film<br /> |<br /> |{{won}}<br /> |-<br /> |2008 [[National Board of Review of Motion Pictures]] Awards&lt;ref&gt;{{cite web |url=http://www.nbrmp.org/awards/past.cfm?year=2008 |title=Awards for 2008 |access-date=2011-02-21 |publisher=National Board of Review |url-status=dead |archive-url=https://web.archive.org/web/20120516165456/http://www.nbrmp.org/awards/past.cfm?year=2008 |archive-date=16 May 2012 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Foreign Language Film<br /> |<br /> |{{won}}<br /> |-<br /> |rowspan=6|2008 [[Nika Award]]s&lt;ref&gt;{{cite web|url=http://www.kino-nika.com/ |title=Award Winners &amp; Nominees |access-date=2011-02-21 |publisher=Nika Awards}}&lt;/ref&gt;<br /> |Best Cinematography<br /> |Sergey Trofimov, Rogier Stoffers<br /> |{{won}}<br /> |-<br /> |Best Costume Design<br /> |Karin Lohr<br /> |{{won}}<br /> |-<br /> |Best Director<br /> |Sergey Bodrov<br /> |{{won}}<br /> |-<br /> |Best Film<br /> |<br /> |{{won}}<br /> |-<br /> |Best Production Design<br /> |Dashi Namdakov, Yelena Zhukova<br /> |{{won}}<br /> |-<br /> |Best Sound<br /> |Stephan Konken<br /> |{{won}}<br /> |}<br /> <br /> ==Sequel==<br /> ''The Great Khan'' ({{lang|ru|Великий Хан}}) is the provisional title&lt;ref&gt;{{cite web | title = Bodrov launches production company, Director's first project to be a 'Mongol' | publisher = Variety | date = 16 May 2008 | url = https://www.variety.com/article/VR1117985966.html | access-date = 2010-11-01}}&lt;/ref&gt; for the second installment of Bodrov's planned trilogy on the life of Temüjin, [[Genghis Khan]]. 
The Mongolian pop singer, [[Amarkhuu Borkhuu]], was offered a role, but declined.&lt;ref&gt;{{cite news|url=http://www.postnews.mn/index.php?cp=news&amp;task=view&amp;news_id=6522&amp;PHPSESSID=d97d127e45acbded6332c901ba3ee32d&amp;page=52&amp;PHPSESSID=d97d127e45acbded6332c901ba3ee32d|title=Б.АМАРХҮҮ С.БОДРОВТ ГОЛОГДЖЭЭ|access-date=2011-01-03|language=mn}}&lt;/ref&gt; The trilogy project was eventually put on the shelf, but in July 2013, during a visit to the annual [[Naadam|Naadam Festival]] in [[Ulan Bator]], Bodrov told the press that the production of the sequel had started again.&lt;ref name=InfoMongolia/&gt; The sequel is now called &quot;Mongol II: The Legend&quot; and started its shooting in 2019.&lt;ref&gt;{{Cite web|last=|first=|date=|title=Getaway Pictures|url=http://getawaypictures.com/?p=103|access-date=|website=}}&lt;/ref&gt;<br /> <br /> ==Soundtrack==<br /> The soundtrack for ''Mongol'', was released in the United States by the [[Varèse Sarabande]] music label on 29 July 2008.&lt;ref&gt;{{cite web |url=http://music.barnesandnoble.com/Mongol/Altan-Urag/e/30206690224/?itm=1&amp;USRI=mongol |title=Mongol Original Motion Picture Soundtrack |publisher=BarnesandNoble.com |access-date=2011-02-15 }}{{Dead link|date=July 2022 |bot=InternetArchiveBot |fix-attempted=yes }}&lt;/ref&gt; The score for the film was composed by [[Tuomas Kantelinen]], with additional music orchestrated by the Mongolian folk rock band [[Altan Urag]].&lt;ref&gt;{{cite web|url=https://movies.yahoo.com/movie/1808754771/cast |title=Mongol (2008) |access-date=2011-02-15 |publisher=Yahoo! Movies}}&lt;/ref&gt;<br /> <br /> {{Infobox album<br /> | name = Mongol: Original Motion Picture Soundtrack<br /> | type = [[Film score]]<br /> | artist = [[Tuomas Kantelinen]]<br /> | cover =<br /> | caption =<br /> | alt =<br /> | released = 07/29/2008<br /> | recorded =<br /> | venue =<br /> | studio =<br /> | genre =<br /> | length = 43:39<br /> | label = Varèse Sarabande<br /> | producer =<br /> | prev_title =<br /> | prev_year =<br /> | next_title =<br /> | next_year =<br /> }}<br /> <br /> {{Track listing<br /> | headline = ''Mongol: Original Motion Picture Soundtrack''<br /> | total_length = 43:39<br /> | title1 = Beginning<br /> | length1 = 4:35<br /> | title2 = At the Fireplace: Composed and Performed by Altan Urag<br /> | length2 = 0:48<br /> | title3 = Blood Brothers<br /> | length3 = 1:08<br /> | title4 = Chase 1: Composed and Performed by Altan Urag<br /> | length4 = 0:51<br /> | title5 = Fighting Boys<br /> | length5 = 0:53<br /> | title6 = Temüjin's Escape<br /> | length6 = 2:03<br /> | title7 = Funeral and Robbery: Composed and Performed by Altan Urag<br /> | length7 = 2:30<br /> | title8 = Together Now<br /> | length8 = 1:52<br /> | title9 = Love Theme<br /> | length9 = 1:25<br /> | title10 = Chase 2: Composed and Performed by Altan Urag<br /> | length10 = 1:36<br /> | title11 = Cold Winter<br /> | length11 = 2:30<br /> | title12 = Merkit Territory<br /> | length12 = 1:53<br /> | title13 = Attack<br /> | length13 = 0:44<br /> | title14 = Martial Rage<br /> | length14 = 1:12<br /> | title15 = Jamukha is Following<br /> | length15 = 1:30<br /> | title16 = Slavery<br /> | length16 = 1:48<br /> | title17 = Long Journey<br /> | length17 = 0:49<br /> | title18 = Destiny<br /> | length18 = 1:49<br /> | title19 = Joy in Mongolia: Composed and Performed by Altan Urag<br /> | length19 = 3:07<br /> | title20 = Final Battle, Showing Strength<br /> | length20 = 2:15<br /> | title21 = Final Battle, Tactical Order<br /> | 
length21 = 0:36<br /> | title22 = Final Battle, The First Attachment<br /> | length22 = 1:21<br /> | title23 = Final Battle, Death by Arrows<br /> | length23 = 1:55<br /> | title24 = Tengri's Help<br /> | length24 = 0:57<br /> | title25 = Victory to Khan<br /> | length25 = 1:36<br /> | title26 = No Mercy<br /> | length26 = 1:56<br /> }}<br /> <br /> ==See also==<br /> * [[List of Asian historical drama films]]<br /> * [[List of submissions to the 80th Academy Awards for Best Foreign Language Film]]<br /> * [[List of Kazakhstani submissions for the Academy Award for Best Foreign Language Film]]<br /> <br /> ==References==<br /> {{reflist|colwidth=30em}}<br /> <br /> ==External links==<br /> {{wikiquote|Mongol (film)|Mongol}}<br /> * {{IMDb title|0416044|Mongol}}<br /> * {{mojo title|mongol|Mongol}}<br /> * {{rotten-tomatoes|mongol|Mongol}}<br /> * {{Metacritic film|title=Mongol}}<br /> <br /> {{Sergei Bodrov}}<br /> {{Nika Award Best Picture}}<br /> {{National Board of Review Award for Best Foreign Language Film}}<br /> <br /> {{DEFAULTSORT:Mongol (Film)}}<br /> [[Category:2007 films]]<br /> [[Category:2007 biographical drama films]]<br /> [[Category:2000s historical adventure films]]<br /> [[Category:2000s war films]]<br /> [[Category:Adventure films based on actual events]]<br /> [[Category:Biographical action films]]<br /> [[Category:Depictions of Genghis Khan on film]]<br /> [[Category:Films directed by Sergei Bodrov]]<br /> [[Category:Films scored by Tuomas Kantelinen]]<br /> [[Category:Films set in Mongolia]]<br /> [[Category:Films set in the 12th century]]<br /> [[Category:Films set in the 13th century]]<br /> [[Category:Films set in the Mongol Empire]]<br /> [[Category:Films shot in China]]<br /> [[Category:Films shot in Kazakhstan]]<br /> [[Category:German biographical drama films]]<br /> [[Category:German historical drama films]]<br /> [[Category:German war drama films]]<br /> [[Category:Kazakhstani war drama films]]<br /> [[Category:2000s Mandarin-language films]]<br /> [[Category:Mongolian drama films]]<br /> [[Category:Mongolian-language films]]<br /> [[Category:Picturehouse films]]<br /> [[Category:Russian biographical drama films]]<br /> [[Category:Russian war drama films]]<br /> [[Category:Russian historical drama films]]<br /> [[Category:Universal Pictures films]]<br /> [[Category:New Line Cinema films]]<br /> [[Category:War epic films]]<br /> [[Category:War films based on actual events]]<br /> [[Category:2007 drama films]]<br /> [[Category:2000s German films]]<br /> [[Category:2007 multilingual films]]<br /> [[Category:Russian multilingual films]]<br /> [[Category:German multilingual films]]<br /> [[Category:Kazakhstani multilingual films]]<br /> [[Category:Kazakhstani historical drama films]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Mongol_(film)&diff=1173482282 Mongol (film) 2023-09-02T18:28:26Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|2007 historical epic film}}<br /> {{Other uses|Genghis Khan (disambiguation)}}<br /> {{Use dmy dates|date=April 2015}}<br /> {{Infobox film<br /> | name = Mongol<br /> | image = Mongol poster.jpg<br /> | caption = American theatrical release poster<br /> | director = [[Sergei Bodrov]]<br /> | writer = {{Plainlist|<br /> * {{ill|Arif Aliyev|ru|Алиев, Ариф Тагиевич}}<br /> * Sergei Bodrov<br /> }}<br /> | producer = {{Plainlist|<br /> * {{ill|Sergey Selyanov|ru|Сельянов, Сергей Михайлович}}<br /> * Sergei Bodrov<br /> * [[Anton Melnik]]<br /> }}<br /> | starring = {{Plainlist|<br /> * [[Tadanobu Asano]]<br 
/> * [[Sun Honglei]]<br /> * [[Chuluuny Khulan]]<br /> }}<br /> | cinematography = {{Plainlist|<br /> * [[Sergei Trofimov (cinematographer)|Sergei Trofimov]]<br /> * [[Rogier Stoffers]]<br /> }}<br /> | editing = {{Plainlist|<br /> * [[Zach Staenberg]]<br /> * [[Valdís Óskarsdóttir]]<br /> }}<br /> | music = [[Tuomas Kantelinen]]<br /> | studio = {{Plainlist|<br /> * {{ill|Kinokompaniya CTB|ru|СТВ (кинокомпания)}}<br /> * {{ill|Andreevski Flag|ru|Андреевский флаг (кинокомпания)}}<br /> * {{ill|X Filme Creative Pool|de}}<br /> * Kinofabrika<br /> * Eurasia Film<br /> }}<br /> | distributor = {{Plainlist|<br /> * Nashe Kino (Russia)<br /> * {{ill|X Verleih|de}} (Germany)<br /> }}<br /> | released = {{Film date|df=yes|2007|08|10|Vyborg|2007|09|20|Russia|2008|08|07|Germany}}<br /> | runtime = 125 minutes&lt;!--Theatrical runtime: 125:17--&gt;&lt;ref&gt;{{cite web | url=https://bbfc.co.uk/releases/mongol-2008-0 | title=''MONGOL'' (15) | work=[[British Board of Film Classification]] | date=31 March 2009 | access-date=21 April 2015}}&lt;/ref&gt;<br /> | country = {{Plainlist|<br /> * Kazakhstan<br /> * Russia<br /> * Mongolia<br /> * Germany&lt;ref&gt;{{cite web | url=https://lumiere.obs.coe.int/movie/30131# | title=Mongol | work=[[Lumiere (database)|Lumiere]] | access-date=15 January 2022}}&lt;/ref&gt;<br /> }}<br /> | language = {{Plainlist|<br /> * [[Mongolian language|Mongolian]]<br /> * [[Standard Chinese|Mandarin]]<br /> }}<br /> | budget = $18 million&lt;ref name=BoxOfficeMojo&gt;{{cite web |url=https://boxofficemojo.com/movies/?id=mongol.htm |title=Mongol |publisher=[[Box Office Mojo]] |access-date=2011-02-21}}&lt;/ref&gt;<br /> | gross = $26.5 million&lt;ref name=BoxOfficeMojo /&gt;<br /> }}<br /> '''''Mongol''''' ({{lang|ru|Монгол}}), also known as '''''Mongol: The Rise of Genghis Khan''''' in the United States and '''''Mongol: The Rise to Power of Genghis Khan''''' in the United Kingdom, is a 2007 [[Historical film|period]] [[epic film]] directed by [[Sergei Bodrov]], about the early life of [[Temüjin]], who later came to be known as [[Genghis Khan]]. The storyline was conceived from a [[screenplay]] written by Bodrov and Arif Aliev. It was produced by Bodrov, Sergei Selyanov, and Anton Melnik and stars [[Tadanobu Asano]], [[Sun Honglei]], and [[Chuluuny Khulan]] in the main roles. ''Mongol'' explores abduction, [[kinship]], and the repercussions of war.&lt;ref name=&quot;film&quot;/&gt;<br /> <br /> The film was a co-production between companies in Russia, Germany and Kazakhstan. Filming took place mainly in the [[China|People's Republic of China]], principally in [[Inner Mongolia]] (the [[Mongols|Mongol]] [[Autonomous regions of China|autonomous region]]), and in [[Kazakhstan]]. Shooting began in September 2005, and was completed in November 2006.
After an initial screening at the Russian Film Festival in [[Vyborg]] on 10 August 2007, ''Mongol'' was released in Russia on 20 September 2007. It saw a limited release in the United States on 6 June 2008, grossing $5.7 million in domestic ticket sales. It earned nearly a further $21 million internationally, for a combined gross of $26.5 million. The film was a minor financial success after its theatrical run, and was generally met with positive critical reviews. The film was nominated for the 2007 [[Academy Award for Best Foreign Language Film]] as a submission from Kazakhstan.&lt;ref&gt;{{cite press release | title = 80th Academy Awards Nominations Announced | publisher = [[Academy of Motion Picture Arts and Sciences]] | date = 2008-01-22 | url = http://www.oscars.org/press/pressreleases/2008/08.01.22.html | access-date = 2008-01-22}}&lt;/ref&gt;<br /> <br /> The film is intended to be the first part of a [[trilogy]] about Genghis Khan, and initial work on the second part began in 2008.&lt;ref&gt;{{cite news |title=Bodrov kicks off production unit |first=Tom |last=Birchenough |url=http://www.varietyasiaonline.com/content/view/6083/53/ |newspaper=[[Variety (magazine)|Variety Asia]] |publisher=[[Reed Business Information]] |date=14 May 2008 |access-date=25 January 2010 |archive-url=https://web.archive.org/web/20080515195418/http://www.varietyasiaonline.com/content/view/6083/53/ |archive-date=15 May 2008 }}&lt;/ref&gt; The trilogy project was eventually put on the shelf, but in July 2013, during a visit to the annual [[Naadam|Naadam Festival]] in [[Ulan Bator]], Bodrov told the press that production of the sequel had started and that it might be shot in [[Mongolia]],&lt;ref name=InfoMongolia&gt;[http://www.infomongolia.com/ct/ci/6420 InfoMongolia, 6 August 2013: &quot;Russian Producer Announces the Sequel to 'Mongol'&quot;] {{Webarchive|url=https://web.archive.org/web/20150709061616/http://www.infomongolia.com/ct/ci/6420 |date=9 July 2015 }} Linked 2013-08-29&lt;/ref&gt; as had originally been intended for ''Mongol'' before local protests, driven by fears that the film would not correctly portray the Mongolian people and their national hero, Genghis Khan, caused the shooting to move to Inner Mongolia and Kazakhstan.&lt;ref name=Protests&gt;[https://www.variety.com/article/VR1117920791.html?categoryid=1019&amp;cs=1 ''Variety'', 10 April 2005: &quot;Mongols protest Khan project&quot;]. Retrieved 2011-02-15.&lt;/ref&gt;<br /> <br /> ==Plot==<br /> In 1192, [[Genghis Khan|Temüjin]], a prisoner in the [[Western Xia|Tangut kingdom]], recounts his story through a series of [[Flashback (narrative)|flashback]]s.<br /> <br /> Embarking on an expedition 20 years earlier (1172), nine-year-old Temüjin is accompanied by his father [[Yesugei|Yesügei]] to select a girl as his future wife. He meets and chooses [[Börte]], against his father's wishes. On their way home, Yesügei is poisoned by an enemy tribe; with his dying breath, he tells his son that he is now [[Khan (title)|Khan]]. However, Targutai, Yesügei's lieutenant, proclaims himself Khan and is about to kill his young rival. Prevented from doing so by [[Hoelun|the boy's mother]], Targutai lets him go and vows to kill him as soon as he becomes an adult.<br /> <br /> After falling through a frozen lake, Temüjin is rescued by [[Jamukha]]. The two quickly become friends and take an oath as [[blood brother]]s.
Targutai later captures him, but he escapes under the cover of night and roams the countryside.<br /> <br /> Years later (1186), Temüjin is once again apprehended by Targutai. He escapes a second time, finding Börte and presenting her to his family. Later that night, they are attacked by the [[Merkit]] tribe. While being chased on horseback, Temüjin is shot with an arrow but survives. Börte, however, is kidnapped and taken to the Merkit camp.&lt;ref name=&quot;film&quot;&gt;[[Sergei Bodrov]]. (2007). ''Mongol'' [Motion picture]. Russia: [[Picturehouse (company)|Picturehouse Entertainment]].&lt;/ref&gt;<br /> <br /> Temüjin goes to Jamukha—who is now his tribe's Khan—and seeks his help in rescuing his wife. Jamukha agrees, and after a year, they launch an attack on the Merkits and are successful. One night, while celebrating their victory, Temüjin demonstrates his generosity by allowing his troops to take an equal share of the [[Looting|plunder]]. Two of Jamukha's men see this as a stark contrast to their Khan's behavior and desert him the next morning by following their new master. Jamukha chases him down and demands that he return his men, but Temüjin refuses. This act, aggravated by the inadvertent killing of Jamukha's biological brother by one of Temüjin's men, leaves Jamukha (with Targutai as an ally) no choice but to declare war on him. Outnumbered, Temüjin's army is quickly defeated. Sparing his blood brother, Jamukha decides to sell him into slavery.&lt;ref name=&quot;film&quot;/&gt;<br /> <br /> Temüjin is sold to a Tangut nobleman despite the dire warning given by a [[Buddhist]] monk acting as his adviser, who senses the warrior's great potential and his future role in subjugating the Tangut state. While Temüjin is imprisoned, the monk pleads with him to spare the monastery when he one day destroys the kingdom. In exchange for the monk delivering a bone fragment to Börte indicating that he is still alive, Temüjin agrees. The monk succeeds in delivering the bone and the message at the cost of his life. Börte infiltrates the Tangut border town disguised as a merchant's [[concubinage|concubine]] and the two escape.<br /> <br /> Temüjin pledges to unify all of the Mongol tribes and imposes three basic laws for them to abide by: never kill women and children, always honor your promises and repay your debts, and ''never'' betray your Khan. Subsequently (1196), he gathers an army and engages Jamukha, who has an even larger force. During the battle, a thunderstorm arises on the steppe, terrifying both Jamukha's and Temüjin's armies, which cower in fear. Temüjin, however, does not cower, and when his army sees him riding on unafraid, it is inspired to charge Jamukha's helpless, cowering army, which surrenders immediately. Temüjin allows Jamukha to live and brings the latter's army under his banner. Targutai is killed by his own soldiers and his body is presented to the Khan as a way of appeasing him, but the soldiers are executed for disobeying the law.<br /> <br /> A postscript indicates that by 1206, Temüjin was designated the Khan of all the [[Mongols]]—''[[Genghis Khan]] of the Great Steppe''.
He would go on to invade and conquer the Tangut kingdom by 1227, fulfilling the monk's prophecy, but would spare the monastery, honoring his debt to the monk.&lt;ref name=&quot;film&quot;/&gt;<br /> <br /> ==Cast==<br /> [[File:Tadanobu.jpg|thumb|190px|right|Actor Tadanobu Asano, who portrayed the elder Temüjin in the film.]]<br /> {{div col|colwidth=22em}}<br /> * [[Tadanobu Asano]] as [[Genghis Khan|Genghis Khan/Temüjin]]<br /> ** Odnyam Odsuren as young Temüjin<br /> * [[Sun Honglei]] as [[Jamukha]]<br /> ** Amarbold Tuvshinbayar as young Jamukha<br /> * [[Chuluuny Khulan]] as [[Börte]]<br /> ** Bayertsetseg Erdenebat as young Börte<br /> * [[Amadu Mamadakov]] as Targutai<br /> * [[Batdorj-in Baasanjab|Ba Sen]] as [[Yesugei|Yesügei]]<br /> * Sai Xing Ga as Chiledu<br /> * Bu Ren as Taichar<br /> * Aliya as Oelun<br /> * He Qi as Dai-Sechen<br /> * Deng Ba Te Er as Daritai<br /> * Zhang Jiong as Garrison Chief<br /> * Ben Hon Sun as Monk<br /> {{div col end}}<br /> <br /> ==Production==<br /> {{More citations needed section|date=May 2021}}<br /> <br /> ===Development===<br /> [[File:Sergei Vladimirovich Bodrov.jpg|170px|left|thumb|Director Sergei Bodrov at the [[66th Venice International Film Festival|66th Venice Film Festival]]]]<br /> The premise of ''Mongol'' is the story of Genghis Khan, the Mongol leader who founded the [[Mongol Empire]], which ruled expansive areas of [[Eurasia]]. The film depicts the early life of Temüjin, not as an evil, war-mongering brute but rather as an inspiring, visionary leader. Director Bodrov noted that &quot;Russians lived under Mongolian rule for around 200 years&quot; and that &quot;Genghis Khan was portrayed as a monster&quot;. During the 1990s, Bodrov read a book by Russian historian [[Lev Gumilev]] entitled ''The Legend of the Black Arrow'', which offered a more disciplined view of the Mongol leader and influenced Bodrov to create a film project about the warrior.<br /> <br /> Bodrov spent several years researching aspects of his story, discovering that Temüjin was an orphan, a slave and a combatant whom everyone tried to kill. He had difficulty preparing the screenplay because no contemporary Mongol biography existed. The only Mongol history from the era is ''[[The Secret History of the Mongols]]'', written for the Mongol royal family some time after Genghis Khan's death in AD 1227. Gumilev had used the work both as a historical reference and as a significant work of literature. Casting for the film took place worldwide, including in Mongolia, China, Russia, and Los Angeles. Speaking on the choice of Tadanobu Asano to portray Temüjin, Bodrov acknowledged that it might have seemed odd to cast a Japanese actor in the role, but explained that the Mongol ruler was seen by many Japanese as one of their own. Bodrov said, &quot;The Japanese had a very famous ancient warrior who disappeared {{bracket|[[Minamoto no Yoshitsune]]}}, and they think he went to Mongolia and became Genghis Khan. He's a national hero, Genghis Khan. Mongolians can claim he's Mongolian, but the Japanese, they think they know who he is.&quot; Bodrov felt casting actor Sun Honglei as Jamukha was a perfect mix of &quot;gravity and humor&quot; for the role.
Describing the character interaction between Asano and Honglei, he noted &quot;They're completely different people, Temüjin and Jamukha, but they have a strong relationship, strong feelings between them.&quot; Aside from the Chinese and Japanese actors for those roles, the rest of the cast were Mongolian. It marked the first time a tale of Genghis Khan would be acted by Asians, this in contrast to such Hollywood and European attempts like the 1956 movie flop ''[[The Conqueror (1956 film)|The Conqueror]]'' and the 1965 film ''[[Genghis Khan]]'' with [[Omar Sharif]].<br /> <br /> The film was initially intended to be shot in [[Mongolia]], but the plans caused much protest in the country, as many Mongolians feared that it would not correctly portray their people and their national hero.&lt;ref name=Protests/&gt; As a consequence, shooting was moved to the Chinese autonomous region [[Inner Mongolia]] and to [[Kazakhstan]].<br /> <br /> ===Filming===<br /> [[File:MongolAsano.jpg|thumb|right|The character Temüjin, dressed in Mongolian warrior garb]]<br /> [[Principal photography|Filming]] began in 2005, lasting 25 weeks and taking place in China, Mongolia, and Kazakhstan. Production designer Dashi Namdakov helped to recreate the pastoral lifestyle of the nomadic tribesmen. Namdakov is originally from a Russian region which borders Mongolia and is home to many ethnic Mongols. Bodrov remarked, &quot;Dashi has the Mongol culture in his bones and knows how to approach this material.&quot; To help create some of the horse-mounted stunt sequences, Bodrov called upon seasoned stuntmen from Kazakhstan and Kyrgyzstan, whom he was familiar with from the production of ''[[Nomad (2005 film)|Nomad]]''. Describing some of the stunt work, Bodrov claimed: &quot;Not a single horse was hurt on this film. There's a line in the movie, when young Jamukha tells Temüjin, 'For Mongol, horse is more important than woman.' And that's how it is with the Kazakh and Kyrgyz stunt people. They took very good care of the horses and were very conscientious.&quot; Bodrov collaborated on the film with editors [[Zach Staenberg]] and [[Valdís Óskarsdóttir]].<br /> <br /> ==Release==<br /> ''Mongol'' was first released in Russia and Ukraine on 20 September 2007.&lt;ref name=BoxOfficeRelease&gt;{{cite web |url=https://boxofficemojo.com/movies/?page=intl&amp;id=mongol.htm |title=International Box Office Results |publisher=[[Box Office Mojo]] |access-date=2011-02-21}}&lt;/ref&gt; The film then premiered in cinemas in Turkey on 14 March 2008. Between April and December 2008, ''Mongol'' was released in various countries throughout the Middle East, Europe and Africa.&lt;ref name=BoxOfficeRelease/&gt; France, Algeria, Monaco, Morocco and Tunisia shared a release date of 9 April 2008. In the United States and the United Kingdom, the film was released on 6 June 2008. In 2009, certain Asian Pacific countries such as Singapore and Malaysia saw release dates for the film.&lt;ref name=BoxOfficeRelease/&gt; Within Latin America, Argentina saw a release for the film on 11 March, while Colombia began screenings on 9 April. The film grossed $20,821,749 in non-US box office totals.&lt;ref name=BoxOfficeRelease/&gt;<br /> <br /> ===US box office===<br /> In the United States, the film premiered in cinemas on 6 June 2008. 
In its opening weekend, the film placed 22nd, grossing $135,326 from five locations.&lt;ref name=&quot;BoxOfficeMojo&quot;/&gt; The film's revenue dropped by 17% in its second week of release, earning $112,212. For that weekend, the film fell to 25th place while still screening in five theaters. In its final week in theaters, ''Mongol'' placed a distant 80th with $11,503 in revenue.&lt;ref&gt;{{cite web |url=https://boxofficemojo.com/weekend/chart/?yr=2008&amp;wknd=36&amp;p=.htm |title=September 5–7, 2008 Weekend |publisher=[[Box Office Mojo]] |access-date=2011-02-21}}&lt;/ref&gt; The film finished its 14-week theatrical run with $5,705,761 in domestic ticket sales. Internationally, it took in an additional $20,821,749, for a combined worldwide total of $26,527,510.&lt;ref name=BoxOfficeMojo/&gt; For 2008 as a whole, the film ranked 167th at the domestic box office.&lt;ref&gt;{{cite web |url=https://boxofficemojo.com/yearly/chart/?yr=2008&amp;p=.htm |title=2008 Domestic Grosses |publisher=[[Box Office Mojo]] |access-date=2011-02-21}}&lt;/ref&gt;<br /> <br /> ===Home media===<br /> Following its theatrical release, the [[DVD region code|Region 1]] [[widescreen]] edition of the film was released on [[DVD]] in the United States on 14 October 2008. Special features for the DVD include scene selections, subtitles in English and Spanish, and subtitles in English for the hearing-impaired.&lt;ref&gt;{{cite web |url=http://video.barnesandnoble.com/DVD/Mongol/Tadanobu-Asano/e/883929028788/?itm=2&amp;USRI=mongol |title=Mongol DVD Widescreen |publisher=BarnesandNoble.com |access-date=2011-02-15 |archive-date=7 July 2011 |archive-url=https://web.archive.org/web/20110707212818/http://video.barnesandnoble.com/DVD/Mongol/Tadanobu-Asano/e/883929028788/?itm=2&amp;USRI=mongol |url-status=dead }}&lt;/ref&gt;<br /> <br /> The widescreen high-definition [[Blu-ray|Blu-ray Disc]] version of the film was also released on 14 October 2008. Special features include scene selections and subtitles in English and Spanish.&lt;ref&gt;{{cite web |url=http://video.barnesandnoble.com/DVD/Mongol/Tadanobu-Asano/e/883929028771/?itm=1&amp;USRI=mongol |title=Mongol Blu-ray Widescreen |publisher=BarnesandNoble.com |access-date=2011-02-15 |archive-date=7 July 2011 |archive-url=https://web.archive.org/web/20110707212813/http://video.barnesandnoble.com/DVD/Mongol/Tadanobu-Asano/e/883929028771/?itm=1&amp;USRI=mongol |url-status=dead }}&lt;/ref&gt; The film is also available via [[video on demand]].&lt;ref&gt;{{cite web |url=https://www.amazon.com/Mongol/dp/B001I9M7GM/ref=ed_oe_vdl |title=Mongol VOD Format |website=Amazon |access-date=2011-02-15}}&lt;/ref&gt;<br /> <br /> ==Reception==<br /> <br /> ===Critical response===<br /> Among mainstream critics in the U.S., the film received mostly positive reviews. [[Rotten Tomatoes]] reported that 87% of 104 sampled critics gave the film a positive review, with an average score of 7.10 out of 10.
The site's critics' consensus reads: &quot;The sweeping ''Mongol'' mixes romance, family drama, and enough flesh-ripping battle scenes to make sense of Ghenghis Khan's legendary stature.&quot;&lt;ref&gt;{{Cite web|url=https://www.rottentomatoes.com/m/mongol|title=Mongol (2008)|website=[[Rotten Tomatoes]]|publisher=[[Fandango Media]]|access-date=April 5, 2020}}&lt;/ref&gt; At [[Metacritic]], which assigns a [[weighted mean|weighted average]] out of 100 to critics' reviews, the film received a score of 74 based on 27 reviews, indicating &quot;Generally favorable reviews&quot;.&lt;ref&gt;{{Cite web|url=https://www.metacritic.com/movie/mongol-the-rise-of-genghis-khan|title=Mongol: The Rise of Genghis Khan Reviews|website=[[Metacritic]]|publisher=[[CBS Interactive]]|access-date=April 13, 2020}}&lt;/ref&gt; However, the film was criticized in Mongolia for factual errors and historical inaccuracies.&lt;ref&gt;[http://www.olloo.mn/modules.php?name=News&amp;file=print&amp;sid=76632 Г. Жигжидсvрэн: Сергей Бодровын &quot;Монгол&quot; кинонд бvтээсэн дvр байхгvй] {{webarchive|url=https://web.archive.org/web/20110722220001/http://www.olloo.mn/modules.php?name=News&amp;file=print&amp;sid=76632 |date=22 July 2011 }}. ''olloo.mn''. Retrieved 2011-02-17.&lt;/ref&gt;<br /> <br /> Claudia Puig of ''[[USA Today]]'' said the film &quot;has a visceral energy with powerful battle sequences and also scenes of striking and serene physical beauty.&quot; Noting a flaw, she did comment that ''Mongol'' might have included &quot;one battle too many.&quot; Although overall, she concluded the film was &quot;an exotic saga that compels, moves and envelops us with its grand and captivating story.&quot;&lt;ref name=&quot;Puig&quot;&gt;Puig, Claudia (12 June 2008). [https://www.usatoday.com/life/movies/reviews/2008-06-12-mongol_N.htm Tepid 'Mongol' A sweeping historic tale]. ''[[USA Today]]''. Retrieved 2011-02-16.&lt;/ref&gt; <br /> {|class=&quot;toccolours&quot; style=&quot;float: left; margin-left: 1em; margin-right: 2em; font-size: 85%; background:#FFFFE0; color:black; width:40em; max-width: 35%;&quot; cellspacing=&quot;5&quot;<br /> |style=&quot;text-align: left;&quot;|&quot;Centered on the rise of Genghis Khan, the film is an enthralling tale, in the style of a David Lean saga, with similarly gorgeous cinematography. It combines a sprawling adventure saga with romance, family drama and riveting action sequences.&quot;<br /> |-<br /> |style=&quot;text-align: left;&quot;|—Claudia Puig, writing in ''USA Today''&lt;ref name=&quot;Puig&quot;/&gt;<br /> |}<br /> Jonathan Kiefer, writing in the ''[[Sacramento News &amp; Review]]'', said &quot;At once sweeping and intimately confidential, with durably magnetic performances by Japan's Asano Tadanobu as the adored warlord and China's Honglei Sun as Jamukha, his blood brother and eventual enemy, ''Mongol'', a 2007 Best Foreign Language Film Oscar nominee, has to be by far the best action epic of 12th- and 13th-century Asian nomads you'll see&quot;. He emphatically believed Bodrov's film was &quot;both ancient and authentic.&quot; He added that it was &quot;commendably unhurried, and the scope swells up in a way that feels organic to a character-driven story&quot;.&lt;ref name=&quot;kiefer&quot;&gt;Kiefer, Jonathan (26 June 2008). [http://www.newsreview.com/sacramento/content?oid=684906 I think I Khan ''Mongol'']. ''[[Sacramento News &amp; Review]]''. 
Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Walter Addiego, writing for the ''[[San Francisco Chronicle]],'' said that the film offers &quot;everything you would want from an imposing historical drama: furious battles between mass armies, unquenchable love between husband and wife, blood brothers who become deadly enemies, and many episodes of betrayal and treachery&quot;. Concerning cinematography, he believed the film included &quot;plenty of haunting landscapes, gorgeously photographed by Sergei Trofimov on location in China, Kazakhstan and Mongolia, along with the sort of warfare scenes that define epics&quot;.&lt;ref&gt;Addiego, Walter (20 June 2008). [http://www.sfgate.com/cgi-bin/article.cgi?file=/c/a/2008/06/20/DDH7115QHE.DTL Review: 'Mongol' revisits Genghis Khan]. ''[[San Francisco Chronicle]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Writing for ''[[The Boston Globe]]'', [[Wesley Morris]] said that ''Mongol'' &quot;actually works as an old-fashioned production - one with breathtaking mohawks, a scary yoking, one daring escape, hottish sex, ice, snow, braying sheep, blood oaths, dehydrating dunes, throat singing, a nighttime urination, kidnapping, charged reunions, and relatively authentic entertainment values.&quot;&lt;ref&gt;Morris, Wesley (20 June 2008). [https://www.boston.com/movies/display?display=movie&amp;id=8626 When blood runs hot and cold]. ''[[The Boston Globe]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Writing for the ''[[Chicago Sun-Times]]'', film critic [[Roger Ebert]] called the film a &quot;visual spectacle, it is all but overwhelming, putting to shame some of the recent historical epics from Hollywood.&quot; Summing up, Ebert wrote &quot;The nuances of an ancient and ingeniously developed culture are passed over, and it cannot be denied that ''Mongol'' is relentlessly entertaining as an action picture.&quot;&lt;ref name=&quot;Ebert&quot;&gt;Ebert, Roger (20 June 2008). [http://rogerebert.suntimes.com/apps/pbcs.dll/article?AID=/20080619/REVIEWS/944262138/1023 Mongol] {{Webarchive|url=https://web.archive.org/web/20110716164958/http://rogerebert.suntimes.com/apps/pbcs.dll/article?AID=%2F20080619%2FREVIEWS%2F944262138%2F1023 |date=16 July 2011 }}. ''[[Chicago Sun-Times]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> {| class=&quot;toccolours&quot; style=&quot;float: right; margin-left: 1em; margin-right: 2em; font-size: 85%; background:#FFFFE0; color:black; width:30em; max-width: 40%;&quot; cellspacing=&quot;5&quot;<br /> | style=&quot;text-align: left;&quot; |&quot;''Mongol'' is a ferocious film, blood-soaked, pausing occasionally for passionate romance and more frequently for torture.&quot;<br /> |-<br /> | style=&quot;text-align: left;&quot; |—Roger Ebert, writing for the ''Chicago Sun-Times''&lt;ref name=&quot;Ebert&quot; /&gt;<br /> |}<br /> <br /> [[A. O. Scott]] of ''[[The New York Times]]'' stated that ''Mongol'' was a &quot;big, ponderous epic, its beautifully composed landscape shots punctuated by thundering hooves and bloody, slow-motion battle sequences.&quot;&lt;ref name=&quot;Scott&quot; /&gt; Scott approved of how the film encompassed &quot;rich ethnographic detail and enough dramatic intrigue to sustain a viewer's interest through the slower stretches.&quot;&lt;ref name=&quot;Scott&quot;&gt;Scott A.O., (6 June 2008). [https://movies.nytimes.com/2008/06/06/movies/06mong.html?ref=movies Forge a Unity of Purpose, Then Conquer the World]. ''[[The New York Times]]''. 
Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Similarly, [[Joe Morgenstern]] wrote in ''[[The Wall Street Journal]]'' that the film consisted of battle scenes which were as &quot;notable for their clarity as their intensity; we can follow the strategies, get a sense of who's losing and who's winning. The physical production is sumptuous.&quot; Morgenstern affirmed that ''Mongol'' was &quot;an austere epic that turns the stuff of pulp adventure into a persuasive take on ancient history.&quot;&lt;ref&gt;Morgenstern, Joe (6 June 2008). [https://www.wsj.com/articles/SB121271272126950681 'Mongol' Brings Style And Sumptuous Scale To Genghis Khan Saga]. ''[[The Wall Street Journal]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> [[Lisa Schwarzbaum]], writing for ''[[Entertainment Weekly]],'' lauded the visual qualities of the film, remarking how ''Mongol'' &quot;contrasts images of sweeping landscape and propulsive battle with potent scenes of emotional intimacy&quot;, while also referring to its &quot;quite grand, quite exotic, David Lean-style epic&quot; resemblance.&lt;ref&gt;Schwarzbaum, Lisa (6 June 2008). [https://www.ew.com/ew/article/0,,20204732,00.html Mongol (2008)] {{Webarchive|url=https://web.archive.org/web/20121020215606/http://www.ew.com/ew/article/0,,20204732,00.html |date=20 October 2012 }}. ''[[Entertainment Weekly]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> [[Kyle Smith (critic)|Kyle Smith]] of the ''[[New York Post]]'' commented that the film combined the &quot;intelligence of an action movie with the excitement of an art-house release&quot; making ''Mongol'' &quot;as dry as summer in the Gobi Desert.&quot; Smith did compliment director Bodrov on staging a &quot;couple of splattery yet artful battle scenes&quot;, but concluded that the film &quot;really isn't worth leaving your yurt for.&quot;&lt;ref&gt;Smith, Kyle (6 June 2008). [http://www.nypost.com/p/entertainment/movies/item_8rOS7t1uZ5GaL778YGJnuN;jsessionid=234EED98FA14F2F45E288D1931E248C8 Sweet Mongolia: How Genghis Got His Horde] {{Webarchive|url=https://web.archive.org/web/20121022042112/http://www.nypost.com/p/entertainment/movies/item_8rOS7t1uZ5GaL778YGJnuN;jsessionid=234EED98FA14F2F45E288D1931E248C8 |date=22 October 2012 }}. ''[[New York Post]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Author Tom Hoskyns of ''[[The Independent]]'' described the film as being &quot;very thin plot-wise.&quot; Hoskyns commended the &quot;desolate landscapes and seasonal variations&quot;, but he was not excited about the repetitious nature of the story showing the &quot;hero getting repeatedly captured and escaping.&quot;&lt;ref&gt;Hoskyns, Tom (26 September 2008). [https://www.independent.co.uk/arts-entertainment/films/reviews/dvd-mongol-15-943388.html DVD: Mongol]{{dead link|date=August 2021|bot=medic}}{{cbignore|bot=medic}}. ''[[The Independent]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> Joshua Rothkopf of ''[[Time Out (company)|Time Out]]'' said that ''Mongol'' was a &quot;Russian-produced dud.&quot; He said that it included &quot;ridiculous dialogue and Neanderthal motivations&quot; as well as bearing &quot;little relation to the raw, immediate work of his countrymates—like Andrei Tarkovsky, whose epic ''[[Andrei Rublev (film)|Andrei Rublev]]'' really gives you a sense of the dirt and desperation.&quot;&lt;ref&gt;Rothkopf, Joshua (11 June 2008). 
[https://archive.today/20130204125544/http://www.timeout.com/film/newyork/reviews/85544/mongol-the_rise_to_power_of_genghis_khan.html Mongol: The Rise to Power of Genghis Khan]. ''[[Time Out (company)|Time Out]]''. Retrieved 2011-02-16.&lt;/ref&gt;<br /> <br /> ===Accolades===<br /> The film was nominated and won several awards in 2007–09. Various critics included the film on their lists of the top 10 best films of 2008. Mike Russell of ''[[The Oregonian]]'' named it the fifth-best film of 2008,&lt;ref name=mctop08/&gt; Lawrence Toppman of ''[[The Charlotte Observer]]'' named it the eighth-best film of 2008,&lt;ref name=mctop08/&gt; and V.A. Musetto of the ''[[New York Post]]'' also named it the eighth-best film of 2008.&lt;ref name=mctop08&gt;{{cite web |url=http://apps.metacritic.com/film/awards/2008/toptens.shtml |title=Metacritic: 2008 Film Critic Top Ten Lists |publisher=[[Metacritic]] |access-date=11 January 2009 |url-status=dead |archive-url=https://web.archive.org/web/20110720094350/http://apps.metacritic.com/film/awards/2008/toptens.shtml |archive-date=20 July 2011 |df=dmy-all }}&lt;/ref&gt;<br /> <br /> {|class=&quot;wikitable&quot; border=&quot;1&quot;<br /> |-<br /> ! Award<br /> ! Category<br /> ! Nominee<br /> ! Result<br /> |-<br /> |[[80th Academy Awards]]&lt;ref&gt;{{cite web |url=http://www.oscars.org/awards/academyawards/oscarlegacy/2000-present/2008/winners.html |title=Nominees &amp; Winners for the 80th Academy Awards |access-date=2011-02-21 |publisher=Oscars.org |archive-date=12 October 2013 |archive-url=https://web.archive.org/web/20131012045551/http://www.oscars.org/awards/academyawards/oscarlegacy/2000-present/2008/winners.html |url-status=dead }}&lt;/ref&gt;<br /> |Best Foreign Language Film<br /> |<br /> |{{nom}}<br /> |-<br /> |2007 [[Asia Pacific Screen Awards]]&lt;ref&gt;{{cite web |url=http://www.asiapacificscreenawards.com/the_awards/past_winners_and_nominees/nominees/achievement_in_cinematography |title=The Awards |access-date=2011-02-21 |publisher=Asia Pacific Screen Awards |url-status=dead |archive-url=https://web.archive.org/web/20110218215728/http://www.asiapacificscreenawards.com/the_awards/past_winners_and_nominees/nominees/achievement_in_cinematography |archive-date=18 February 2011 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Achievement in Cinematography<br /> |Sergey Trofimov<br /> |{{nom}}<br /> |-<br /> |[[2nd Asian Film Awards]]&lt;ref&gt;{{cite web |url=http://www.asianfilmawards.asia/2008/eng/nominations.html#b5 |title=Nominations &amp; Winners |access-date=2011-02-21 |publisher=Asian Film Awards |url-status=dead |archive-url=https://web.archive.org/web/20120618163020/http://www.asianfilmawards.asia/2008/eng/nominations.html#b5 |archive-date=18 June 2012 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Supporting Actor<br /> |Sun Honglei<br /> |{{won}}<br /> |-<br /> ||[[14th Critics' Choice Awards|Broadcast Film Critics Association Awards 2008]]&lt;ref&gt;{{cite web |url=http://www.bfca.org/ccawards/2008.php |title=The 14th Critics' Choice Movie Awards Nominees |access-date=2011-02-21 |publisher=BFCA.org |url-status=dead |archive-url=https://web.archive.org/web/20101124011333/http://www.bfca.org/ccawards/2008.php |archive-date=24 November 2010 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Foreign Language Film<br /> |<br /> |{{nom}}<br /> |-<br /> |rowspan=2|[[European Film Awards 2008]]&lt;ref&gt;{{cite web |url=http://www.europeanfilmacademy.org/2008/11/08/nominations-pour-les-european-film-awards-2008/ |title=Nominations for the European Film Awards 2008 
|access-date=2011-02-21 |publisher=EuropeanFilmAcademy.org |archive-date=11 March 2012 |archive-url=https://web.archive.org/web/20120311025245/https://www.europeanfilmacademy.org/2008/11/08/nominations-pour-les-european-film-awards-2008/ |url-status=dead }}&lt;/ref&gt;&lt;ref&gt;{{cite web|url=http://www.europeanfilmacademy.org/2009/04/28/2008-4/ |title=The People's Choice Award 2008 |access-date=2011-02-21 |publisher=EuropeanFilmAcademy.org}}&lt;/ref&gt;<br /> |Best Cinematographer<br /> |Sergey Trofimov, Rogier Stoffers<br /> |{{nom}}<br /> |-<br /> |Best European Film<br /> |Sergey Bodrov<br /> |{{nom}}<br /> |-<br /> |rowspan=2|6th [[Golden Eagle Award (Russia)|Golden Eagle Award]]s&lt;ref&gt;{{cite web |url=http://www.kinoacademy.ru/main.php |title=Nominees &amp; Winners |access-date=2011-02-21 |publisher=KinoAcademy.ru |url-status=dead |archive-url=https://web.archive.org/web/20110717063252/http://www.kinoacademy.ru/main.php |archive-date=17 July 2011 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Costume Design<br /> |Karin Lohr<br /> |{{won}}<br /> |-<br /> |Best Sound Design<br /> |Stephan Konken<br /> |{{won}}<br /> |-<br /> |2009 40th [[NAACP Image Award]]s&lt;ref&gt;{{cite web |url=http://www.naacpimageawards.net/42/awards-show/40th/ |title=40th NAACP Image Awards |access-date=2010-06-04 |publisher=NAACP Image Awards |url-status=dead |archive-url=https://web.archive.org/web/20101215211104/http://www.naacpimageawards.net/42/awards-show/40th/ |archive-date=15 December 2010 |df=dmy-all }}&lt;/ref&gt;<br /> |Outstanding Foreign Motion Picture<br /> |<br /> |{{nom}}<br /> |-<br /> |Las Vegas Film Critics Society Awards 2008&lt;ref&gt;{{cite web |url=http://www.lvfcs.org/lvfcs/2008.html |title=2008 Sierra Award winners |access-date=2011-02-21 |publisher=lvfcs.org |url-status=dead |archive-url=https://web.archive.org/web/20120423011609/http://www.lvfcs.org/lvfcs/2008.html |archive-date=23 April 2012 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Foreign Language Film<br /> |<br /> |{{won}}<br /> |-<br /> |2008 [[National Board of Review of Motion Pictures]] Awards&lt;ref&gt;{{cite web |url=http://www.nbrmp.org/awards/past.cfm?year=2008 |title=Awards for 2008 |access-date=2011-02-21 |publisher=National Board of Review |url-status=dead |archive-url=https://web.archive.org/web/20120516165456/http://www.nbrmp.org/awards/past.cfm?year=2008 |archive-date=16 May 2012 |df=dmy-all }}&lt;/ref&gt;<br /> |Best Foreign Language Film<br /> |<br /> |{{won}}<br /> |-<br /> |rowspan=6|2008 [[Nika Award]]s&lt;ref&gt;{{cite web|url=http://www.kino-nika.com/ |title=Award Winners &amp; Nominees |access-date=2011-02-21 |publisher=Nika Awards}}&lt;/ref&gt;<br /> |Best Cinematography<br /> |Sergey Trofimov, Rogier Stoffers<br /> |{{won}}<br /> |-<br /> |Best Costume Design<br /> |Karin Lohr<br /> |{{won}}<br /> |-<br /> |Best Director<br /> |Sergey Bodrov<br /> |{{won}}<br /> |-<br /> |Best Film<br /> |<br /> |{{won}}<br /> |-<br /> |Best Production Design<br /> |Dashi Namdakov, Yelena Zhukova<br /> |{{won}}<br /> |-<br /> |Best Sound<br /> |Stephan Konken<br /> |{{won}}<br /> |}<br /> <br /> ==Sequel==<br /> ''The Great Khan'' ({{lang|ru|Великий Хан}}) is the provisional title&lt;ref&gt;{{cite web | title = Bodrov launches production company, Director's first project to be a 'Mongol' | publisher = Variety | date = 16 May 2008 | url = https://www.variety.com/article/VR1117985966.html | access-date = 2010-11-01}}&lt;/ref&gt; for the second installment of Bodrov's planned trilogy on the life of Temüjin, [[Genghis Khan]]. 
The Mongolian pop singer, [[Amarkhuu Borkhuu]], was offered a role, but declined.&lt;ref&gt;{{cite news|url=http://www.postnews.mn/index.php?cp=news&amp;task=view&amp;news_id=6522&amp;PHPSESSID=d97d127e45acbded6332c901ba3ee32d&amp;page=52&amp;PHPSESSID=d97d127e45acbded6332c901ba3ee32d|title=Б.АМАРХҮҮ С.БОДРОВТ ГОЛОГДЖЭЭ|access-date=2011-01-03|language=mn}}&lt;/ref&gt; The trilogy project was eventually put on the shelf, but in July 2013, during a visit to the annual [[Naadam|Naadam Festival]] in [[Ulan Bator]], Bodrov told the press that the production of the sequel had started again.&lt;ref name=InfoMongolia/&gt; The sequel is now called &quot;Mongol II: The Legend&quot; and started its shooting in 2019.&lt;ref&gt;{{Cite web|last=|first=|date=|title=Getaway Pictures|url=http://getawaypictures.com/?p=103|access-date=|website=}}&lt;/ref&gt;<br /> <br /> ==Soundtrack==<br /> The soundtrack for ''Mongol'', was released in the United States by the [[Varèse Sarabande]] music label on 29 July 2008.&lt;ref&gt;{{cite web |url=http://music.barnesandnoble.com/Mongol/Altan-Urag/e/30206690224/?itm=1&amp;USRI=mongol |title=Mongol Original Motion Picture Soundtrack |publisher=BarnesandNoble.com |access-date=2011-02-15 }}{{Dead link|date=July 2022 |bot=InternetArchiveBot |fix-attempted=yes }}&lt;/ref&gt; The score for the film was composed by [[Tuomas Kantelinen]], with additional music orchestrated by the Mongolian folk rock band [[Altan Urag]].&lt;ref&gt;{{cite web|url=https://movies.yahoo.com/movie/1808754771/cast |title=Mongol (2008) |access-date=2011-02-15 |publisher=Yahoo! Movies}}&lt;/ref&gt;<br /> <br /> {{Infobox album<br /> | name = Mongol: Original Motion Picture Soundtrack<br /> | type = [[Film score]]<br /> | artist = [[Tuomas Kantelinen]]<br /> | cover =<br /> | caption =<br /> | alt =<br /> | released = 07/29/2008<br /> | recorded =<br /> | venue =<br /> | studio =<br /> | genre =<br /> | length = 43:39<br /> | label = Varèse Sarabande<br /> | producer =<br /> | prev_title =<br /> | prev_year =<br /> | next_title =<br /> | next_year =<br /> }}<br /> <br /> {{Track listing<br /> | headline = ''Mongol: Original Motion Picture Soundtrack''<br /> | total_length = 43:39<br /> | title1 = Beginning<br /> | length1 = 4:35<br /> | title2 = At the Fireplace: Composed and Performed by Altan Urag<br /> | length2 = 0:48<br /> | title3 = Blood Brothers<br /> | length3 = 1:08<br /> | title4 = Chase 1: Composed and Performed by Altan Urag<br /> | length4 = 0:51<br /> | title5 = Fighting Boys<br /> | length5 = 0:53<br /> | title6 = Temüjin's Escape<br /> | length6 = 2:03<br /> | title7 = Funeral and Robbery: Composed and Performed by Altan Urag<br /> | length7 = 2:30<br /> | title8 = Together Now<br /> | length8 = 1:52<br /> | title9 = Love Theme<br /> | length9 = 1:25<br /> | title10 = Chase 2: Composed and Performed by Altan Urag<br /> | length10 = 1:36<br /> | title11 = Cold Winter<br /> | length11 = 2:30<br /> | title12 = Merkit Territory<br /> | length12 = 1:53<br /> | title13 = Attack<br /> | length13 = 0:44<br /> | title14 = Martial Rage<br /> | length14 = 1:12<br /> | title15 = Jamukha is Following<br /> | length15 = 1:30<br /> | title16 = Slavery<br /> | length16 = 1:48<br /> | title17 = Long Journey<br /> | length17 = 0:49<br /> | title18 = Destiny<br /> | length18 = 1:49<br /> | title19 = Joy in Mongolia: Composed and Performed by Altan Urag<br /> | length19 = 3:07<br /> | title20 = Final Battle, Showing Strength<br /> | length20 = 2:15<br /> | title21 = Final Battle, Tactical Order<br /> | 
length21 = 0:36<br /> | title22 = Final Battle, The First Attachment<br /> | length22 = 1:21<br /> | title23 = Final Battle, Death by Arrows<br /> | length23 = 1:55<br /> | title24 = Tengri's Help<br /> | length24 = 0:57<br /> | title25 = Victory to Khan<br /> | length25 = 1:36<br /> | title26 = No Mercy<br /> | length26 = 1:56<br /> }}<br /> <br /> ==See also==<br /> * [[List of Asian historical drama films]]<br /> * [[List of submissions to the 80th Academy Awards for Best Foreign Language Film]]<br /> * [[List of Kazakhstani submissions for the Academy Award for Best Foreign Language Film]]<br /> <br /> ==References==<br /> {{reflist|colwidth=30em}}<br /> <br /> ==External links==<br /> {{wikiquote|Mongol (film)|Mongol}}<br /> * {{IMDb title|0416044|Mongol}}<br /> * {{mojo title|mongol|Mongol}}<br /> * {{rotten-tomatoes|mongol|Mongol}}<br /> * {{Metacritic film|title=Mongol}}<br /> <br /> {{Sergei Bodrov}}<br /> {{Nika Award Best Picture}}<br /> {{National Board of Review Award for Best Foreign Language Film}}<br /> <br /> {{DEFAULTSORT:Mongol (Film)}}<br /> [[Category:2007 films]]<br /> [[Category:2007 biographical drama films]]<br /> [[Category:2000s historical adventure films]]<br /> [[Category:2000s war films]]<br /> [[Category:Adventure films based on actual events]]<br /> [[Category:Biographical action films]]<br /> [[Category:Depictions of Genghis Khan on film]]<br /> [[Category:Films directed by Sergei Bodrov]]<br /> [[Category:Films scored by Tuomas Kantelinen]]<br /> [[Category:Films set in Mongolia]]<br /> [[Category:Films set in the 12th century]]<br /> [[Category:Films set in the 13th century]]<br /> [[Category:Films set in the Mongol Empire]]<br /> [[Category:Films shot in China]]<br /> [[Category:Films shot in Kazakhstan]]<br /> [[Category:German biographical drama films]]<br /> [[Category:German historical drama films]]<br /> [[Category:German war drama films]]<br /> [[Category:Kazakhstani war drama films]]<br /> [[Category:2000s Mandarin-language films]]<br /> [[Category:Mongolian drama films]]<br /> [[Category:Mongolian-language films]]<br /> [[Category:Picturehouse films]]<br /> [[Category:Russian biographical drama films]]<br /> [[Category:Russian war drama films]]<br /> [[Category:Russian historical drama films]]<br /> [[Category:Universal Pictures films]]<br /> [[Category:New Line Cinema films]]<br /> [[Category:War epic films]]<br /> [[Category:War films based on actual events]]<br /> [[Category:2007 drama films]]<br /> [[Category:2000s German films]]<br /> [[Category:2007 multilingual films]]<br /> [[Category:Russian multilingual films]]<br /> [[Category:German multilingual films]]<br /> [[Category:Kazakhstani multilingual films]]<br /> [[Category:Kazakhstani historical drama films]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Red-giant_branch&diff=1170910459 Red-giant branch 2023-08-17T23:09:47Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|Portion of the giant branch before helium ignition}}<br /> [[File:M5 colour magnitude diagram.png|thumb|right|upright=1.4|[[Hertzsprung–Russell diagram]] for [[globular cluster]] [[Messier 5|M5]]. The red-giant branch runs from the thin horizontal [[subgiant]] branch to the top right, with a number of the more luminous RGB stars marked in red.]]<br /> The '''red-giant branch''' (RGB), sometimes called the first giant branch, is the portion of the giant branch before helium ignition occurs in the course of [[stellar evolution]]. 
It is a stage that follows the [[main sequence]] for low- to intermediate-mass stars. Red-giant-branch stars have an inert [[helium]] core surrounded by a shell of [[hydrogen]] fusing via the [[CNO cycle]]. They are K- and M-class stars much larger and more luminous than main-sequence stars of the same temperature.<br /> <br /> ==Discovery==<br /> [[File:NGC 288 HST.jpg|thumb|left|upright=1.0|The brightest stars in [[globular cluster]]s such as [[NGC 288]] are red giants.]]<br /> [[Red giant]]s were identified early in the 20th century when the use of the [[Hertzsprung–Russell diagram]] made it clear that there were two distinct types of cool stars with very different sizes: dwarfs, now formally known as the [[main sequence]]; and [[giant star|giant]]s.&lt;ref name=adamsjoy&gt;{{cite journal|bibcode=1921ApJ....53...13A|title=The parallaxes of 1646 stars derived by the spectroscopic method|journal=Astrophysical Journal|volume=53|pages=13|last1=Adams|first1=W. S.|last2=Joy|first2=A. H.|last3=Stromberg|first3=G.|last4=Burwell|first4=C. G.|year=1921|doi=10.1086/142584}}&lt;/ref&gt;&lt;ref name=trumpler&gt;{{cite journal|bibcode=1925PASP...37..307T|title=Spectral Types in Open Clusters|journal=Publications of the Astronomical Society of the Pacific|volume=37|issue=220|pages=307|last1=Trumpler|first1=R. J.|year=1925|doi=10.1086/123509|doi-access=free}}&lt;/ref&gt;<br /> <br /> The term ''red-giant branch'' came into use during the 1940s and 1950s, although initially just as a general term to refer to the red-giant region of the Hertzsprung–Russell diagram. Although the basis of a thermonuclear main-sequence lifetime, followed by a thermodynamic contraction phase to a [[white dwarf]] was understood by 1940, the internal details of the various types of giant stars were not known.&lt;ref name=gamow1939&gt;{{cite journal|doi=10.1103/PhysRev.55.718|title=Physical Possibilities of Stellar Evolution|journal=Physical Review|volume=55|issue=8|pages=718–725|year=1939|last1=Gamow|first1=G.|bibcode = 1939PhRv...55..718G }}&lt;/ref&gt;<br /> <br /> In 1968, the name [[asymptotic giant branch]] (AGB) was used for a branch of stars somewhat more luminous than the bulk of red giants and more unstable, often large-amplitude [[variable star]]s such as [[Mira]].&lt;ref name=sandage&gt;{{cite journal|bibcode=1968ApJ...153L.129S|title=An Indication of Gaps in the Giant Branch of the Globular Cluster M15|journal=Astrophysical Journal|volume=153|pages=L129|last1=Sandage|first1=Allan|last2=Katem|first2=Basil|last3=Kristian|first3=Jerome|year=1968|doi=10.1086/180237}}&lt;/ref&gt; Observations of a bifurcated giant branch had been made years earlier but it was unclear how the different sequences were related.&lt;ref name=arp&gt;{{cite journal|bibcode=1953AJ.....58....4A|title=The color-magnitude diagram of the globular cluster M 92|journal=Astronomical Journal|volume=58|pages=4|last1=Arp|first1=Halton C.|last2=Baum|first2=William A.|last3=Sandage|first3=Allan R.|year=1953|doi=10.1086/106800|doi-access=free}}&lt;/ref&gt; By 1970, the red-giant region was well understood as being made up from [[subgiant]]s, the RGB itself, the [[horizontal branch]], and the AGB, and the evolutionary state of the stars in these regions was broadly understood.&lt;ref name=strom&gt;{{cite journal|bibcode=1970A&amp;A.....8..243S|title=On the Evolutionary Status of Stars above the Horizontal Branch in Globular Clusters|journal=Astronomy and Astrophysics|volume=8|pages=243|last1=Strom|first1=S.
E.|last2=Strom|first2=K. M.|last3=Rood|first3=R. T.|last4=Iben|first4=I.|year=1970}}&lt;/ref&gt; The red-giant branch was described as the first giant branch in 1967, to distinguish it from the second or asymptotic giant branch,&lt;ref name=iben&gt;{{cite journal|bibcode=1967ARA&amp;A...5..571I|title=Stellar Evolution Within and off the Main Sequence|journal=Annual Review of Astronomy and Astrophysics|volume=5|pages=571–626|last1=Iben|first1=Icko|year=1967|doi=10.1146/annurev.aa.05.090167.003035}}&lt;/ref&gt; and this terminology is still frequently used today.&lt;ref name=pols/&gt;<br /> <br /> Modern stellar physics has modelled the internal processes that produce the different phases of the post-main-sequence life of moderate-mass stars,&lt;ref name=vassiliaid&gt;{{cite journal|bibcode=1993ApJ...413..641V|title=Evolution of low- and intermediate-mass stars to the end of the asymptotic giant branch with mass loss|journal=Astrophysical Journal|volume=413|pages=641|last1=Vassiliadis|first1=E.|last2=Wood|first2=P. R.|year=1993|doi=10.1086/173033|doi-access=free}}&lt;/ref&gt; with ever-increasingly complexity and precision.&lt;ref name=marigo&gt;{{cite journal|bibcode=2008A&amp;A...482..883M |doi=10.1051/0004-6361:20078467 |title=Evolution of asymptotic giant branch stars |journal=Astronomy and Astrophysics |volume=482 |issue=3 |pages=883–905 |year=2008 |last1=Marigo |first1=P. |last2=Girardi |first2=L. |last3=Bressan |first3=A. |last4=Groenewegen |first4=M. A. T. |last5=Silva |first5=L. |last6=Granato |first6=G. L. |arxiv = 0711.4922 |s2cid=15076538 }}&lt;/ref&gt; The results of RGB research are themselves being used as the basis for research in other areas.&lt;ref name=rizzi&gt;{{cite journal|bibcode=2007ApJ...661..815R|arxiv=astro-ph/0701518|title=Tip of the Red Giant Branch Distances. II. Zero-Point Calibration|journal=The Astrophysical Journal|volume=661|issue=2|pages=815–829|last1=Rizzi|first1=Luca|last2=Tully|first2=R. Brent|last3=Makarov|first3=Dmitry|last4=Makarova|first4=Lidia|last5=Dolphin|first5=Andrew E.|last6=Sakai|first6=Shoko|last7=Shaya|first7=Edward J.|year=2007|doi=10.1086/516566|s2cid=12864247}}&lt;/ref&gt;<br /> <br /> ==Evolution==<br /> [[File:Zams and tracks.png|thumb|right|upright=1.5|Evolutionary tracks for stars of different masses:{{unordered list| the {{solar mass|0.6|link=y}} track shows the RGB and stops at the [[helium flash]].| the {{solar mass|1}} track shows a short but long-lasting subgiant branch and the RGB to the helium flash.| the {{solar mass|2}} track shows the [[subgiant]] branch and RGB, with a barely detectable blue loop onto the [[asymptotic giant branch|AGB]].| the {{solar mass|5}} track shows a long but very brief subgiant branch, a short RGB and an extended blue loop.}}]]<br /> When a star with a mass from about {{solar mass|0.4}} ([[solar mass]]) to {{solar mass|12}} ({{solar mass|8}} for low-metallicity stars) exhausts its core hydrogen, it enters a phase of hydrogen shell burning during which it becomes a red giant, larger and cooler than on the main sequence. During hydrogen shell burning, the interior of the star goes through several distinct stages which are reflected in the outward appearance. The evolutionary stages vary depending primarily on the mass of the star, but also on its [[metallicity]].<br /> <br /> ===Subgiant phase===<br /> After a main-sequence star has exhausted its core hydrogen, it begins to fuse hydrogen in a thick shell around a core consisting largely of helium. 
The mass of the helium core is below the [[Schönberg–Chandrasekhar limit]] and is in [[thermal equilibrium]], and the star is a [[subgiant]]. Any additional energy production from the shell fusion is consumed in inflating the envelope, and the star cools but does not increase in luminosity.&lt;ref name=catelan&gt;{{cite conference|bibcode=2007AIPC..930...39C|arxiv=astro-ph/0703724|title=Structure and Evolution of Low-Mass Stars: An Overview and Some Open Problems|conference=GRADUATE SCHOOL IN ASTRONOMY: XI Special Courses at the National Observatory of Rio de Janeiro (XI CCE). AIP Conference Proceedings|volume=930|pages=39–90|last1=Catelan|first1=Márcio|last2=Roig|first2=Fernando|last3=Alcaniz|first3=Jailson|last4=de la Reza|first4=Ramiro|last5=Lopes|first5=Dalton|year=2007|doi=10.1063/1.2790333|s2cid=15599804}}&lt;/ref&gt;<br /> <br /> Shell hydrogen fusion continues in stars of roughly solar mass until the helium core increases in mass sufficiently that it becomes [[degenerate matter|degenerate]]. The core then shrinks, heats up and develops a strong temperature gradient. The hydrogen shell, fusing via the temperature-sensitive [[CNO cycle]], greatly increases its rate of energy production, and the star is then considered to be at the foot of the red-giant branch. For a star of the same mass as the Sun, this takes approximately 2 billion years from the time that hydrogen was exhausted in the core.&lt;ref name=salaris2005&gt;{{cite journal|bibcode=2005essp.book.....S|title=Evolution of Stars and Stellar Populations|url=https://archive.org/details/evolutionofstars0000sala|url-access=registration|journal=Evolution of Stars and Stellar Populations|pages=400|last1=Salaris|first1=Maurizio|last2=Cassisi|first2=Santi|year=2005}}&lt;/ref&gt;<br /> <br /> Subgiants more massive than about {{solar mass|2}} reach the Schönberg–Chandrasekhar limit relatively quickly before the core becomes degenerate. The core still supports its own weight thermodynamically with the help of energy from the hydrogen shell, but is no longer in thermal equilibrium. It shrinks and heats, causing the hydrogen shell to become thinner and the stellar envelope to inflate. This combination decreases luminosity as the star cools towards the foot of the RGB. Before the core becomes degenerate, the outer hydrogen envelope becomes opaque, which causes the star to stop cooling and increases the rate of fusion in the shell; at this point the star has entered the RGB. In these stars, the subgiant phase occurs within a few million years, causing an apparent gap in the Hertzsprung–Russell diagram between [[B-type main-sequence star]]s and the RGB seen in young [[open cluster]]s such as [[Praesepe]]. This is the [[Hertzsprung gap]] and is actually sparsely populated with subgiant stars rapidly evolving towards red giants, in contrast to the short, densely populated low-mass subgiant branch seen in older clusters such as [[ω Centauri]].&lt;ref name=mermilliod&gt;{{cite journal|bibcode=1981A&amp;A....97..235M|title=Comparative studies of young open clusters. III – Empirical isochronous curves and the zero age main sequence|journal=Astronomy and Astrophysics|volume=97|pages=235|last1=Mermilliod|first1=J.
C.|year=1981}}&lt;/ref&gt;&lt;ref name=bedin&gt;{{cite journal|bibcode=2004ApJ...605L.125B|arxiv=astro-ph/0403112|title=Ω Centauri: The Population Puzzle Goes Deeper|journal=The Astrophysical Journal|volume=605|issue=2|pages=L125|last1=Bedin|first1=Luigi R.|last2=Piotto|first2=Giampaolo|last3=Anderson|first3=Jay|last4=Cassisi|first4=Santi|last5=King|first5=Ivan R.|last6=Momany|first6=Yazan|last7=Carraro|first7=Giovanni|year=2004|doi=10.1086/420847|s2cid=2799751|url=https://zenodo.org/record/968404}}&lt;/ref&gt;<br /> <br /> ===Ascending the red-giant branch===<br /> [[File:Evolutionary track 1m.svg|thumb|Sun-like stars have a degenerate core on the red-giant branch and ascend to the tip before starting core helium fusion with a flash.]]<br /> [[File:Evolutionary track 5m.svg|thumb|Stars more massive than the Sun do not have a degenerate core and leave the red-giant branch before the tip when their core helium ignites without a flash.]]<br /> Stars at the foot of the red-giant branch all have a similar temperature around {{Val|5000|fmt=commas|ul=K}}, corresponding to an early to mid-K spectral type. Their luminosities range from a few times the luminosity of the sun for the least massive red giants to several thousand times as luminous for stars around {{solar mass|8}}.&lt;ref name=vandenberg&gt;{{cite journal|bibcode=2006ApJS..162..375V|arxiv=astro-ph/0510784|title=The Victoria-Regina Stellar Models: Evolutionary Tracks and Isochrones for a Wide Range in Mass and Metallicity that Allow for Empirically Constrained Amounts of Convective Core Overshooting|journal=The Astrophysical Journal Supplement Series|volume=162|issue=2|pages=375–387|last1=Vandenberg|first1=Don A.|last2=Bergbusch|first2=Peter A.|last3=Dowler|first3=Patrick D.|year=2006|doi=10.1086/498451|s2cid=1791448}}&lt;/ref&gt;<br /> <br /> As their hydrogen shells continue to produce more helium, the cores of RGB stars increase in mass and temperature. This causes the hydrogen shell to fuse more rapidly. Stars become more luminous, larger and somewhat cooler. They are described as ascending the RGB.&lt;ref name=hekker&gt;{{cite journal|bibcode=2011MNRAS.414.2594H|arxiv=1103.0141|title=Characterization of red giant stars in the public Kepler data|journal=Monthly Notices of the Royal Astronomical Society|volume=414|issue=3|pages=2594|last1=Hekker|first1=S.|last2=Gilliland|first2=R. L.|last3=Elsworth|first3=Y.|last4=Chaplin|first4=W. J.|last5=De Ridder|first5=J.|last6=Stello|first6=D.|last7=Kallinger|first7=T.|last8=Ibrahim|first8=K. A.|last9=Klaus|first9=T. C.|last10=Li|first10=J.|year=2011|doi=10.1111/j.1365-2966.2011.18574.x|s2cid=118513871}}&lt;/ref&gt;<br /> <br /> On the ascent of the RGB, there are a number of internal events that produce observable external features. The outer [[Convective zone|convective envelope]] becomes deeper and deeper as the star grows and shell energy production increases. Eventually it reaches deep enough to bring fusion products to the surface from the formerly convective core, known as the first [[dredge-up]]. 
This changes the surface abundance of helium, carbon, nitrogen and oxygen.&lt;ref name=stoesz&gt;{{cite journal|bibcode=2003MNRAS.340..763S|arxiv=astro-ph/0212128|title=Oxygen isotopic ratios in first dredge-up red giant stars and nuclear reaction rate uncertainties revisited|journal=Monthly Notices of the Royal Astronomical Society|volume=340|issue=3|pages=763|last1=Stoesz|first1=Jeffrey A.|last2=Herwig|first2=Falk|year=2003|doi=10.1046/j.1365-8711.2003.06332.x|s2cid=14107804}}&lt;/ref&gt; A noticeable clustering of stars at one point on the RGB can be detected and is known as the RGB bump. It is caused by a discontinuity in hydrogen abundance left behind by the deep convection. Shell energy production temporarily decreases at this discontinuity, effectively stalling the ascent of the RGB and causing an excess of stars at that point.&lt;ref name=cassisi&gt;{{cite journal|bibcode=2011A&amp;A...527A..59C|arxiv=1012.0419|title=The magnitude difference between the main sequence turn off and the red giant branch bump in Galactic globular clusters|journal=Astronomy &amp; Astrophysics|volume=527|pages=A59|last1=Cassisi|first1=S.|last2=Marín-Franch|first2=A.|last3=Salaris|first3=M.|last4=Aparicio|first4=A.|last5=Monelli|first5=M.|last6=Pietrinferni|first6=A.|year=2011|doi=10.1051/0004-6361/201016066|s2cid=56067351}}&lt;/ref&gt;<br /> <br /> ===Tip of the red-giant branch===<br /> {{Main | Tip of the red-giant branch}}<br /> For stars with a degenerate helium core, there is a limit to this growth in size and luminosity, known as the [[tip of the red-giant branch]], where the core reaches sufficient temperature to begin helium fusion. All stars that reach this point have an identical helium core mass of almost {{solar mass|0.5}}, and very similar stellar luminosity and temperature. These luminous stars have been used as standard candle distance indicators. Visually, the tip of the red-giant branch occurs at about absolute magnitude −3 and temperatures around 3,000 K at solar metallicity, closer to 4,000 K at very low metallicity.&lt;ref name=vandenberg/&gt;&lt;ref name=lee&gt;{{cite journal|bibcode=1993ApJ...417..553L|title=The Tip of the Red Giant Branch as a Distance Indicator for Resolved Galaxies|journal=Astrophysical Journal |volume=417|pages=553|last1=Lee|first1=Myung Gyoon|last2=Freedman|first2=Wendy L.|last3=Madore|first3=Barry F.|year=1993|doi=10.1086/173334|doi-access=free}}&lt;/ref&gt; Models predict a luminosity at the tip of {{solar luminosity|2000–2500}}, depending on metallicity.&lt;ref name=salaris11997&gt;{{cite journal|bibcode=1997MNRAS.289..406S|arxiv=astro-ph/9703186|title=The 'tip' of the red giant branch as a distance indicator: Results from evolutionary models|journal=Monthly Notices of the Royal Astronomical Society|volume=289|issue=2|pages=406|last1=Salaris|first1=Maurizio|last2=Cassisi|first2=Santi|year=1997|doi=10.1093/mnras/289.2.406|s2cid=18796954}}&lt;/ref&gt; In modern research, infrared magnitudes are more commonly used.&lt;ref name=conn&gt;{{cite journal|doi=10.1088/0004-637X/758/1/11|arxiv=1209.4952|title=A Bayesian Approach to Locating the Red Giant Branch Tip Magnitude. II. Distances to the Satellites of M31|journal=The Astrophysical Journal|volume=758|issue=1|pages=11|year=2012|last1=Conn|first1=A. R.|last2=Ibata|first2=R. A.|last3=Lewis|first3=G. F.|last4=Parker|first4=Q. A.|last5=Zucker|first5=D. B.|last6=Martin|first6=N. F.|last7=McConnachie|first7=A. W.|last8=Irwin|first8=M. J.|last9=Tanvir|first9=N.|last10=Fardal|first10=M. A.|last11=Ferguson|first11=A. M.
N.|last12=Chapman|first12=S. C.|last13=Valls-Gabaud|first13=D.|bibcode = 2012ApJ...758...11C |s2cid=53556162}}&lt;/ref&gt;<br /> <br /> ===Leaving the red-giant branch===<br /> A degenerate core begins fusion explosively in an event known as the [[helium flash]], but externally there is little immediate sign of it. The energy is consumed in lifting the degeneracy in the core. The star overall becomes less luminous and hotter and migrates to the horizontal branch. All degenerate helium cores have approximately the same mass, regardless of the total stellar mass, so the helium fusion luminosity on the horizontal branch is the same. Hydrogen shell fusion can cause the total stellar luminosity to vary, but for most stars at near solar metallicity, the temperature and luminosity are very similar at the cool end of the horizontal branch. These stars form the [[red clump]] at about 5,000 K and {{solar luminosity|50}}. Less massive hydrogen envelopes cause the stars to take up a hotter and less luminous position on the horizontal branch, and this effect occurs more readily at low metallicity so that old metal-poor clusters show the most pronounced horizontal branches.&lt;ref name=salaris2005/&gt;&lt;ref name=dantona&gt;{{cite journal|arxiv=astro-ph/0209331|bibcode=2002A&amp;A...395...69D|doi=10.1051/0004-6361:20021220|title=Helium variation due to self-pollution among Globular Cluster stars|journal=Astronomy and Astrophysics|volume=395|pages=69–76|year=2002|last1=d'Antona|first1=F.|last2=Caloi|first2=V.|last3=Montalbán|first3=J.|last4=Ventura|first4=P.|last5=Gratton|first5=R.|s2cid=15262502}}&lt;/ref&gt;<br /> <br /> Stars initially more massive than {{solar mass|2}} have non-degenerate helium cores on the red-giant branch. These stars become hot enough to start triple-alpha fusion before they reach the tip of the red-giant branch and before the core becomes degenerate. They then leave the red-giant branch and perform a blue loop before returning to join the asymptotic giant branch. Stars only a little more massive than {{solar mass|2}} perform a barely noticeable blue loop at a few hundred {{solar luminosity}} before continuing on the AGB hardly distinguishable from their red-giant branch position. More massive stars perform extended blue loops which can reach 10,000 K or more at luminosities of {{solar luminosity|thousands of}}. These stars will cross the [[instability strip]] more than once and pulsate as [[Classical Cepheid variable|Type I (Classical) Cepheid variable]]s.&lt;ref name=bono&gt;{{cite journal|bibcode=2000ApJ...543..955B|arxiv=astro-ph/0006251|title=Intermediate-Mass Star Models with Different Helium and Metal Contents|journal=The Astrophysical Journal|volume=543|issue=2|pages=955|last1=Bono|first1=Giuseppe|last2=Caputo|first2=Filippina|last3=Cassisi|first3=Santi|last4=Marconi|first4=Marcella|last5=Piersanti|first5=Luciano|last6=Tornambè|first6=Amedeo|year=2000|doi=10.1086/317156|s2cid=18898755}}&lt;/ref&gt;<br /> <br /> ===Properties===<br /> The table below shows the typical lifetimes on the main sequence (MS), subgiant branch (SB) and red-giant branch (RGB), for stars with different initial masses, all at solar metallicity (Z = 0.02). Also shown are the helium core mass, surface effective temperature, radius and luminosity at the start and end of the RGB for each star. 
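These radii, temperatures and luminosities are mutually consistent through the [[Stefan–Boltzmann law]], &lt;math&gt;L/L_\odot = (R/R_\odot)^2 (T_\mathrm{eff}/T_{\mathrm{eff},\odot})^4&lt;/math&gt;; as an illustrative check, adopting a solar effective temperature of about 5,772&amp;nbsp;K (a standard value, not taken from the references cited here), the {{solar mass|1}} model at the end of the RGB gives &lt;math&gt;179^2 \times (3140/5772)^4 \approx 2800&lt;/math&gt;, in good agreement with the tabulated luminosity of 2,802&amp;nbsp;{{solar luminosity}}.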
The end of the red-giant branch is defined to be when core helium ignition takes place.&lt;ref name=pols&gt;{{cite journal|bibcode=1998MNRAS.298..525P|title=Stellar evolution models for Z = 0.0001 to 0.03|journal=Monthly Notices of the Royal Astronomical Society|volume=298|issue=2|pages=525|last1=Pols|first1=Onno R.|last2=Schröder|first2=Klaus-Peter|last3=Hurley|first3=Jarrod R.|last4=Tout|first4=Christopher A.|last5=Eggleton|first5=Peter P.|year=1998|doi=10.1046/j.1365-8711.1998.01658.x|doi-access=free}}&lt;/ref&gt;<br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! rowspan=2 | Mass&lt;br/&gt;({{solar mass}}) !! rowspan=2 | MS (GYrs) <br /> ! rowspan=&quot;2&quot; |Hook (MYrs)!! rowspan=&quot;2&quot; | SB (MYrs) !! rowspan=2 | RGB&lt;br/&gt;(MYrs) !! colspan=4 | RGB&lt;sub&gt;foot&lt;/sub&gt;&lt;br/&gt; !! colspan=4 | RGB&lt;sub&gt;end&lt;/sub&gt;&lt;br/&gt;<br /> |-<br /> ! Core mass ({{solar mass}}) !! T&lt;sub&gt;eff&lt;/sub&gt; (K) !! Radius ({{solar radius}}) !! Luminosity ({{solar luminosity}}) !! Core mass ({{solar mass}}) !! T&lt;sub&gt;eff&lt;/sub&gt; (K) !! Radius ({{solar radius}}) !! Luminosity ({{solar luminosity}})<br /> |- style=&quot;text-align:right;&quot;<br /> | 0.6 || 58.8 <br /> |N/A|| 5,100 || 2,500 || 0.10 || 4,634 || 1.2 || 0.6 || 0.48 || 2,925 || 207 || 2,809<br /> |- style=&quot;text-align:right;&quot;<br /> | 1.0 || 9.3 <br /> |N/A|| 2,600 || 760 || 0.13 || 5,034 || 2.0 || 2.2 || 0.48 || 3,140 || 179 || 2,802<br /> |- style=&quot;text-align:right;&quot;<br /> | 2.0 || 1.2 <br /> |10|| 22 || 25 || 0.25 || 5,220 || 5.4 || 19.6 || 0.34 || 4,417 || 23.5 || 188<br /> |- style=&quot;text-align:right;&quot;<br /> | 5.0 || 0.1 <br /> |0.4|| 15 || 0.3 || 0.83 || 4,737 || 43.8 || 866.0 || 0.84 || 4,034 || 115 || 3,118<br /> |}<br /> <br /> Intermediate-mass stars only lose a small fraction of their mass as main-sequence and subgiant stars, but lose a significant amount of mass as red giants.&lt;ref name=meynet&gt;{{cite journal|bibcode=1993A&amp;AS...98..477M|title=New dating of galactic open clusters|journal=Astronomy and Astrophysics Supplement Series|volume=98|pages=477|last1=Meynet|first1=G.|last2=Mermilliod|first2=J.-C.|last3=Maeder|first3=A.|year=1993}}&lt;/ref&gt;<br /> <br /> The mass lost by a star similar to the Sun affects the temperature and luminosity of the star when it reaches the horizontal branch, so the properties of red-clump stars can be used to determine the mass difference before and after the helium flash. Mass lost from red giants also determines the mass and properties of the [[white dwarf]]s that form subsequently. Estimates of total mass loss for stars that reach the tip of the red-giant branch are around {{solar mass|0.2–0.25}}. 
Most of this mass is lost within the final million years before the helium flash.&lt;ref name=origlia&gt;{{cite journal|bibcode=2002ApJ...571..458O|title=ISOCAM Observations of Galactic Globular Clusters: Mass Loss along the Red Giant Branch|journal=The Astrophysical Journal|volume=571|issue=1|pages=458–468|last1=Origlia|first1=Livia|last2=Ferraro|first2=Francesco R.|last3=Fusi Pecci|first3=Flavio|last4=Rood|first4=Robert T.|year=2002|doi=10.1086/339857|arxiv = astro-ph/0201445 |s2cid=18299018}}&lt;/ref&gt;&lt;ref name=mcdonald&gt;{{cite journal|bibcode=2011ApJS..193...23M|arxiv=1101.1095|title=Fundamental Parameters, Integrated Red Giant Branch Mass Loss, and Dust Production in the Galactic Globular Cluster 47 Tucanae|journal=The Astrophysical Journal Supplement|volume=193|issue=2|pages=23|last1=McDonald|first1=I.|last2=Boyer|first2=M. L.|last3=Van Loon|first3=J. Th.|last4=Zijlstra|first4=A. A.|last5=Hora|first5=J. L.|last6=Babler|first6=B.|last7=Block|first7=M.|last8=Gordon|first8=K.|last9=Meade|first9=M.|last10=Meixner|first10=M.|last11=Misselt|first11=K.|last12=Robitaille|first12=T.|last13=Sewiło|first13=M.|last14=Shiao|first14=B.|last15=Whitney|first15=B.|year=2011|doi=10.1088/0067-0049/193/2/23|s2cid=119266025}}&lt;/ref&gt;<br /> <br /> Mass lost by more massive stars that leave the red-giant branch before the helium flash is more difficult to measure directly. The current mass of Cepheid variables such as [[δ Cephei]] can be measured accurately because they are either binaries or pulsating stars. When compared with evolutionary models, such stars appear to have lost around 20% of their mass, much of it during the blue loop and especially during pulsations on the instability strip.&lt;ref name=xu&gt;{{cite journal|bibcode=2004A&amp;A...418..213X|title=Blue loops of intermediate mass stars. I. CNO cycles and blue loops|journal=Astronomy and Astrophysics|volume=418|pages=213–224|last1=Xu|first1=H. Y.|last2=Li|first2=Y.|year=2004|doi=10.1051/0004-6361:20040024|doi-access=free}}&lt;/ref&gt;&lt;ref name=neilson&gt;{{cite journal|bibcode=2011A&amp;A...529L...9N|arxiv=1104.1638|title=The Cepheid mass discrepancy and pulsation-driven mass loss|journal=Astronomy &amp; Astrophysics|volume=529|pages=L9|last1=Neilson|first1=H. R.|last2=Cantiello|first2=M.|last3=Langer|first3=N.|year=2011|doi=10.1051/0004-6361/201116920|s2cid=119180438}}&lt;/ref&gt;<br /> <br /> ==Variability==<br /> Some [[red giant]]s are large-amplitude variables. Many of the earliest-known variable stars are [[Mira variable]]s with regular periods and amplitudes of several magnitudes, [[semiregular variable]]s with less obvious periods or multiple periods and slightly lower amplitudes, and [[slow irregular variable]]s with no obvious period. These have long been considered to be [[asymptotic giant branch]] (AGB) stars or supergiants, and the red-giant branch (RGB) stars themselves were not generally considered to be variable. A few apparent exceptions were considered to be low-luminosity AGB stars.&lt;ref name=kiss&gt;{{cite journal|bibcode=2003MNRAS.343L..79K|arxiv=astro-ph/0306426|title=Red variables in the OGLE-II data base – I. Pulsations and period-luminosity relations below the tip of the red giant branch of the Large Magellanic Cloud|journal=Monthly Notices of the Royal Astronomical Society|volume=343|issue=3|pages=L79|last1=Kiss|first1=L. L.|last2=Bedding|first2=T.
R.|year=2003|doi=10.1046/j.1365-8711.2003.06931.x|s2cid=2383837}}&lt;/ref&gt;<br /> <br /> Studies in the late 20th century began to show that all giants of class M were variable with amplitudes of 10 milli-magnitudes or more, and that late-K-class giants were also likely to be variable with smaller amplitudes. Such variable stars were amongst the more luminous red giants, close to the tip of the RGB, but it was difficult to argue that they were all actually AGB stars. The stars showed a period–amplitude relationship, with larger-amplitude variables pulsating more slowly.&lt;ref name=jorissen&gt;{{cite journal|bibcode=1997A&amp;A...324..578J|title=The onset of photometric variability in red giant stars|journal=Astronomy and Astrophysics|volume=324|pages=578|last1=Jorissen|first1=A.|last2=Mowlavi|first2=N.|last3=Sterken|first3=C.|last4=Manfroid|first4=J.|year=1997}}&lt;/ref&gt;<br /> <br /> [[Microlensing Observations in Astrophysics|Microlensing surveys]] in the 21st century have provided extremely accurate photometry of thousands of stars over many years. This has allowed for the discovery of many new variable stars, often of very small amplitudes. Multiple [[period-luminosity relationship]]s have been discovered, grouped into regions with ''ridges'' of closely spaced parallel relationships. Some of these correspond to the known Miras and semi-regulars, but an additional class of variable star has been defined: [[OGLE]] Small Amplitude Red Giants, or [[Long-period variable star|OSARGs]]. OSARGs have amplitudes of a few thousandths of a magnitude and semi-regular periods of 10 to 100 days. The OGLE survey published up to three periods for each OSARG, indicating a complex combination of pulsations. Many thousands of OSARGs were quickly detected in the [[Magellanic Clouds]], both AGB and RGB stars.&lt;ref name=soszynski&gt;{{cite journal|bibcode=2007AcA....57..201S|arxiv=0710.2780|title=The Optical Gravitational Lensing Experiment. Period—Luminosity Relations of Variable Red Giant Stars|journal=Acta Astronomica|volume=57|pages=201|last1=Soszynski|first1=I.|last2=Dziembowski|first2=W. A.|last3=Udalski|first3=A.|last4=Kubiak|first4=M.|last5=Szymanski|first5=M. K.|last6=Pietrzynski|first6=G.|last7=Wyrzykowski|first7=L.|last8=Szewczyk|first8=O.|last9=Ulaczyk|first9=K.|year=2007}}&lt;/ref&gt; A catalog has since been published of 192,643 OSARGs in the direction of the [[Milky Way]] central bulge. Although around a quarter of Magellanic Cloud OSARGs show long secondary periods, very few of the galactic OSARGs do.&lt;ref name=soszynski2013&gt;{{cite journal|bibcode=2013AcA....63...21S|arxiv=1304.2787|title=The Optical Gravitational Lensing Experiment. The OGLE-III Catalog of Variable Stars. XV. Long-Period Variables in the Galactic Bulge|journal=Acta Astronomica|volume=63|issue=1|pages=21|last1=Soszyński|first1=I.|last2=Udalski|first2=A.|last3=Szymański|first3=M.
K.|last4=Kubiak|first4=M.|last5=Pietrzyński|first5=G.|last6=Wyrzykowski|first6=Ł.|last7=Ulaczyk|first7=K.|last8=Poleski|first8=R.|last9=Kozłowski|first9=S.|last10=Pietrukowicz|first10=P.|last11=Skowron|first11=J.|year=2013}}&lt;/ref&gt;<br /> <br /> The RGB OSARGs follow three closely spaced period-luminosity relations, corresponding to the first, second and third [[overtone]]s of [[radial pulsations|radial pulsation]] models for stars of certain masses and luminosities, but dipole and quadrupole non-radial pulsations are also present, leading to the semi-regular nature of the variations.&lt;ref name=takayama&gt;{{cite journal|bibcode=2013EPJWC..4303013T|title=On the pulsation modes and masses of RGB OSARGs|journal=40th Liège International Astrophysical Colloquium. Ageing Low Mass Stars: From Red Giants to White Dwarfs|volume=43|pages=03013|last1=Takayama|first1=M.|last2=Saio|first2=H.|last3=Ita|first3=Y.|year=2013|doi=10.1051/epjconf/20134303013|doi-access=free}}&lt;/ref&gt; The [[fundamental mode]] does not appear, and the underlying cause of the excitation is not known. [[Stochastic]] convection has been suggested as a cause, similar to [[solar-like oscillations]].&lt;ref name=soszynski/&gt;<br /> <br /> Two additional types of variation have been discovered in RGB stars: long secondary periods, which are associated with other variations but can show larger amplitudes with periods of hundreds or thousands of days; and ''ellipsoidal'' variations. The cause of the long secondary periods is unknown, but it has been proposed that they are due to interactions with low-mass companions in close orbits.&lt;ref name=nicholls2009&gt;{{cite journal|bibcode=2009MNRAS.399.2063N|arxiv=0907.2975|title=Long Secondary Periods in variable red giants|journal=Monthly Notices of the Royal Astronomical Society|volume=399|issue=4|pages=2063–2078|last1=Nicholls|first1=C. P.|last2=Wood|first2=P. R.|last3=Cioni|first3=M.-R. L.|last4=Soszyński|first4=I.|year=2009|doi=10.1111/j.1365-2966.2009.15401.x|s2cid=19019968}}&lt;/ref&gt; The ellipsoidal variations are also thought to be created in binary systems, in this case contact binaries where distorted stars cause strictly periodic variations as they orbit.&lt;ref name=nicholls2012&gt;{{cite journal|bibcode=2012MNRAS.421.2616N|arxiv=1201.1043|title=Eccentric ellipsoidal red giant binaries in the LMC: Complete orbital solutions and comments on interaction at periastron|journal=Monthly Notices of the Royal Astronomical Society|volume=421|issue=3|pages=2616|last1=Nicholls|first1=C. P.|last2=Wood|first2=P. R.|year=2012|doi=10.1111/j.1365-2966.2012.20492.x|s2cid=59464524}}&lt;/ref&gt;<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==Bibliography==<br /> * {{cite journal|bibcode=1993ApJ...413..641V|title=Evolution of low- and intermediate-mass stars to the end of the asymptotic giant branch with mass loss|journal=Astrophysical Journal|volume=413|pages=641|last1=Vassiliadis|first1=E.|last2=Wood|first2=P.
R.|year=1993|doi=10.1086/173033|doi-access=free}}<br /> * {{cite journal|arxiv=astro-ph/9910164|bibcode=2000A&amp;AS..141..371G|doi=10.1051/aas:2000126|title=Evolutionary tracks and isochrones for low- and intermediate-mass stars: From 0.15 to 7 M☉, and from Z=0.0004 to 0.03|journal=Astronomy and Astrophysics Supplement Series|volume=141|issue=3|pages=371–383|year=2000|last1=Girardi|first1=L.|last2=Bressan|first2=A.|last3=Bertelli|first3=G.|last4=Chiosi|first4=C.|s2cid=14566232}}<br /> <br /> ==External links==<br /> * [https://astro.uni-bonn.de/~nlanger/siu_web/ssescript/new/chapter9.pdf Post-main sequence evolution through helium burning]<br /> * [http://othes.univie.ac.at/17128/ Long period variables – period luminosity relations and classification in the Gaia Mission]<br /> <br /> {{Star}}<br /> <br /> [[Category:Hertzsprung–Russell classifications]]<br /> [[Category:Red giants]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Sagitta&diff=1170910361 Sagitta 2023-08-17T23:08:36Z <p>205.189.94.9: </p> <hr /> <div>{{short description|Constellation in the northern celestial hemisphere}}<br /> {{hatnote group|<br /> {{distinguish|text=the southern constellation of [[Sagittarius (constellation)|Sagittarius]]}}<br /> {{other uses}}<br /> }}<br /> {{Use British English|date=November 2020}}<br /> <br /> {{Infobox constellation<br /> | name = Sagitta<br /> | abbreviation = Sge&lt;ref name=&quot;pa30_469&quot;/&gt;<br /> | genitive = Sagittae&lt;ref name=&quot;pa30_469&quot;/&gt;<br /> | pronounce = {{IPAc-en|s|ə|ˈ|dʒ|ɪ|t|ə}} or {{IPAc-en|s|ə|ˈ|ɡ|ɪ|t|ə}} ''Sagítta,''&lt;ref&gt;{{Cite dictionary |url=http://www.lexico.com/definition/Sagitta |archive-url=https://web.archive.org/web/20210415161550/https://www.lexico.com/definition/Sagitta |url-status=dead |archive-date=2021-04-15 |title=Sagitta |dictionary=[[Lexico]] UK English Dictionary |publisher=[[Oxford University Press]]}}&lt;/ref&gt;&lt;br/&gt;genitive {{IPAc-en|s|ə|ˈ|dʒ|ɪ|t|iː}}<br /> | symbolism = the [[Arrow]]&lt;ref name=&quot;Kunitzsch&quot;/&gt;<br /> | RA = {{RA|18|57|21.3919}} – {{RA|20|20|44.8677}}&lt;ref name=boundary/&gt;<br /> | dec= {{dec|16.0790844}} to {{dec|21.6436558}}&lt;ref name=boundary/&gt;<br /> | areatotal = 79.9&lt;ref name=tirionconst/&gt;<br /> | arearank = 86th<br /> | numbermainstars = 4<br /> | numberbfstars = 19<br /> | numberstarsplanets = 2<br /> | numberbrightstars = 0<br /> | numbernearbystars = 2 (GJ 745A/B)<br /> | brighteststarname = [[Gamma Sagittae|γ Sge]]<br /> | starmagnitude = 3.51<br /> | neareststarname = Gliese 745<br /> | stardistancely = 28.14<br /> | stardistancepc = 8.63<br /> | numbermessierobjects = 1<br /> | meteorshowers =<br /> | bordering = [[Vulpecula]]&lt;br/&gt;[[Hercules (constellation)|Hercules]]&lt;br/&gt;[[Aquila (constellation)|Aquila]]&lt;br/&gt;[[Delphinus (constellation)|Delphinus]]<br /> | latmax = [[North Pole|90]]<br /> | latmin = [[70th parallel south|70]]<br /> | month = August<br /> | notes=<br /> }}<br /> '''Sagitta''' is a dim but distinctive [[constellation]] in the northern sky. Its name is [[Latin]] for 'arrow', not to be confused with the significantly larger constellation [[Sagittarius (constellation)|Sagittarius]] 'the archer'. It was included among the 48 constellations listed by the 2nd-century astronomer [[Ptolemy]], and it remains one of the [[88 modern constellations]] defined by the [[International Astronomical Union]]. 
Although it dates to antiquity, Sagitta has no star brighter than 3rd [[apparent magnitude|magnitude]] and has the third-smallest area of any constellation.<br /> <br /> [[Gamma Sagittae]] is the constellation's brightest star, with an apparent magnitude of 3.47. It is an aging [[red giant]] star 90% as massive as the Sun that has cooled and expanded to a diameter 54 times that of the Sun. [[Delta Sagittae|Delta]], [[Epsilon Sagittae|Epsilon]], [[Zeta Sagittae|Zeta]], and [[Theta Sagittae]] are each [[Star system|multiple]] stars whose components can be seen in small telescopes. [[V Sagittae]] is a [[cataclysmic variable]]—a [[Binary star|binary star system]] composed of a [[white dwarf]] accreting mass from a donor star that is expected to go [[nova]] and briefly become the most [[Luminosity|luminous]] star in the [[Milky Way]] and one of the brightest stars in our sky around the year 2083. Two star systems in Sagitta are known to have [[Jupiter-like]] planets, while a third—[[15 Sagittae]]—has a [[brown dwarf]] companion.<br /> {{TOC limit|3}}<br /> <br /> ==History==<br /> [[File:Sidney Hall - Urania's Mirror - Delphinus, Sagitta, Aquila, and Antinous.jpg|thumb|alt=Drawing of a [[dolphin]], [[eagle]], [[archer]], and [[arrow]] overlaid on a [[medieval]] star chart|Sagitta can be seen above [[Aquila (constellation)|Aquila]] in this [[plate]] from ''[[Urania's Mirror]]'' (1825).]]<br /> The [[classical Greece|ancient Greeks]] called Sagitta {{lang|grc|Oistos}} 'the arrow',&lt;ref name=&quot;Kunitzsch&quot;/&gt; and it was one of the 48 constellations described by [[Ptolemy]].&lt;ref name=ridpathsag/&gt; It was regarded as the weapon that [[Hercules]] used to kill the eagle ({{lang|la|Aquila}}) of [[Jupiter (mythology)|Jove]] that perpetually gnawed [[Prometheus]]' liver.&lt;ref name=&quot;hyginus_14&quot;&gt;{{cite web |url=http://www.theoi.com/Text/HyginusAstronomica.html#15 |title=Astronomica |author=Hyginus |translator=Mary Grant |website=Theoi Project |access-date=31 January 2020}}&lt;/ref&gt; Sagitta is located beyond the north border of [[Aquila (constellation)|Aquila]], the Eagle. The amateur naturalist and polymath [[Richard Hinckley Allen]] proposed that the constellation could represent the arrow shot by Hercules towards the adjacent [[Stymphalian birds]] (which feature in [[Labours of Hercules|Hercules' sixth labour]]) who had claws, beaks, and wings of iron, and who lived on human flesh in the marshes of [[Arcadia (ancient region)|Arcadia]]—denoted in the sky by the constellations Aquila the Eagle, [[Cygnus (constellation)|Cygnus 'the Swan']], and [[Lyra|Lyra 'the Vulture']]—and still lying between them, whence the title [[Herculea]].&lt;ref&gt;{{cite book|first=Richard Hinckley | last=Allen|title=Star-Names and Their Meanings|pages =349–351 | url=https://books.google.com/books?id=l8V2DY3tQMgC&amp;q=Star-Names+and+Their+Meanings | year=1963 | orig-year=1899 | publisher=Dover Publications |location=New York | isbn=978-0-486-21079-7 }}&lt;/ref&gt; Greek scholar [[Eratosthenes]] claimed it as the arrow with which [[Apollo]] exterminated the [[Cyclopes]].&lt;ref name=&quot;hyginus_14&quot;/&gt; The Romans named it Sagitta.&lt;ref&gt;{{Cite book|title=The Star Atlas Companion : What You Need to Know about the Constellations|last=Bagnall|first= Philip M.
|date=2012|publisher=[[Springer Science+Business Media|Springer]] |pages= 386–389 | location=New York | isbn=9781461408307|oclc=794225463}}&lt;/ref&gt; In Arabic, it became ''al-sahm'' 'arrow', though this name became ''Sham'' and was transferred to [[Alpha Sagittae]] only. The Greek name has also been mistranslated as {{lang|el-latn|ὁ istos}} 'the loom' and thus in Arabic ''al-nawl''. It was also called ''al-'anaza'' 'pike/javelin'.&lt;ref name=&quot;Kunitzsch&quot;&gt;{{cite journal |url= http://opar.unior.it/473/1/P._Kunitzsch_pp.19-28_pdf.pdf |title= Albumasariana |journal=Annali dell'Università degli studi di Napoli &quot;L'Orientale&quot; |publisher=Rivista del Dipartimento di Studi Asiatici e del Dipartimento di Studi e Ricerche su Africa e Paesi Arabi |first=Paul |last= Kunitzsch |volume=62 |page=4 |date=2002 |issn=0393-3180 }}&lt;/ref&gt;<br /> <br /> == Characteristics ==<br /> The four brightest stars make up an arrow-shaped [[Asterism (astronomy)|asterism]] located due north of the bright star [[Altair]].&lt;ref name=&quot;moore366&quot;&gt;{{Cite book |last=Moore |first=Patrick |title=The Observer's Year: 366 Nights in the Universe |publisher=[[Springer Science &amp; Business Media]] |year=2005 |isbn=978-1-85233-884-8 |location=New York |page=10 |author-link=Patrick Moore}}&lt;/ref&gt; Covering 79.9 square degrees and hence 0.194% of the sky, Sagitta ranks 86th of the [[88 modern constellations]] by area. Only [[Equuleus]] and [[Crux]] are smaller.&lt;ref name=tirionconst/&gt; Sagitta is most readily observed from the late spring to early autumn by northern hemisphere observers, with midnight [[culmination]] occurring on 17 July.&lt;ref name=thompson2&gt;{{cite book|title=Illustrated Guide to Astronomical Wonders: From Novice to Master Observer|author1=Thompson, Robert Bruce|author2=Barbara Fritchman|url=https://books.google.com/books?id=ymt9nj_uPhwC&amp;pg=PA392|page=392|isbn=978-0-596-52685-6|publisher=O'Reilly Media, Inc.|location=Sebastopol, California|year=2007}}&lt;/ref&gt; Its position in the [[Northern Celestial Hemisphere]] means that the whole [[constellation]] is visible to observers north of [[69th parallel south|69°S]].&lt;ref name=tirionconst&gt;{{cite web| url=http://www.ianridpath.com/constellations2.html | title=Constellations: Lacerta–Vulpecula | work= Star Tales |author=Ridpath, Ian | access-date= 22 May 2015| author-link=Ian Ridpath }}&lt;/ref&gt;{{efn|1=While parts of the constellation technically rise above the horizon to observers between 69°S and 73°S, stars within a few degrees of the horizon are to all intents and purposes unobservable.&lt;ref name=tirionconst/&gt;}} Sagitta is bordered by [[Vulpecula]] to the north, [[Hercules (constellation)|Hercules]] to the west, Aquila to the south, and [[Delphinus (constellation)|Delphinus]] to the east.
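The sky fraction quoted above follows from the total area of the [[celestial sphere]], about 41,253 square degrees, since &lt;math&gt;79.9/41{,}253 \approx 0.194\%&lt;/math&gt;.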
The three-letter abbreviation for the constellation, as adopted by the [[International Astronomical Union]] in 1922, is &quot;Sge&quot;; American astronomer [[Henry Norris Russell]], who devised the code, had to resort to using the [[Genitive case|genitive]] form of the name to come up with a letter to include ('e') that was not in the name of the constellation Sagittarius.&lt;ref name=&quot;pa30_469&quot;&gt;{{cite journal | last=Russell | first=Henry Norris |author-link=Henry Norris Russell | title=The New International Symbols for the Constellations | journal=[[Popular Astronomy (US magazine)|Popular Astronomy]] | volume=30 | page=469 | bibcode=1922PA.....30..469R | year=1922 }}&lt;/ref&gt; The official constellation boundaries, as set by Belgian astronomer [[Eugène Joseph Delporte|Eugène Delporte]] in 1930, are defined by a polygon of twelve segments (''illustrated in infobox''). In the [[equatorial coordinate system]], the [[right ascension]] coordinates of these borders lie between {{RA|18|57.2}} and {{RA|20|20.5}}, while the [[declination]] coordinates are between 16.08° and 21.64°.&lt;ref name=&quot;boundary&quot;&gt;{{Cite web | title=Sagitta, Constellation Boundary | publisher=[[International Astronomical Union]] | url=http://www.iau.org/public/constellations/#sge | access-date=20 October 2020 }}&lt;/ref&gt;<br /> <br /> ==Notable features==<br /> ===Stars===<br /> {{see also|List of stars in Sagitta}}<br /> Celestial cartographer [[Johann Bayer]] gave [[Bayer designation]]s to [[eight stars]], labelling them Alpha to Theta. English astronomer [[John Flamsteed]] added the letters x, mistaken as [[Chi (letter)|Chi]] (χ), y and z to 13, 14, and 15 Sagittae in his ''Catalogus Britannicus''. All three were dropped by later astronomers [[John Bevis]] and [[Francis Baily]].&lt;ref name=wagman&gt;{{cite book | last = Wagman | first = Morton | year = 2003 | title = Lost Stars: Lost, Missing and Troublesome Stars from the Catalogues of Johannes Bayer, Nicholas Louis de Lacaille, John Flamsteed, and Sundry Others |pages=266–267, 515 | publisher = The McDonald &amp; Woodward Publishing Company | location = Blacksburg, Virginia | isbn = 978-0-939923-78-6 }}&lt;/ref&gt;<br /> <br /> ====Bright stars====<br /> Ptolemy saw the constellation's brightest star [[Gamma Sagittae]] as marking the arrow's head,&lt;ref name=&quot;ridpathsag&quot;&gt;{{cite web |last=Ridpath |first=Ian |author-link=Ian Ridpath |title=Sagitta |url=http://www.ianridpath.com/startales/sagitta.html |access-date=22 May 2015 |work=Star Tales}}&lt;/ref&gt; while Bayer saw Gamma, Eta, and Theta as depicting the arrow's shaft.&lt;ref name=wagman/&gt; Gamma Sagittae is a [[red giant]] of spectral type M0&amp;nbsp;III,&lt;ref name=strassmeier&gt;{{cite journal|bibcode=2018A&amp;A...612A..45S|title=PEPSI deep spectra. II. Gaia benchmark stars and other M-K standards|journal=[[Astronomy and Astrophysics]]|volume=612|pages=A45|last1=Strassmeier|first1=K. G. |last2=Ilyin|first2=I.|last3=Weber|first3=M.|year=2018|arxiv=1712.06967|doi=10.1051/0004-6361/201731633|s2cid=119244142}}&lt;/ref&gt; and magnitude 3.47. It lies at a distance of {{val|258|4|ul=light-years}} from Earth.&lt;ref name=&quot;vanLeeuwen2007&quot;&gt;{{cite journal | first=F. 
| last=van Leeuwen | title=Validation of the New Hipparcos Reduction | journal=[[Astronomy and Astrophysics]] | volume=474 | issue=2 | pages=653–664 | date=2007 | bibcode=2007A&amp;A...474..653V | doi=10.1051/0004-6361:20078357 | arxiv=0708.1752| s2cid=18759600 }}&lt;/ref&gt; With around 90% of the Sun's mass,&lt;ref name=stock&gt;{{cite journal<br /> | title=Precise radial velocities of giant stars. X. Bayesian stellar parameters and evolutionary stages for 372 giant stars from the Lick planet search<br /> | last1=Stock | first1=Stephan | last2=Reffert | first2=Sabine<br /> | last3=Quirrenbach | first3=Andreas | last4=Hauschildt | first4=P.<br /> | journal=Astronomy and Astrophysics<br /> | volume=616 | pages=A33 | year=2018<br /> | bibcode=2018A&amp;A...616A..33S | arxiv=1805.04094<br /> | doi=10.1051/0004-6361/201833111 | s2cid=119361866 }}&lt;/ref&gt;&lt;ref name=Neilson2008&gt;{{cite journal<br /> | last1=Neilson | first1=Hilding R. | last2=Lester | first2=John B.<br /> | title=Determining parameters of cool giant stars by modeling spectrophotometric and interferometric observations using the SAtlas program<br /> | journal=Astronomy and Astrophysics<br /> | volume=490 | issue=2 | pages=807–10 | date=2008<br /> | bibcode=2008A&amp;A...490..807N | arxiv=0809.1875<br /> | doi=10.1051/0004-6361:200810627 | s2cid=1586125 }}&lt;/ref&gt; it has a radius 54 times that of the Sun and is 575 times as bright. It is most likely on the [[red-giant branch]] of its [[stellar evolution|evolutionary]] lifespan, having exhausted its core hydrogen and now burning it in a surrounding shell.&lt;ref name=stock/&gt;&lt;!-- cites previous 3 sentences --&gt;<br /> <br /> [[Delta Sagittae]] is the second-brightest star in the constellation and is a binary. Delta and Zeta depicted the spike according to Bayer.{{sfn|Wagman|2003|p=515}} The Delta Sagittae system is composed of a [[red supergiant]] of spectral type M2&amp;nbsp;II&lt;ref name=Eaton/&gt; that has 3.9 times the Sun's mass and 152 times its diameter and a blue-white B9.5V&lt;ref name=Eaton/&gt; [[main sequence]] star that is 2.9 times as massive as the Sun. The two orbit each other every ten years.&lt;ref name=Eaton&gt;{{cite journal|title=Winds and accretion in delta Sagittae|author=Eaton, Joel A.|author2=Hartkopf, William I.|author3=McAlister, Harold A.|author4=Mason, Brian D.|journal=[[Astronomical Journal]]|volume=109|number=4|pages=1856–1866|date=1995|bibcode=1995AJ....109.1856E|doi=10.1086/117412}}&lt;/ref&gt; [[Zeta Sagittae]] is a triple star system,&lt;ref name=Eggleton2008&gt;{{citation<br /> | last1=Eggleton | first1=P. P. | last2=Tokovinin | first2=A. A.<br /> | title=A catalogue of multiplicity among bright stellar systems<br /> | journal=Monthly Notices of the Royal Astronomical Society<br /> | volume=389 | issue=2 | pages=869–879 | date=September 2008<br /> | bibcode=2008MNRAS.389..869E | doi=10.1111/j.1365-2966.2008.13596.x<br /> | arxiv=0806.2878 | s2cid=14878976 | postscript=. }}&lt;/ref&gt; approximately {{val|326|u=light-years}} from Earth. The primary and secondary are A-type stars.&lt;ref name=Christy1969&gt;{{citation<br /> | last1=Christy | first1=James W. | last2=Walker | first2=R. 
L., Jr.<br /> | title=MK Classification of 142 Visual Binaries | postscript=.<br /> | journal=Publications of the Astronomical Society of the Pacific<br /> | volume=81 | issue=482 | page=643 | date=October 1969<br /> | doi=10.1086/128831 | bibcode=1969PASP...81..643C | doi-access=free }}&lt;/ref&gt;&lt;ref name=Cowley1969&gt;{{citation | display-authors=1<br /> | last1=Cowley | first1=A. | last2=Cowley | first2=C.<br /> | last3=Jaschek | first3=M. | last4=Jaschek | first4=C.<br /> | title=A study of the bright A stars. I. A catalogue of spectral classifications<br /> | journal=Astronomical Journal | postscript=.<br /> | volume=74 | pages=375–406 | date=April 1969<br /> | doi=10.1086/110819 | bibcode=1969AJ.....74..375C }}&lt;/ref&gt;<br /> <br /> In his ''Uranometria'', Bayer depicted Alpha, [[Beta Sagittae|Beta]], and [[Epsilon Sagittae]] as the fins of the arrow.{{sfn|Wagman|2003|p=515}} Also known as Sham, Alpha is a yellow [[bright giant]] star of [[Stellar classification|spectral class]] G1&amp;nbsp;II with an [[apparent magnitude]] of 4.38, which lies at a distance of {{val|382|8|u=light-years}} from Earth.&lt;ref name=Gaia-DR2alpha&gt;{{cite DR2|1824277055360974720}}&lt;/ref&gt; Four times as massive as the Sun, it has swollen and brightened to 20&amp;nbsp;times the Sun's diameter and 340&amp;nbsp;times its [[luminosity]].&lt;ref name=kalersham&gt;{{cite web| first=James B. | last=Kaler | title=Sham | work=Stars | publisher=University of Illinois | url=http://stars.astro.illinois.edu/sow/sham.html | access-date=22 May 2015}}&lt;/ref&gt; Also of magnitude 4.38, Beta is a G-type [[giant star|giant]] located {{val|420|10|u=light-years}} distant from Earth.&lt;ref name=Gaia-DR2beta&gt;{{cite DR2|1823991938300446336}}&lt;/ref&gt; Estimated to be around 129&amp;nbsp;million years old, it is 4.33&amp;nbsp;times as massive as the Sun,&lt;ref name=Liu2014&gt;{{cite journal | title=The Lithium Abundances of a Large Sample of Red Giants | display-authors=1 | last1=Liu | first1=Y. J. | last2=Tan | first2=K. F. | last3=Wang | first3=L. | last4=Zhao | first4=G. | last5=Sato | first5=Bun'ei | last6=Takeda | first6=Y. | last7=Li | first7=H. N. | journal=[[The Astrophysical Journal]] | arxiv=1404.1687 | volume=785 | issue=2 | id=94 | pages=12 | date=2014 | doi=10.1088/0004-637X/785/2/94 | bibcode=2014ApJ...785...94L | s2cid=119226316 }}&lt;/ref&gt; and has expanded to roughly 27 times its diameter.&lt;ref name=vanBelle2009&gt;{{cite journal | title=Supergiant temperatures and linear radii from near-infrared interferometry | journal=[[Monthly Notices of the Royal Astronomical Society]] | volume=394 | issue=4 | pages=1925 | year=2009 | last1=Van Belle | first1=G. T. | last2=Creech-Eakman | first2=M. J. | last3=Hart | first3=A. 
| bibcode=2009MNRAS.394.1925V | arxiv=0811.4239 | doi=10.1111/j.1365-2966.2008.14146.x | s2cid=118372600 }}&lt;/ref&gt; Epsilon Sagittae is a double star whose component stars can be seen in a small telescope.&lt;ref name=&quot;turnleft&quot;&gt;{{cite book|last=Consolmagno|first=Guy |title=Turn Left at Orion: Hundreds of Night Sky Objects to See in a Home Telescope – and How to Find Them|publisher=[[Cambridge University Press]]|location=Cambridge, United Kingdom|year=2019 | orig-year=1989 |page=138 |isbn=978-1-108-45756-9|url=https://books.google.com/books?id=D2JjDwAAQBAJ&amp;pg=PA139}}&lt;/ref&gt; With an apparent magnitude of 5.77,&lt;ref name=Mason2014/&gt; the main star is a 331-million-year-old yellow giant of spectral type G8&amp;nbsp;III around 3.09&amp;nbsp;times as massive as the Sun,&lt;ref name=takeda14&gt;{{cite journal | title=Spectroscopic study on the beryllium abundances of red giant stars | journal=[[Publications of the Astronomical Society of Japan]] | volume=66 | issue=5 | pages=91 | year=2014 | last1=Takeda | first1=Yoichi | last2=Tajitsu | first2=Akito | doi=10.1093/pasj/psu066 | bibcode=2014PASJ...66...91T | arxiv=1406.7066| s2cid=119283677 }}&lt;/ref&gt; that has swollen to {{val|18.37|0.65|0.88}} its diameter.&lt;ref name=&quot;GaiaDR2epsilon&quot;/&gt; It is {{val|580|10|u=light-years}} distant.&lt;ref name=&quot;GaiaDR2epsilon&quot;&gt;{{Cite DR2|4321830946398475776}}&lt;/ref&gt; The visual companion of magnitude 8.35 is 87.4&amp;nbsp;[[arcseconds]] distant,&lt;ref name=Mason2014&gt;{{cite journal| last1=Mason | first1=B. D. | last2=Wycoff | first2=G. L. | last3=Hartkopf | first3=W. I. | last4=Douglass | first4=G. G. | last5=Worley | first5=C. E. | title=The Washington Visual Double Star Catalog | journal=[[Nature (journal)|Nature]] | year=2014 | volume=122 | issue=6 | page=3466 | bibcode=2001AJ....122.3466M | doi = 10.1086/323920 | doi-access=free }}&lt;/ref&gt; but is an unrelated [[blue supergiant]] around {{val|7000|fmt=commas|u=light-years}} distant from Earth.&lt;ref name=&quot;GaiaDR2comp&quot;&gt;{{cite DR2|4321830980758181760}}&lt;/ref&gt;<br /> <br /> [[Eta Sagittae]] is an orange giant of spectral class K2&amp;nbsp;III&lt;ref name=Roman1952&gt;{{citation<br /> | title=The Spectra of the Bright Stars of Types F5-K5<br /> | last=Roman | first=Nancy G. | postscript=.<br /> | journal=Astrophysical Journal<br /> | volume=116 | page=122 | date=July 1952<br /> | doi=10.1086/145598 | bibcode=1952ApJ...116..122R | doi-access=free }}&lt;/ref&gt; with a magnitude of 5.09.&lt;ref name=Argue1966&gt;{{citation<br /> | last=Argue | first=A. N. | postscript=.<br /> | title=UBV photometry of 550 F, G and K type stars<br /> | journal=[[Monthly Notices of the Royal Astronomical Society]]<br /> | volume=133 | pages=475–493 | year=1966<br /> | issue=4 | bibcode=1966MNRAS.133..475A | doi=10.1093/mnras/133.4.475| doi-access=free}}&lt;/ref&gt; Located {{val|155.9|0.9|u=light-years}} from Earth, it has a 61.1% chance of being a member of the [[Hyades (star cluster)|Hyades]]–[[Pleiades]] stream of stars that share a [[Stellar kinematics|common motion through space]].&lt;ref name=Famaey2005&gt;{{cite journal| last1=Famaey | first1=B. | last2=Jorissen | first2=A. | last3=Luri | first3=X. | last4=Mayor | first4=M. | last5=Udry | first5=S. | last6=Dejonghe | first6=H. | last7=Turon | first7=C. | title=Local kinematics of K and M giants from CORAVEL/Hipparcos/Tycho-2 data. 
Revisiting the concept of superclusters | journal=[[Astronomy and Astrophysics]] | volume=430 | pages=165–186 | date=2005 | doi=10.1051/0004-6361:20041272 | bibcode=2005A&amp;A...430..165F | arxiv=astro-ph/0409579 | s2cid=17804304 }}&lt;/ref&gt; [[Theta Sagittae]] is a double star system, with components 12 arcseconds apart visible in a small telescope.&lt;ref name=&quot;turnleft&quot;/&gt; At magnitude 6.5, the brighter is a yellow-white main sequence star of spectral type F3&amp;nbsp;V,&lt;ref name=&quot;abt1985&quot;&gt;{{cite journal |last=Abt |first=Helmut A. |title=Visual multiples. VIII. 1000 MK types |journal=The Astrophysical Journal Supplement Series |date=1985 |bibcode= 1985ApJS...59...95A |volume=59 |pages=95–112 |doi=10.1086/191064|doi-access=free }}&lt;/ref&gt; located {{val|146.1|0.2|u=light-years}} from Earth.&lt;ref name=Gaia-DR2tet1&gt;{{cite DR2|1829590548393010560}}&lt;/ref&gt; The fainter companion, of magnitude 8.8, is a main sequence star of spectral type G5&amp;nbsp;V. A 7.4-magnitude orange giant of spectral type K2&amp;nbsp;III is also visible {{val|91|ul=&quot;}} from the binary pair,&lt;ref name=&quot;abt1985&quot;/&gt; located {{val|842|9|u=light-years}} away.&lt;ref name=Gaia-DR2tet2&gt;{{cite DR2|1829590410954063744}}&lt;/ref&gt;<br /> <br /> ====Variable stars====<br /> <br /> [[Image:Wolf-Rayet 124 (NIRCam and MIRI composite image).tif|thumb|right|300px|[[James Webb Space Telescope]] image of [[WR 124]] in Sagitta. [[NIRCam]] and [[Mid-Infrared Instrument|MIRI]] composite]]<br /> <br /> Variable stars are popular targets for amateur astronomers, their observations providing valuable contributions to understanding star behaviour.&lt;ref&gt;{{cite web |last=Tooke |first=Owen |title=Variables: What Are They and Why Observe Them? |url=https://www.aavso.org/variables-what-are-they-why-observe-them |publisher=AAVSO | date=24 August 2017 |access-date=14 October 2020}}&lt;/ref&gt; [[R Sagittae]] is a member of the rare [[RV Tauri variable]] class of stars. It ranges in magnitude from 8.2 to 10.4.&lt;ref name=&quot;Levy 1998&quot;&gt;{{cite book|last=Levy|first=David H.|title=Observing Variable Stars: A Guide for the Beginner|publisher=[[Cambridge University Press]]|location=Cambridge, United Kingdom|date=1998|pages=152–153|isbn=978-0-521-62755-9|url=https://books.google.com/books?id=5-O2cd937FMC&amp;pg=PA153}}&lt;/ref&gt; It is around {{val|8100|fmt=commas|u=light-years}} distant.&lt;ref name=dr2R&gt;{{cite DR2|1808748613981554176}}&lt;/ref&gt; It has a diameter {{val|61.2|12.4|9.9}} times that of the Sun, and is {{val|2329|744|638|fmt=commas}} times as luminous, yet most likely is less massive than the Sun. An aging star, it has moved on from the [[asymptotic giant branch]] of stellar evolution and is on its way to becoming a [[planetary nebula]].&lt;ref name=bodikiss&gt;{{cite journal |doi=10.3847/1538-4357/aafc24 |title=Physical Properties of Galactic RV Tauri Stars from Gaia DR2 Data |journal=[[The Astrophysical Journal]] |volume=872 |issue=1 |pages=60 |year=2019 |last1=Bódi |first1=A. |last2=Kiss |first2=L. L.
|bibcode=2019ApJ...872...60B |arxiv=1901.01409 |s2cid=119099605 }}&lt;/ref&gt; [[FG Sagittae]] is a &quot;born again&quot; star, a highly luminous star around {{val|4000|fmt=commas|u=light-years}} distant from Earth.&lt;ref name=dr2FG&gt;{{cite DR2|1828750899461025536}}&lt;/ref&gt; It reignited fusion of a helium shell shortly before becoming a white dwarf, and has expanded first to a blue supergiant and then to a K-class [[supergiant]] in less than 100 years.&lt;ref name=jurcsik1999&gt;{{cite journal |bibcode=1999NewAR..43..415J |title=The remarkable evolution of the post-AGB star FG Sge |last1=Jurcsik |first1=Johanna |last2=Montesinos |first2=Benjamín. |journal=New Astronomy Reviews |year=1999 |volume=43 |issue=6 |page=415 |doi=10.1016/S1387-6473(99)00098-6 |url=http://real.mtak.hu/2086/1/K.pdf }}&lt;/ref&gt; It is surrounded by a faint (visual magnitude 23) planetary nebula, Henize 1–5, that formed when FG Sagittae first left the asymptotic giant branch.&lt;ref name=rosenbush2015&gt;{{cite journal |bibcode=2015Ap.....58...46R |title=Photometry, Spectrometry, and Polarimetry of FG Sge in the Active State |last1=Rosenbush |first1=A. É. |last2=Efimov |first2=Yu. S. |s2cid=121128187 |journal=Astrophysics |year=2015 |volume=58 |issue=1 |page=46 |doi=10.1007/s10511-015-9365-x }}&lt;/ref&gt;<br /> <br /> [[S Sagittae]] is a [[classical Cepheid]] that varies from magnitude 5.24 to 6.04 every 8.38&amp;nbsp;days. It is a yellow-white supergiant that pulsates between spectral types F6&amp;nbsp;Ib and G5&amp;nbsp;Ib.&lt;ref name=AAVSOS&gt;{{cite web|url=http://www.aavso.org/vsx/index.php?view=detail.top&amp;oid=27343 |title=S Sagittae |author =Watson, Christopher |date=4 January 2010|publisher=AAVSO |access-date=22 May 2015}}&lt;/ref&gt; Around 6 or 7 times as massive and 3,500 times as luminous as the Sun,&lt;ref name=kalerSSge&gt;{{cite web| first=James B. | last=Kaler | title=S Sagittae | work=Stars | publisher=University of Illinois | url=http://stars.astro.illinois.edu/sow/ssge.html | date= 4 October 2013|access-date=22 May 2015}}&lt;/ref&gt; it is located around {{val|5100|fmt=commas|u=light-years}} from Earth.&lt;ref name=Gaia-DR2S&gt;{{cite DR2|1820309639468685824}}&lt;/ref&gt; [[HD 183143]] is a remote highly luminous star around {{val|7900|fmt=commas|u=light-years}} away,&lt;ref name=dr2&gt;{{cite DR2|4323280515006629760}}&lt;/ref&gt; that has been classified as a blue [[hypergiant]].&lt;ref name=chentsov&gt;{{cite journal|last=Chentsov|first=E. L.|title=HD 183143: A Hypergiant|journal=Astronomy Letters|volume=30|issue=5|year=2004|pages=325–331|doi=10.1134/1.1738155|bibcode=2004AstL...30..325C|s2cid=121435951}}&lt;/ref&gt; [[Infrared]] bands of ionised [[buckminsterfullerene]] molecules have also been found in its spectrum.&lt;ref name=c60&gt;{{cite journal|bibcode=2015ApJ...812L...8W|arxiv=1509.06818|title=Identification of More Interstellar C60+ Bands|journal=The Astrophysical Journal Letters|volume=812|pages=L8|last1=Walker|first1=G. A. H.|last2=Bohlender|first2=D. A.|last3=Maier|first3=J. P.|last4=Campbell|first4=E. K.|year=2015|issue=1|doi=10.1088/2041-8205/812/1/L8|s2cid=118598331}}&lt;/ref&gt; [[WR 124]] is a [[Wolf–Rayet star]] moving at great speed surrounded by a nebula of ejected gas.&lt;ref name=crowther&gt;{{cite journal|bibcode=1999A&amp;A...350.1007C|title=Wolf–Rayet nebulae as tracers of stellar ionizing fluxes. I. 
M1-67|journal=Astronomy and Astrophysics|volume=350|pages=1007|last1=Crowther|first1=Paul A.|last2=Pasquali|first2=A.|last3=De Marco|first3=Orsola|last4=Schmutz|first4=W.|last5=Hillier|first5=D. J.|last6=De Koter|first6=A.|year=1999|arxiv = astro-ph/9908200 }}&lt;/ref&gt;<br /> <br /> [[U Sagittae]] is an eclipsing binary that varies between magnitudes 6.6 and 9.2 over 3.4&amp;nbsp;days, making it a suitable target for enthusiasts with small telescopes.&lt;ref name=moore366/&gt; There are two component stars—a blue-white star of spectral type B8&amp;nbsp;V and an ageing star that has cooled and expanded into a yellow subgiant of spectral type G4&amp;nbsp;III-IV. They orbit each other close enough that the cooler subgiant has filled its [[Roche lobe]] and is passing material to the hotter star, and hence it is a [[semidetached binary]] system.&lt;ref&gt;{{cite journal |last=Malkov |first=Oleg Yu |title=Semidetached double-lined eclipsing binaries: Stellar parameters and rare classes |journal=Monthly Notices of the Royal Astronomical Society |date=2020 |volume=491 |issue=4 |pages=5489–5497| bibcode= 2020MNRAS.491.5489M |doi=10.1093/mnras/stz3363|doi-access=free }}&lt;/ref&gt; The system is {{val|900|10|u=light-years}} distant.&lt;ref name=Gaia-DR2u&gt;{{cite DR2|4516549576568929408}}&lt;/ref&gt; Near U Sagittae is [[X Sagittae]], a [[semiregular variable]] ranging between magnitudes 7.9 and 8.4 over 196&amp;nbsp;days.&lt;ref name=moore366/&gt; A [[carbon star]], X Sagittae has a surface temperature of {{val|2576|fmt=commas|ul=K}}.&lt;ref&gt;{{cite journal |last1=Taranova |first1=O.G. | first2=V. I. | last2=Shenavrin |title=JHKLM Photometry for Carbon Stars |journal=[[Astronomy Letters]] |date=2004 |volume=30 |issue=8 |pages=605–622 |doi=10.1134/1.1784497|bibcode=2004AstL...30..549T |s2cid=119984131 }}&lt;/ref&gt;<br /> <br /> Located near 18 Sagittae is [[V Sagittae]], the prototype of the [[V Sagittae variable]]s, [[cataclysmic variable]]s that are also [[super soft X-ray source]]s.&lt;ref name=&quot;Levy 1998&quot;/&gt; It is expected to become a [[luminous red nova]] when the two stars merge around the year 2083, and briefly become the most luminous star in the [[Milky Way]] and one of the brightest stars in Earth's sky.&lt;ref name=2020-01&gt;{{Cite web|url=https://phys.org/news/2020-01-binary-star-sagittae-bright-nova.html|title=Binary star V Sagittae to explode as very bright nova by century's end| last= Lavalle| first= Mimi |website=phys.org|language=en-us|access-date=9 January 2020| date=7 January 2020}}&lt;/ref&gt;&lt;ref&gt;{{Cite web|url=http://m.cnn.com/en/article/h_f7a7fda778b24834ac5316152cc63433|title=There will be a new brightest star in the sky, when it explodes in about 60 years | publisher=CNN |access-date=9 January 2020| date=8 Jan 2020}}&lt;/ref&gt; [[WZ Sagittae]] is another cataclysmic variable, composed of a [[white dwarf]] that has about 85% the mass of the Sun, and low-mass star companion that has been calculated to be a [[brown dwarf]] of spectral class L2 that is only 8% as massive as the Sun.&lt;ref name=&quot;apj667&quot;&gt;{{cite journal | author=Steeghs, Danny | author2=Howell, Steve B. | author3=Knigge, Christian | author4=Gänsicke, Boris T. | author5=Sion, Edward M. | author6=Welsh, William F. 
| title=Dynamical Constraints on the Component Masses of the Cataclysmic Variable WZ Sagittae | journal=[[The Astrophysical Journal]] | volume=667 | issue=1 | pages=442–447 |date=September 2007 | doi=10.1086/520702 | bibcode=2007ApJ...667..442S|arxiv = 0706.0987 | s2cid=209833493 }}&lt;/ref&gt; Normally a faint object dimmer than magnitude 15, it flared up in 1913, 1946 and 1978 to be visible in binoculars.&lt;ref name=moore366/&gt; The [[black widow pulsar]] (B1957+20) is the second [[millisecond pulsar]] ever discovered.&lt;ref&gt;{{cite journal|last1=Fruchter|first1=A. S.|last2=Stinebring|first2=D. R.|last3=Taylor|first3=J. H.|year=1988|title=A millisecond pulsar in an eclipsing binary|journal=[[Nature (journal)|Nature]]|volume=333|issue=6170|pages=237–239 |bibcode=1988Natur.333..237F|doi=10.1038/333237a0|s2cid=4337525}}&lt;/ref&gt; It is a massive [[neutron star]] that is [[Ablation|ablating]] its brown dwarf-sized companion which causes the pulsar's radio signals to [[Attenuation|attenuate]] as they pass through the outflowing material.&lt;ref&gt;{{Cite web|url=https://chandra.harvard.edu/photo/2003/b1957/|title=B1957+20: A Cocoon Found Inside the Black Widow's Web|website=Chandra|access-date=23 October 2020}}&lt;/ref&gt;<br /> <br /> ====Stars with exoplanets====<br /> [[Image:Messier 71 Hubble WikiSky.jpg|thumb|right|alt=Several hundred stars of different brightnesses and colours scattered on a black background|[[Messier 71]] globular cluster]]<br /> <br /> [[HD 231701]] is a [[F-type main sequence star|yellow-white main sequence star]] hotter and larger than the Sun, with a [[Jupiter]]-like [[planet]] that was discovered in 2007 by the [[methods of detecting extrasolar planets#Radial velocity|radial velocity technique]]. The planet orbits at a distance of {{Val|0.57|ul=AU}} from the star with a period of 141.6&amp;nbsp;days.&lt;ref name=&quot;Fischer2007&quot;&gt;{{cite journal | title=Five Intermediate-Period Planets from the N2K Sample | last1=Fischer | first1=Debra A. | last2=Vogt | first2=Steven S. | last3=Marcy | first3=Geoffrey W. | last4=Butler | first4=R. Paul | last5=Sato | first5=Bun’ei | last6=Henry | first6=Gregory W. | last7=Robinson | first7=Sarah | last8=Laughlin | first8=Gregory | last9=Ida | first9=Shigeru | journal=[[The Astrophysical Journal]] | volume=669 | issue=2 | pages=1336–1344 | year=2007 | arxiv=0704.1191 | bibcode=2007ApJ...669.1336F | doi=10.1086/521869 | s2cid=7774321 }}&lt;/ref&gt; It has a mass of at least 1.13 Jupiter masses.&lt;ref&gt;{{cite journal<br /> | title=Radial Velocities from the N2K Project: Six New Cold Gas Giant Planets Orbiting HD 55696, HD 98736, HD 148164, HD 203473, and HD 211810<br /> | last1=Ment | first1=Kristo | last2=Fischer | first2=Debra A.<br /> | last3=Bakos | first3=Gaspar | last4=Howard | first4=Andrew W.<br /> | last5=Isaacson | first5=Howard | display-authors=1<br /> | journal=The Astronomical Journal<br /> | volume=156 | issue=5 | at=213 | year=2018<br /> | arxiv=1809.01228 | bibcode=2018AJ....156..213M<br /> | doi=10.3847/1538-3881/aae1f5 | s2cid=119243619 }}&lt;/ref&gt; <br /> <br /> HAT-P-34 is a star {{val|1.392|0.047}} times as massive as the Sun with {{val|1.535|0.135|0.102}} times its radius and {{val|3.63|0.75|0.51}} times its luminosity. 
With an apparent magnitude of 10.4,&lt;ref name=&quot;Bakos2012&quot;/&gt; it is {{val|819|9|u=light-years}} distant.&lt;ref name=Gaia-DR2hat34&gt;{{cite DR2|1810218734055374720}}&lt;/ref&gt; A planet {{val|3.328|0.211}} times as massive as Jupiter was discovered transiting it in 2012. With a period of 5.45&amp;nbsp;days and a distance of {{val|0.06|u=AU}} from its star, it has an estimated surface temperature of {{val|1520|60|fmt=commas|u=K}}.&lt;ref name=&quot;Bakos2012&quot;&gt;{{cite journal | title=HAT-P-34b – HAT-P-37b: Four Transiting Planets More Massive Than Jupiter Orbiting Moderately Bright Stars | last1=Bakos | first1=G. Á. | last2=Hartman | first2=J. D. | last3=Torres | first3=G. | last4=Béky | first4=B. | last5=Latham | first5=D. W. | last6=Buchhave | first6=L. A. | last7=Csubry | first7=Z. | last8=Kovács | first8=Géza | last9=Bieryla | first9=A. | last10=Quinn | first10=S. | last11=Szklenár | first11=T. | last12=Esquerdo | first12=G. A. | last13=Shporer | first13=A. | last14=Noyes | first14=R. W. | last15=Fischer | first15=D. A. | last16=Johnson | first16=J. A. | last17=Howard | first17=A. W. | last18=Marcy | first18=G. W. | last19=Sato | first19=B. | last20=Penev | first20=K. | last21=Everett | first21=M. | last22=Sasselov | first22=D. D. | last23=Fűrész | first23=G. | last24=Stefanik | first24=R. P. | last25=Lázár | first25=J. | last26=Papp | first26=I. | last27=Sári | first27=P. | journal=[[The Astronomical Journal]] | volume=144 | issue=1 | pages=19–32 | date=2012 | arxiv=1201.0659 | bibcode=2012AJ....144...19B | doi=10.1088/0004-6256/144/1/19 | s2cid=119291677 }}&lt;/ref&gt; <br /> <br /> [[15 Sagittae]] is a solar analog—a star similar to the Sun, with {{val|1.08|0.04}} times its mass, {{val|1.115|0.021}} times its radius and {{val|1.338|0.03}} times its luminosity. It has an apparent magnitude of 5.80.&lt;ref name=Anderson2012&gt;{{citation<br /> | title=XHIP: An extended hipparcos compilation<br /> | last1=Anderson | first1=E. | last2=Francis | first2=Ch.<br /> | journal=Astronomy Letters | postscript=.<br /> | arxiv=1108.4971 | volume=38 | issue=5 | pages=331 | year=2012<br /> | bibcode=2012AstL...38..331A | doi=10.1134/S1063773712050015 | s2cid=119257644 }}&lt;/ref&gt; It has an L4 brown dwarf substellar companion that is around the same size as Jupiter but 69 times as massive with a surface temperature of between 1,510 and {{val|1850|fmt=commas|u=K}}, taking around 73.3&amp;nbsp;years to complete an orbit around the star.&lt;ref name=Crepp2012/&gt; The system is estimated to be {{val|2.5|1.8}} billion years old.&lt;ref name=Crepp2012&gt;{{cite journal | last1=Crepp | first=Justin R. | last2=Johnson | first2=John Asher | last3=Fischer | first3=Debra A. | last4=Howard | first4=Andrew W. | last5=Marcy | first5=Geoffrey W. | last6=Wright | first6=Jason T. | last7=Isaacson | first7=Howard | last8=Boyajian | first8=Tabetha | last9=von Braun | first9=Kaspar | last10=Hillenbrand | first10=Lynne A. | last11=Hinkley | first11=Sasha | last12=Carpenter | first12=John M. | last13=Brewer | first13=John M. 
| title=The Dynamical Mass and Three-Dimensional Orbit of HR7672B: A Benchmark Brown Dwarf with High Eccentricity | journal=[[The Astrophysical Journal]] | volume=751 | issue=2 | id=97 | pages=14 | date=2012 | arxiv=1112.1725 | bibcode=2012ApJ...751...97C | doi=10.1088/0004-637X/751/2/97| s2cid=16113054 }}&lt;/ref&gt;<br /> <br /> ===Deep-sky objects===<br /> The band of the Milky Way and the [[Great Rift (astronomy)|Great Rift]] within it pass through Sagitta, with Alpha, Beta and Epsilon Sagittae marking the Rift's border.&lt;ref name=&quot;crossen 2004&quot;&gt;{{cite book|author1=Crossen, Craig |author2=Rhemann, Gerald |title=Sky Vistas: Astronomy for Binoculars and Richest-Field Telescopes|publisher=[[Springer Science+Business Media|Springer]]|location=New York|orig-year=2004 | year=2012 | page=150 | url=https://books.google.com/books?id=3vELBwAAQBAJ&amp;pg=PA150 | isbn=978-3-709-10626-6}}&lt;/ref&gt; Located between Beta and Gamma Sagittae is [[Messier 71]],&lt;ref name=moore366/&gt; a very loose [[globular cluster]] mistaken for some time for a dense [[open cluster]].&lt;ref name=tt/&gt; At a distance of about {{val|13000|fmt=commas|u=light-years}} from Earth,&lt;ref name=inglis2017&gt;{{cite book |last=Inglis |first1=Mike |title=Astronomy of the Milky Way: The Observer's Guide to the Northern Sky |date=2017 |publisher=[[Springer Science+Business Media|Springer]] |location=New York |isbn=978-3-319-49082-3 |pages=83–89 |url=https://books.google.com/books?id=02DTDgAAQBAJ&amp;q=Messier+71&amp;pg=PA89}}&lt;/ref&gt; it was discovered by the French astronomer [[Philippe Loys de Chéseaux]] in 1745 or 1746.&lt;ref name=tt&gt;{{cite book | last1 = Thompson | first1 = Robert Bruce | last2 = Thompson | first2 = Barbara Fritchman | date = 2007 | title = Illustrated Guide to Astronomical Wonders: From Novice to Master Observer | page=394 | url=https://books.google.com/books?id=ymt9nj_uPhwC&amp;pg=PA394 | publisher = [[O'Reilly Media]] | location = North Sebastopol, California | isbn = 978-0-596-52685-6 }}&lt;/ref&gt; The loose globular cluster has a mass of around {{solar mass|53,000|link=yes}} and a [[Luminosity#In astronomy|luminosity]] of approximately 19,000 {{lo|link=yes}}.&lt;ref name=&quot;Dalgleish&quot;&gt;{{cite journal |last=Dalgleish |first1=H. |last2=Kamann |first2=S. |last3=Usher |first3=C. |last4=Baumgardt |first4=H. |last5=Bastian |first5=N. |last6=Veitch-Michaelis |first6=J. |last7=Bellini |first7=A. |last8=Martocchia |first8=S. |last9=Da Costa |first9=G. S. |last10=Mackey |first10=D. |last11=Bellstedt |first11=S. |last12=Pastorello |first12=N. |last13=Cerulo |first13=P. |title=The WAGGS project-III.
Discrepant mass-to-light ratios of Galactic globular clusters at high metallicity |journal=Monthly Notices of the Royal Astronomical Society |date=March 2020 |volume=492 |issue=3 |pages=3859–3871 |doi=10.1093/mnras/staa091 |arxiv=2001.01810 |bibcode=2020MNRAS.492.3859D |doi-access=free }}&lt;/ref&gt;<br /> <br /> There are two notable [[planetary nebula]]e in Sagitta: [[NGC 6886]] is composed of a hot central post-AGB star that has 55% of the Sun's mass yet {{val|2700|850|fmt=commas}} times its luminosity, with a surface temperature of {{val|142000|fmt=commas|u=K}}, and a surrounding nebula estimated to have been expanding for between 1,280 and 1,600 years.&lt;ref name=schonberner&gt;{{Cite journal | doi=10.1051/0004-6361/201731788 | bibcode=2018A&amp;A...609A.126S| title=Expansion patterns and parallaxes for planetary nebulae| year=2018| last1=Schönberner| first1=D.| last2=Balick| first2=B.| last3=Jacob| first3=R.| journal=[[Astronomy &amp; Astrophysics]]| volume=609| pages=A126| doi-access=free}}&lt;/ref&gt; The nebula was discovered by [[Ralph Copeland]] in 1884.&lt;ref&gt;{{cite web|last=Seligman|first=Courtney|title=NGC Objects: NGC 6850 - 6899|url=http://cseligman.com/text/atlas/ngc68a.htm|access-date=22 August 2015}}&lt;/ref&gt; The [[Necklace Nebula]] was originally a close binary, one component of which swallowed the other as it expanded to become a giant star. The smaller star remained in orbit inside the larger, whose rotation speed increased greatly, resulting in it flinging its outer layers off into space, forming a ring with knots of bright gas formed from clumps of stellar material.&lt;ref name=NASANN&gt;{{cite press release |last1=Weaver |first1=Donna |first2=Ray |last2=Villard |date=11 August 2011 |title=Hubble Offers a Dazzling 'Necklace' |url=https://www.nasa.gov/mission_pages/hubble/science/necklace-nebula.html |publisher=NASA |agency=Space Telescope Science Institute |access-date=20 October 2020}}&lt;/ref&gt; It was discovered in 2005 and is around 2 light-years wide.&lt;ref name=STSINN&gt;[https://web.archive.org/web/20111014231028/http://hubblesite.org/newscenter/archive/releases/2011/24/image/b/ Hubble Offers a Dazzling View of the 'Necklace' Nebula], news release STScI-2011-24 dated August 11, 2011, from [[Space Telescope Science Institute]]&lt;/ref&gt;&lt;ref name=NASANN/&gt; It has a size of {{Val|0.35|ul=arcminute}}.&lt;ref name=Sabin2014&gt;{{cite journal | display-authors=1<br /> | title=First release of the IPHAS catalogue of new extended planetary nebulae<br /> | last1=Sabin | first1=L. | last2=Parker | first2=Q. A.<br /> | last3=Corradi | first3=R. L. M. | last4=Guzman-Ramirez | first4=L.<br /> | last5=Morris | first5=R. A. H. | last6=Zijlstra | first6=A. A.<br /> | last7=Bojičić | first7=I. S. | last8=Frew | first8=D. J.<br /> | last9=Guerrero | first9=M. | last10=Stupar | first10=M.<br /> | last11=Barlow | first11=M. J. | last12=Cortés Mora | first12=F.<br /> | last13=Drew | first13=J. E. | last14=Greimel | first14=R.<br /> | last15=Groot | first15=P. | last16=Irwin | first16=J. M.<br /> | last17=Irwin | first17=M. J. | last18=Mampaso | first18=A.<br /> | last19=Miszalski | first19=B. | last20=Olguín | first20=L.<br /> | last21=Phillipps | first21=S. | last22=Santander García | first22=M.<br /> | last23=Viironen | first23=K. | last24=Wright | first24=N.
J.<br /> | journal=Monthly Notices of the Royal Astronomical Society<br /> | volume=443 | issue=4 | pages=3388–3401 | date=October 2014<br /> | doi=10.1093/mnras/stu1404 | arxiv=1407.0109<br /> | bibcode=2014MNRAS.443.3388S }}&lt;/ref&gt; Both nebulae are around {{val|15000|fmt=commas|u=light-years}} from Earth.&lt;ref name=schonberner/&gt;&lt;ref name=NASANN/&gt;<br /> <br /> == See also ==<br /> * [[Sagitta (Chinese astronomy)]]<br /> <br /> == Notes ==<br /> {{notelist}}<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> == External links ==<br /> {{Commons|Sagitta}}<br /> * [http://www.allthesky.com/constellations/vulpecula/ The Deep Photographic Guide to the Constellations: Sagitta]<br /> * [https://iconographic.warburg.sas.ac.uk/category/vpc-taxonomy-017052 Warburg Institute Iconographic Database (ca 160 medieval and early modern images of Sagitta)]<br /> * Bayer's ''[http://lhldigital.lindahall.org/cdm/compoundobject/collection/astro_atlas/id/118/show/41 Uranometria]'', from the [[Linda Hall Library]] digital collection.<br /> <br /> {{Stars of Sagitta}}<br /> {{navconstel}}<br /> {{Portal bar|Astronomy|Stars|Spaceflight|Outer space|Solar System}}<br /> {{Sky|19|50|00|+|18|40|00|10}}<br /> {{featured article}}<br /> {{DEFAULTSORT:Sagitta}}<br /> [[Category:Sagitta| ]]<br /> [[Category:Constellations]]<br /> [[Category:Northern constellations]]<br /> [[Category:Constellations listed by Ptolemy]]<br /> [[Category:Articles containing video clips]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Altair&diff=1170910192 Altair 2023-08-17T23:06:30Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|Brightest star in the constellation Aquila}}<br /> {{About|a star in the Aquila constellation}}<br /> {{good article}}<br /> <br /> {{Starbox begin}}<br /> {{Starbox image<br /> | image=<br /> {{Location mark<br /> | image=Aquila constellation map.svg<br /> | float=center | width=260 | position=right<br /> | mark=Red circle.svg | mark_width=12 | mark_link=Altair (star)<br /> | x=440 | y=339<br /> }}<br /> | caption=Location of Altair (circled)<br /> }}<br /> {{Starbox observe<br /> | epoch = [[J2000.0]]<br /> | equinox = [[J2000.0]] ([[International Celestial Reference System|ICRS]])<br /> | constell = [[Aquila (constellation)|Aquila]]<br /> | pronounce = {{IPAc-en|'|æ|l|t|ɛər|}}, {{IPAc-en|'|æ|l|t|aɪər|}}&lt;ref&gt;{{cite web|url=http://www.oxforddictionaries.com/definition/american_english/altair|archive-url=https://web.archive.org/web/20140512142137/http://www.oxforddictionaries.com/definition/american_english/Altair|url-status=dead|archive-date=May 12, 2014|title=Altair: definition of Altair in Oxford dictionary (American English)}}&lt;/ref&gt;&lt;ref name=Kunitzsch&gt;{{cite book |last1=Kunitzsch |first1=Paul |last2=Smart |first2=Tim |date = 2006 |edition = 2nd rev. |title = A Dictionary of Modern star Names: A Short Guide to 254 Star Names and Their Derivations |publisher = Sky Pub |location = Cambridge, Massachusetts |isbn = 978-1-931559-44-7<br /> }}&lt;/ref&gt;<br /> | ra = {{RA|19|50|46.99855}}&lt;ref name=aaa474_2_653/&gt;<br /> | dec = {{DEC|+08|52|05.9563}}&lt;ref name=aaa474_2_653/&gt;<br /> | appmag_v = 0.76&lt;ref name=ducati&gt;{{cite journal|bibcode=2002yCat.2237....0D|title=VizieR Online Data Catalog: Catalogue of Stellar Photometry in Johnson's 11-color system|journal=CDS/ADC Collection of Electronic Catalogues|volume=2237|pages=0|last1=Ducati|first1=J.
R.|date=2002}}&lt;/ref&gt;<br /> }}<br /> {{Starbox character<br /> | type=[[Main sequence]]<br /> | class = A7V&lt;ref name=sb0/&gt;<br /> | u-b = +0.09&lt;ref name=ducati/&gt;<br /> | b-v = +0.22&lt;ref name=ducati/&gt;<br /> | v-r = +0.14&lt;ref name=ducati/&gt;<br /> | r-i = +0.13&lt;ref name=ducati/&gt;<br /> | variable = [[Delta Scuti variable|Delta Scuti]]&lt;ref name=&quot;Buzasi et al 2005&quot;/&gt;<br /> }}<br /> {{Starbox astrometry<br /> | radial_v = {{nowrap|−26.1 ± 0.9}}&lt;ref name=sb0/&gt;<br /> | prop_mo_ra = +536.23&lt;ref name=aaa474_2_653/&gt;<br /> | prop_mo_dec = +385.29&lt;ref name=aaa474_2_653/&gt;<br /> | parallax = 194.95<br /> | p_error = 0.57<br /> | parallax_footnote = &lt;ref name=aaa474_2_653/&gt;<br /> | absmag_v= 2.22&lt;ref name=&quot;Buzasi et al 2005&quot;/&gt;<br /> }}<br /> {{Starbox detail|<br /> | mass = {{val|1.86|0.03}}&lt;ref name=bouchaud2020/&gt;<br /> | radius = 1.57{{snd}}2.01&lt;ref name=bouchaud2020/&gt;&lt;ref group=nb name=rot/&gt;<br /> | gravity = 4.29&lt;ref name=aass85_3_1015/&gt;<br /> | rotation = 7.77 hours&lt;ref name=peterson06/&gt;<br /> | luminosity = 10.6&lt;ref name=peterson06/&gt;<br /> | temperature = 6,860{{snd}}8,621&lt;ref name=bouchaud2020/&gt;&lt;ref group=nb name=rot/&gt;<br /> | metal_fe = −0.2&lt;ref name=monnier07/&gt;<br /> | rotational_velocity = 242&lt;ref name=bouchaud2020/&gt;<br /> | age_myr = 100&lt;ref name=bouchaud2020/&gt;<br /> }}<br /> {{Starbox catalog<br /> | names = {{odlist | name=Atair | B=α&amp;nbsp;Aquilae, α&amp;nbsp;Aql, Alpha&amp;nbsp;Aquilae, Alpha&amp;nbsp;Aql | F=53&amp;nbsp;Aquilae, 53&amp;nbsp;Aql | BD=+08°4236 | FK5=745 | GCTP=4665.00 | GJ=768 | HD=187642 | HIP=97649 | HR=7557 | LFT=1499 | LHS=3490 | LTT=15795 | NLTT=48314 | SAO=125122 | WDS=19508+0852A }}&lt;ref name=sb0/&gt;&lt;ref name=bsc1/&gt;&lt;ref name=wds/&gt;<br /> }}<br /> {{Starbox reference<br /> | Simbad = alf+aql<br /> }}<br /> {{Starbox end<br /> }}<br /> <br /> '''Altair''' is the brightest [[star]] in the [[constellation]] of [[Aquila (constellation)|Aquila]] and the [[list of brightest stars|twelfth-brightest star]] in the [[night sky]]. It has the [[Bayer designation]] Alpha&amp;nbsp;Aquilae, which is [[Latinisation of names|Latinised]] from '''α&amp;nbsp;Aquilae''' and abbreviated '''Alpha&amp;nbsp;Aql''' or '''α&amp;nbsp;Aql'''. Altair is an [[A-type main-sequence star|A-type]] [[main-sequence star]] with an [[apparent visual magnitude]] of 0.77 and is one of the vertices of the [[Summer Triangle]] [[Asterism (astronomy)|asterism]]; the other two vertices are marked by [[Deneb]] and [[Vega]].&lt;ref name=sb0/&gt;&lt;ref name=darlingaltair/&gt;&lt;ref name=&quot;darlingsummer&quot;&gt;{{Cite web |last=Darling |first=David |title=Summer Triangle |url=http://www.daviddarling.info/encyclopedia/S/Summer_Triangle.html |access-date=2008-11-26 |website=www.daviddarling.info}}&lt;/ref&gt; It is located at a distance of {{convert|16.7|ly|pc|abbr=off|lk=on}} from the [[Sun]].&lt;ref name=&quot;schaaf2008&quot;&gt;{{Cite book |last=Hoboken |first=Fred Schaaf |title=The brightest stars : discovering the universe through the sky's most brilliant stars |publisher=John Wiley &amp; Sons, Inc. 
|year=2008 |isbn=978-0-471-70410-2 |location=New Jersey |pages= |oclc=440257051}}&lt;/ref&gt;{{Citation page|page=194}} Altair is currently in the [[G-cloud]]—a nearby [[interstellar cloud]], an accumulation of gas and dust.&lt;ref&gt;{{Cite web|url=http://interstellar.jpl.nasa.gov/interstellar/probe/introduction/neighborhood.html|title=Our Local Galactic Neighborhood|publisher=[[NASA]]|archive-url=https://web.archive.org/web/20131121061128/http://interstellar.jpl.nasa.gov/interstellar/probe/introduction/neighborhood.html|archive-date=2013-11-21|url-status=dead}}&lt;/ref&gt;&lt;ref&gt;{{Cite news|url=http://www.centauri-dreams.org/?p=14203|title=Into the Interstellar Void|last=Gilster|first=Paul|date=2010-09-01|work=Centauri Dreams|access-date=2017-03-26|language=en-US}}&lt;/ref&gt; <br /> <br /> Altair rotates rapidly, with a velocity at the [[equator]] of approximately 286&amp;nbsp;km/s.&lt;ref group=nb&gt;From values of ''v'' sin ''i'' and ''i'' in the second column of Table 1, Monnier et al. 2007.&lt;/ref&gt;&lt;ref name=monnier07/&gt; This is a significant fraction (roughly 71%) of the star's estimated breakup speed of 400&amp;nbsp;km/s.&lt;ref name=robrade2009/&gt; A study with the [[Palomar Testbed Interferometer]] revealed that Altair is not spherical, but is flattened at the poles due to its high rate of rotation.&lt;ref name=&quot;pti2001&quot;&gt;{{Cite journal|last1=Belle|first1=Gerard T. van|last2=Ciardi|first2=David R.|last3=Thompson|first3=Robert R.|last4=Akeson|first4=Rachel L.|last5=Lada|first5=Elizabeth A.|year=2001|title=Altair's Oblateness and Rotation Velocity from Long-Baseline Interferometry|url=http://stacks.iop.org/0004-637X/559/i=2/a=1155|journal=The Astrophysical Journal|language=en|volume=559|issue=2|pages=1155–1164|bibcode=2001ApJ...559.1155V|doi=10.1086/322340|s2cid=13969695 |issn=0004-637X}}&lt;/ref&gt; Other [[interferometric]] studies with multiple telescopes, operating in the [[infrared]], have imaged and confirmed this phenomenon.&lt;ref name=&quot;monnier07&quot;&gt;{{Cite journal|last2=Zhao|first2=M|last3=Pedretti|first3=E|last4=Thureau|first4=N|last5=Ireland|first5=M|last6=Muirhead|first6=P|last7=Berger|first7=J. P.|last8=Millan-Gabet|first8=R|last9=Van Belle|first9=G|year=2007|title=Imaging the surface of Altair|journal=Science|volume=317|issue=5836|pages=342–345|bibcode=2007Sci...317..342M|doi=10.1126/science.1143205|pmid=17540860|last1=Monnier|first1=J. D.|last10=Ten Brummelaar|first10=T|last11=McAlister|first11=H|last12=Ridgway|first12=S|last13=Turner|first13=N|last14=Sturmann|first14=L|last15=Sturmann|first15=J|last16=Berger|first16=D|arxiv = 0706.0867 |s2cid=4615273}} See second column of Table 1 for stellar parameters.&lt;/ref&gt;<br /> <br /> ==Nomenclature==<br /> [[File:AquilaCC.jpg|thumb|left|Altair is the brightest star in the constellation Aquila]]<br /> ''α Aquilae'' ([[Latinisation of names|Latinised]] to ''Alpha Aquilae'') is the star's [[Bayer designation]]. The traditional name ''Altair'' has been used since medieval times.
It is an abbreviation of the [[Arabic]] phrase {{lang|ar|النسر الطائر}} ''Al-Nisr Al-Ṭa'ir'', &quot;{{lang|en|the flying eagle}}&quot;.&lt;ref&gt;{{Cite web|url=https://www.dictionary.com/browse/altair|title=the definition of altair|website=www.dictionary.com|language=en|access-date=2018-09-30}}&lt;/ref&gt;<br /> <br /> In 2016, the [[International Astronomical Union]] organized a [[Working Group on Star Names]] (WGSN)&lt;ref name=&quot;WGSN&quot;&gt;{{cite web | url=https://www.iau.org/science/scientific_bodies/working_groups/280/ | title=IAU Working Group on Star Names (WGSN)|access-date=22 May 2016}}&lt;/ref&gt; to catalog and standardize proper names for stars. The WGSN's first bulletin of July 2016&lt;ref name=&quot;WGSN1&quot;&gt;{{cite web | url=http://www.pas.rochester.edu/~emamajek/WGSN/WGSN_bulletin1.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://www.pas.rochester.edu/~emamajek/WGSN/WGSN_bulletin1.pdf |archive-date=2022-10-09 |url-status=live | title=Bulletin of the IAU Working Group on Star Names, No. 1 |access-date=28 July 2016}}&lt;/ref&gt; included a table of the first two batches of names approved by the WGSN, which included ''Altair'' for this star. It is now so entered in the IAU Catalog of Star Names.&lt;ref name=&quot;IAU-CSN&quot;&gt;{{cite web | url=http://www.pas.rochester.edu/~emamajek/WGSN/IAU-CSN.txt | title=IAU Catalog of Star Names |access-date=28 July 2016}}&lt;/ref&gt;<br /> {{clear left}}<br /> <br /> ==Physical characteristics==<br /> [[File:Altair-Sun comparison.png|thumb|left|Altair in comparison with the Sun]]<br /> <br /> Along with [[β Aquilae]] and [[γ Aquilae]], Altair forms the well-known line of stars sometimes referred to as the ''Family of Aquila'' or ''Shaft of Aquila''.&lt;ref name=&quot;schaaf2008&quot; /&gt;{{Citation page|page=190}} <br /> <br /> Altair is a [[Type-A star|type-A]] [[main-sequence star]] with about 1.8 times the [[mass of the Sun]] and 11 times [[Solar luminosity|its luminosity]].&lt;ref name=monnier07/&gt;&lt;ref name=peterson06/&gt; It is thought to be a young star close to the [[zero age main sequence]] at about 100 million years old, although previous estimates gave an age closer to one billion years old.&lt;ref name=bouchaud2020/&gt; Altair rotates rapidly, with a rotational period of under eight hours;&lt;ref name=bouchaud2020/&gt; for comparison, the equator of the [[Sun]] makes a complete rotation in a little more than 25 days, but Altair's rotation is similar to, and slightly faster than, those of [[Jupiter]] and [[Saturn]]. Like those two planets, its rapid rotation forces the star to be [[oblate spheroid|oblate]]; its equatorial diameter is over 20 percent greater than its polar diameter.&lt;ref name=monnier07/&gt;<br /> <br /> [[File:AlphaAqlLightCurve.png|thumb|left|A [[light curve]] for Altair, adapted from Buzasi ''et al.'' (2005)&lt;ref name=&quot;Buzasi et al 2005&quot;/&gt;]]<br /> Satellite measurements made in 1999 with the [[Wide Field Infrared Explorer]] showed that the brightness of Altair fluctuates slightly, varying by just a few thousandths of a magnitude with several different periods less than 2 hours.&lt;ref name=&quot;Buzasi et al 2005&quot;/&gt; As a result, it was identified in 2005 as a [[Delta Scuti variable]] star. Its [[light curve]] can be approximated by adding together a number of [[sine wave]]s, with periods that range between 0.8 and 1.5 hours.&lt;ref name=&quot;Buzasi et al 2005&quot;&gt;{{Cite journal |last1=Buzasi |first1=D. L. |last2=Bruntt |first2=H. 
|last3=Bedding |first3=T. R. |last4=Retter |first4=A. |last5=Kjeldsen |first5=H. |last6=Preston |first6=H. L. |last7=Mandeville |first7=W. J. |last8=Suarez |first8=J. C. |last9=Catanzarite |first9=J. |date=February 2005 |title=Altair: The Brightest δ Scuti Star |journal=The Astrophysical Journal |language=en |volume=619 |issue=2 |pages=1072–1076 |arxiv=astro-ph/0405127 |bibcode=2005ApJ...619.1072B |doi=10.1086/426704 |s2cid=16524681 |issn=0004-637X}}&lt;/ref&gt; It is a weak source of [[stellar corona|coronal]] [[X-ray astronomy|X-ray]] emission, with the most active sources of emission being located near the star's equator. This activity may be due to [[Convection zone|convection]] cells forming at the cooler equator.&lt;ref name=robrade2009/&gt;<br /> {{clear left}}<br /> ===Rotational effects===<br /> [[Image:Altair_PR_image6_(white).jpg|thumb|left|Direct image of Altair, taken with the [[CHARA array]]]]<br /> The angular diameter of Altair was measured [[interferometrically]] by [[R. Hanbury Brown]] and his co-workers at [[Narrabri Stellar Intensity Interferometer|Narrabri Observatory]] in the 1960s. They found a diameter of 3{{nbsp}}[[milliarcseconds]].&lt;ref&gt;{{cite journal |bibcode=1967MNRAS.137..393H |title=The stellar interferometer at Narrabri Observatory-II. The angular diameters of 15 stars |last1=Hanbury Brown |first1=R. |last2=Davis |first2=J. |last3=Allen |first3=L. R. |last4=Rome |first4=J. M. |journal=Monthly Notices of the Royal Astronomical Society |year=1967 |volume=137 |issue=4 |page=393 |doi=10.1093/mnras/137.4.393 |doi-access=free }}&lt;/ref&gt; Although Hanbury Brown et al. realized that Altair would be rotationally flattened, they had insufficient data to experimentally observe its oblateness. Later, using [[infrared]] interferometric measurements made by the [[Palomar Testbed Interferometer]] in 1999 and 2000, Altair was found to be flattened. This work was published by [[Gerard van Belle|G. T. van Belle]], [[David Ciardi|David R. Ciardi]] and their co-authors in 2001.&lt;ref name=pti2001/&gt;<br /> <br /> Theory predicts that, owing to Altair's rapid rotation, its [[surface gravity]] and [[effective temperature]] should be lower at the equator, making the equator less luminous than the poles. This phenomenon, known as [[gravity darkening]] or the [[von Zeipel effect]], was confirmed for Altair by measurements made by the [[Navy Precision Optical Interferometer]] in 2001, and analyzed by Ohishi et al. (2004) and Peterson et al. (2006).&lt;ref name=peterson06/&gt;&lt;ref&gt;{{cite journal | doi = 10.1086/422422| title = Asymmetric Surface Brightness Distribution of Altair Observed with the Navy Prototype Optical Interferometer| year = 2004| last1 = Ohishi| first1 = Naoko| last2 = Nordgren| first2 = Tyler E.| last3 = Hutter| first3 = Donald J.| journal = The Astrophysical Journal| volume = 612| issue = 1| pages = 463–471| arxiv = astro-ph/0405301| bibcode = 2004ApJ...612..463O| s2cid = 15857535}}&lt;/ref&gt; Also, A. Domiciano de Souza et al. (2005) verified gravity darkening using the measurements made by the Palomar and Navy interferometers, together with new measurements made by the VINCI instrument at the [[VLTI]].&lt;ref&gt;{{cite journal | doi = 10.1051/0004-6361:20042476| title = Gravitational-darkening of Altair from interferometry| year = 2005| last1 = Domiciano de Souza| first1 = A. | last2 = Kervella| first2 = P.| last3 = Jankov| first3 = S.| last4 = Vakili| first4 = F.| last5 = Ohishi| first5 = N.| last6 = Nordgren| first6 = T. 
E.| last7 = Abe| first7 = L.| journal = Astronomy &amp; Astrophysics| volume = 442| issue = 2| pages = 567–578| bibcode = 2005A&amp;A...442..567D| doi-access = free}}&lt;/ref&gt;<br /> <br /> Altair is one of the few [[List of stars with resolved images|stars for which a direct image]] has been obtained.&lt;ref name=nsf&gt;{{cite press release |url=https://www.nsf.gov/news/news_summ.jsp?cntn_id=109612 |title=Gazing up at the Man in the Star? |publisher=[[National Science Foundation]] |date=May 31, 2007 |access-date=2022-08-03 }}&lt;/ref&gt; In 2006 and 2007, J. D. Monnier and his coworkers produced an image of Altair's surface from 2006 infrared observations made with the [[Michigan Infrared Combiner|MIRC]] instrument on the [[CHARA array]] interferometer; this was the first time the surface of any [[main-sequence star]], apart from the Sun, had been imaged.&lt;ref name=nsf/&gt; The false-color image was published in 2007. The equatorial radius of the star was estimated to be 2.03 [[solar radii]], and the polar radius 1.63 solar radii—a 25% increase of the stellar radius from pole to equator.&lt;ref name=monnier07/&gt; The polar axis is inclined by about 60° to the line of sight from the Earth.&lt;ref name=robrade2009/&gt;<br /> {{clear left}}<br /> <br /> ==Etymology, mythology and culture==<br /> [[Image:Altair.jpg|Altair|thumb|left]]<br /> The term ''Al Nesr Al Tair'' appeared in [[Al Achsasi al Mouakket]]'s catalogue, which was translated into [[Latin]] as ''Vultur Volans''.&lt;ref&gt;{{cite journal|last=Knobel|first= E. B.|title=Al Achsasi Al Mouakket, on a catalogue of stars in the Calendarium of Mohammad Al Achsasi Al Mouakket|journal=Monthly Notices of the Royal Astronomical Society|volume=55|issue= 8|pages=429–438|date=June 1895|bibcode=1895MNRAS..55..429K|doi=10.1093/mnras/55.8.429|doi-access=free}}&lt;/ref&gt; This name was applied by the Arabs to the [[asterism (astronomy)|asterism]] of Altair, [[β Aquilae]] and [[γ Aquilae]] and probably goes back to the ancient Babylonians and Sumerians, who called Altair &quot;the eagle star&quot;.&lt;ref name=&quot;Kunitzsch&quot; /&gt;{{Citation page|pages=17-18}} The spelling ''Atair'' has also been used.&lt;ref name=&quot;allen&quot;&gt;{{Cite book |last=Allen |first=Richard Hinckley |url=http://archive.org/details/bub_gb_5xQuAAAAIAAJ |title=Star-names and their meanings |publisher=New York, Leipzig [etc.] G.E. Stechert |others=unknown library |year=1899 |pages=59–60}}&lt;/ref&gt; Medieval [[astrolabe]]s of England and Western Europe depicted Altair and Vega as birds.&lt;ref&gt;{{Cite journal | last1 = Gingerich | first1 = O.| doi = 10.1111/j.1749-6632.1987.tb37197.x | title = Zoomorphic Astrolabes and the Introduction of Arabic Star Names into Europe | journal = Annals of the New York Academy of Sciences | volume = 500 | pages = 89–104 | year = 1987 | issue = 1|bibcode = 1987NYASA.500...89G | s2cid = 84102853}}&lt;/ref&gt;<br /> <br /> The [[Koori]] people of [[Victoria (Australia)|Victoria]] also knew Altair as ''Bunjil'', the [[wedge-tailed eagle]], and β and γ Aquilae are his two wives the [[black swan]]s. 
The people of the [[Murray River]] knew the star as ''Totyerguil''.&lt;ref name=&quot;mudrooroo1994&quot;&gt;''Aboriginal mythology: an A-Z spanning the history of aboriginal mythology from the earliest legends to the present day'', Mudrooroo, London: HarperCollins, 1994, {{ISBN|1-85538-306-3}}.&lt;/ref&gt;{{Citation page|page=4}} The Murray River was formed when ''Totyerguil'' the hunter speared ''Otjout'', a giant [[Murray cod]], who, when wounded, churned a channel across southern Australia before entering the sky as the constellation [[Delphinus]].&lt;ref name=&quot;mudrooroo1994&quot; /&gt;{{Citation page|page=115}}<br /> <br /> In Chinese belief, the asterism consisting of Altair, β Aquilae and γ Aquilae is known as ''Hé Gǔ'' ({{lang|zh|河鼓}}; lit. &quot;river drum&quot;).&lt;ref name=allen/&gt; The [[Chinese star names|Chinese name]] for Altair is thus ''Hé Gǔ èr'' ({{lang|zh|河鼓二}}; lit. &quot;river drum two&quot;, meaning the &quot;second star of the drum at the river&quot;).&lt;ref&gt;{{in lang|zh}} [http://www.lcsd.gov.hk/CE/Museum/Space/Research/StarName/c_research_chinengstars_ala_alz.htm 香港太空館 - 研究資源 - 亮星中英對照表] {{webarchive|url=https://web.archive.org/web/20081025110153/http://www.lcsd.gov.hk/CE/Museum/Space/Research/StarName/c_research_chinengstars_ala_alz.htm |date=2008-10-25 }}, Hong Kong Space Museum. Accessed on line November 26, 2008.&lt;/ref&gt; However, Altair is better known by its other names: ''Qiān Niú Xīng'' ({{lang|zh-hans|牵牛星}} / {{lang|zh-hant|牽牛星}}) or ''Niú Láng Xīng'' ({{lang|zh|牛郎星}}), translated as the ''cowherd star''.&lt;ref&gt;{{Cite book |last=Mayers |first=William Frederick |url=http://archive.org/details/chinesereadersm01mayegoog |title=The Chinese reader's manual: A handbook of biographical, historical ... |publisher=American Presbyterian Mission Press |others=Harvard University |year=1874 |pages=97–98, 161 |author-link=William Frederick Mayers}}&lt;/ref&gt;&lt;ref name=brown&gt;p. 72, ''China, Japan, Korea Culture and Customs: Culture and Customs'', Ju Brown and John Brown, 2006, {{ISBN|978-1-4196-4893-9}}.&lt;/ref&gt; These names are an allusion to a love story, ''[[The Cowherd and the Weaver Girl]]'', in which Niulang (represented by Altair) and his two children (represented by [[β Aquilae]] and [[γ Aquilae]]) are separated from Zhinu (represented by Vega), Niulang's wife and the children's mother, by the [[Milky Way]]. They are only permitted to meet once a year, when magpies form a bridge to allow them to cross the Milky Way.&lt;ref name=brown/&gt;&lt;ref&gt;pp. 105–107, ''Magic Lotus Lantern and Other Tales from the Han Chinese'', Haiwang Yuan and Michael Ann Williams, Libraries Unlimited, 2006, {{ISBN|978-1-59158-294-6}}.&lt;/ref&gt;<br /> <br /> The people of [[Micronesia]] called Altair ''Mai-lapa'', meaning &quot;big/old breadfruit&quot;, while the [[Māori people]] called this star ''Poutu-te-rangi'', meaning &quot;pillar of heaven&quot;.&lt;ref&gt;{{Cite book |last1=Ross |first1=Malcolm |url=https://books.google.com/books?id=bJFfm59fVr4C |title=The Lexicon of Proto-Oceanic: The Culture and Environment of Ancestral Oceanic Society. The physical environment.
Volume 2 |last2=Pawley |first2=Andrew |last3=Osmond |first3=Meredith |date=2007-03-01 |publisher=ANU E Press |isbn=978-1-921313-19-6 |page=175 |language=en}}&lt;/ref&gt;<br /> <br /> In Western [[astrology]], the star was ill-omened, portending danger from [[reptile]]s.&lt;ref name=allen/&gt;<br /> <br /> This star is one of the asterisms used by [[Bugis]] sailors for navigation, called ''bintoéng timoro'', meaning &quot;eastern star&quot;.&lt;ref name=&quot;kelley11&quot;&gt;{{cite book|author1=Kelley, David H. |author2=Milone, Eugene F. |author3=Aveni, A.F. |title=Exploring Ancient Skies: A Survey of Ancient and Cultural Astronomy|publisher=Springer|location=New York, New York|year=2011|page=344|isbn=978-1-4419-7623-9|url=https://books.google.com/books?id=ILBuYcGASxcC&amp;pg=PA307}}&lt;/ref&gt;<br /> <br /> NASA announced ''Altair'' as the name of the [[Lunar Surface Access Module]] (LSAM) on December 13, 2007.&lt;ref&gt;{{cite news |url=http://www.collectspace.com/news/news-121307a.html |title=NASA names next-gen lunar lander Altair |date=December 13, 2007 |website=.collectSPACE |access-date=2022-08-03 }}&lt;/ref&gt; The Russian-made [[Beriev Be-200]] Altair seaplane is also named after the star.&lt;ref&gt;{{cite press release |url=http://www.beriev.com/eng/Pr_rel_e/pr_58e.html |title=Results of the competition for the best personal names for the Be-103 and the Be-200 amphibious aircraft |publisher=[[Beriev Aircraft Company]] |date=February 12, 2003 |access-date=2022-08-03 |archive-date=2021-11-05 |archive-url=https://web.archive.org/web/20211105063444/http://www.beriev.com/eng/Pr_rel_e/pr_58e.html |url-status=dead }}&lt;/ref&gt;<br /> {{clear left}}<br /> ==Visual companions==<br /> The bright primary [[star]] has the [[multiple star]] designation [[Washington Double Star Catalog|WDS]]&amp;nbsp;19508+0852A and has several faint visual companion stars, WDS&amp;nbsp;19508+0852B, C, D, E, F and G.&lt;ref name=&quot;wds&quot; /&gt; All are much more distant than Altair and not physically associated.&lt;ref&gt;{{cite DR2}}&lt;/ref&gt;<br /> <br /> {{Componentbox begin<br /> |designation=[[Washington Double Star Catalog|WDS]] 19508+0852<br /> |footnote=&lt;ref name=&quot;wds&quot;&gt;Entry 19508+0852,<br /> [http://ad.usno.navy.mil/wds/Webtextfiles/wdsnewframe4.html The Washington Double Star Catalog] {{Webarchive|url=https://web.archive.org/web/20090131051103/http://ad.usno.navy.mil/wds/Webtextfiles/wdsnewframe4.html |date=2009-01-31 }},<br /> [[United States Naval Observatory]]. Accessed online November 25, 2008.&lt;/ref&gt;<br /> |expanded = yes<br /> |centered = yes<br /> }}<br /> {{Componentbox component<br /> |letter = B<br /> |primary = A<br /> |appmag_v = 9.8<br /> |epoch = 2015<br /> |posang= 286<br /> |angdist= 195.8<br /> |ra = {{RA|19|50|40.5}}<br /> |dec = {{DEC|+08|52|13}}<br /> |radec_footnote = &lt;ref name=&quot;sb1&quot;&gt;{{SIMBAD link|CCDM+19508%2B0852B|BD+08 4236B -- Star in double system}}, database entry, [[SIMBAD]]. Accessed online November 25, 2008.&lt;/ref&gt;<br /> |simbad = CCDM+19508%2B0852B<br /> |expanded = yes<br /> }}<br /> {{Componentbox component<br /> |letter = C<br /> |primary = A<br /> |appmag_v = 10.3<br /> |epoch = 2015<br /> |posang= 110<br /> |angdist= 186.4<br /> |ra = {{RA|19|51|00.8}}<br /> |dec = {{DEC|+08|50|58}}<br /> |radec_footnote = &lt;ref name=&quot;sb2&quot;&gt;{{SIMBAD link|CCDM+19508%2B0852C|BD+08 4238 -- Star in double system}}, database entry, [[SIMBAD]]. 
Accessed online November 25, 2008.&lt;/ref&gt;<br /> |simbad = CCDM+19508%2B0852C<br /> |expanded = yes<br /> }}<br /> {{Componentbox component<br /> |letter = D<br /> |primary = A<br /> |appmag_v = 11.9<br /> |epoch = 2015<br /> |posang= 105<br /> |angdist= 26.8<br /> |ra =<br /> |dec =<br /> |radec_footnote =<br /> |simbad =<br /> |expanded = yes<br /> }}<br /> {{Componentbox component<br /> |letter = E<br /> |primary = A<br /> |appmag_v = 11.0<br /> |epoch = 2015<br /> |posang= 354<br /> |angdist= 157.3<br /> |ra =<br /> |dec =<br /> |radec_footnote =<br /> |simbad =<br /> |expanded = yes<br /> }}<br /> {{Componentbox component<br /> |letter = F<br /> |primary = A<br /> |appmag_v = 10.3<br /> |epoch = 2015<br /> |posang= 48<br /> |angdist= 292.4<br /> |ra = {{RA|19|51|02.0}}<br /> |dec = {{DEC|+08|55|33}}<br /> |radec_footnote =<br /> |simbad = CCDM+19510%2B0856AB<br /> |expanded = yes<br /> }}<br /> {{Componentbox component<br /> |letter = G<br /> |primary = A<br /> |appmag_v = 13.0<br /> |epoch = 2015<br /> |posang= 121<br /> |angdist= 185.1<br /> |ra =<br /> |dec =<br /> |radec_footnote =<br /> |simbad =<br /> |expanded = yes<br /> }}<br /> {{Componentbox end}}<br /> <br /> ==See also==<br /> * [[Lists of stars]]<br /> * [[List of brightest stars]]<br /> * [[List of nearest bright stars]]<br /> * [[Historical brightest stars]]<br /> * [[List of most luminous stars]]<br /> <br /> ==Notes==<br /> {{reflist|group=nb|refs=<br /> <br /> &lt;ref name=rot&gt;Owing to its rapid rotation, Altair's radius is larger at its equator than at its poles; it is also cooler at the equator than at the poles.&lt;/ref&gt;<br /> }}<br /> <br /> ==References==<br /> {{Reflist|refs=<br /> <br /> &lt;ref name=bouchaud2020&gt;{{cite journal |bibcode=2020A&amp;A...633A..78B |title=A realistic two-dimensional model of Altair |last1=Bouchaud |first1=K. |last2=Domiciano De Souza |first2=A. |last3=Rieutord |first3=M. |last4=Reese |first4=D. R. |last5=Kervella |first5=P. |journal=Astronomy and Astrophysics |year=2020 |volume=633 |pages=A78 |doi=10.1051/0004-6361/201936830 |arxiv=1912.03138 |s2cid=208857428 }}&lt;/ref&gt;<br /> <br /> &lt;ref name=aaa474_2_653&gt;{{citation | last1=van Leeuwen | first1=F. | title=Validation of the new Hipparcos reduction | journal=Astronomy and Astrophysics | volume=474 | issue=2 |date=November 2007 | pages=653–664 | doi=10.1051/0004-6361:20078357 | bibcode=2007A&amp;A...474..653V | arxiv=0708.1752 | s2cid=18759600 }}&lt;/ref&gt;<br /> <br /> &lt;ref name=aass85_3_1015&gt;{{citation | last1=Malagnini | first1=M. L. | last2=Morossi | first2=C. | title=Accurate absolute luminosities, effective temperatures, radii, masses and surface gravities for a selected sample of field stars | journal=Astronomy and Astrophysics Supplement Series | volume=85 | issue=3 | pages=1015–1019 |date=November 1990 | bibcode=1990A&amp;AS...85.1015M }}&lt;/ref&gt;<br /> <br /> &lt;ref name=bsc1&gt;[http://webviz.u-strasbg.fr/viz-bin/VizieR-5?-out.add=.&amp;-source=V/50/catalog&amp;recno=7557 HR 7557], database entry, The Bright Star Catalogue, 5th Revised Ed. (Preliminary Version), D. Hoffleit and W. H. Warren, Jr., [[Centre de Données astronomiques de Strasbourg|CDS]] ID [http://vizier.u-strasbg.fr/viz-bin/Cat?V/50 V/50]. Accessed on line November 25, 2008.&lt;/ref&gt;<br /> <br /> &lt;ref name=sb0&gt;{{SIMBAD link|alf+aql|NAME ALTAIR -- Variable Star of delta Sct type}}, database entry, [[SIMBAD]]. 
Accessed on line November 25, 2008.&lt;/ref&gt;<br /> <br /> &lt;ref name=darlingaltair&gt;{{cite web |url=http://www.daviddarling.info/encyclopedia/A/Altair.html |title=Altair |website=The Internet Encyclopedia of Science |author=David Darling |access-date=2022-08-03 }}&lt;/ref&gt;<br /> <br /> &lt;ref name=peterson06&gt;{{cite journal | doi = 10.1086/497981| title = Resolving the Effects of Rotation in Altair with Long‐Baseline Interferometry| year = 2006| last1 = Peterson| first1 = D. M.| last2 = Hummel| first2 = C. A.| last3 = Pauls| first3 = T. A.| last4 = Armstrong| first4 = J. T.| last5 = Benson| first5 = J. A.| last6 = Gilbreath| first6 = G. C.| last7 = Hindsley| first7 = R. B.| last8 = Hutter| first8 = D. J.| last9 = Johnston| first9 = K. J.| last10 = Mozurkewich| first10 = D.| last11 = Schmitt| first11 = H.| display-authors=3 | journal = The Astrophysical Journal| volume = 636| issue = 2| pages = 1087–1097| arxiv = astro-ph/0509236| bibcode = 2006ApJ...636.1087P| s2cid = 18683397}} See Table 2 for stellar parameters.&lt;/ref&gt;<br /> <br /> &lt;ref name=robrade2009&gt;{{citation | title=Altair - the &quot;hottest&quot; magnetically active star in X-rays | last1=Robrade | first1=J. | last2=Schmitt | first2=J. H. M. M. | journal=Astronomy and Astrophysics | volume=497 | issue=2 | pages=511–520 | date=April 2009 | doi=10.1051/0004-6361/200811348 | bibcode=2009A&amp;A...497..511R | arxiv=0903.0966 | s2cid=14320453 | postscript=. }}&lt;/ref&gt;<br /> <br /> }}<br /> <br /> ==External links==<br /> {{Commons category|Altair}}<br /> * [https://web.archive.org/web/20090414054427/http://origins.jpl.nasa.gov/library/story/072501-a.html Star with Midriff Bulge Eyed by Astronomers], JPL press release, July 25, 2001.<br /> * [https://sl.voxastro.org/library/UVES-POP/details/?star=Altair Spectrum of Altair]<br /> * [http://www.astro.lsa.umich.edu/~monnier/Altair2007/altair2007.html Imaging the Surface of Altair], University of Michigan news release detailing the CHARA array direct imaging of the stellar surface in 2007.<br /> * [http://photojournal.jpl.nasa.gov/catalog/PIA04204 PIA04204: Altair], NASA. 
Image of Altair from the [[Palomar Testbed Interferometer]].<br /> * [http://www.solstation.com/stars/altair.htm Altair], ''SolStation''.<br /> * [http://news.bbc.co.uk/2/hi/science/nature/6709345.stm Secrets of Sun-like star probed], ''BBC News'', June 1, 2007.<br /> * [http://www.astromart.com/news/news.asp?news_id=697 Astronomers Capture First Images of the Surface Features of Altair], ''Astromart.com''<br /> * [http://aladin.u-strasbg.fr/AladinPreview?-c=NAME+ALTAIR&amp;ident=NAME+ALTAIR&amp;submit=Aladin+previewer Image of Altair] from [[Aladin Sky Atlas|Aladin]].<br /> <br /> {{Sky|19|50|46.9990|+|08|52|05.959|17}}<br /> {{Nearest systems|4}}<br /> {{Stars of Aquila}}<br /> {{Portal bar|Astronomy|Stars|Spaceflight|Outer space|Solar System}}<br /> {{Authority control}}<br /> {{DEFAULTSORT:Altair}}<br /> [[Category:Aquila (constellation)]]<br /> [[Category:A-type main-sequence stars]]<br /> [[Category:Multiple stars|4]]<br /> [[Category:Flamsteed objects|Aquilae, 53]]<br /> [[Category:Bayer objects|Aquilae, Alpha]]<br /> [[Category:Henry Draper Catalogue objects|187642]]<br /> [[Category:Hipparcos objects|097649]]<br /> [[Category:Bright Star Catalogue objects|7557]]<br /> [[Category:Delta Scuti variables]]<br /> [[Category:Stars with proper names|Altair]]<br /> [[Category:Durchmusterung objects|BD+08 4236]]<br /> [[Category:G-Cloud]]<br /> [[Category:Astronomical objects known since antiquity]]<br /> [[Category:Gliese and GJ objects|0768]]<br /> [[Category:TIC objects]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Mesoamerican_writing_systems&diff=1170909459 Mesoamerican writing systems 2023-08-17T22:59:19Z <p>205.189.94.9: </p> <hr /> <div>{{Use American English|date=March 2019}}<br /> {{Short description|One of three cradles of civilization thought to have developed writing independently}}<br /> {{More citations needed|date=January 2008}}<br /> [[Mesoamerica]], along with [[Cuneiform|Mesopotamia]] and [[Oracle bone script|China]], is one of three known places in the world where writing is thought to have [[Invention of writing|developed independently]].&lt;ref&gt;{{cite book |title=The Oxford Companion to Archaeology |date=2012 |publisher=Oxford University Press |isbn=9780195076189 |page=762 |url=https://books.google.com/books?id=ystMAgAAQBAJ&amp;pg=PA762}}&lt;/ref&gt; Mesoamerican scripts deciphered to date are a combination of [[logographic]] and [[syllabary|syllabic]] systems. They are often called hieroglyphs due to the iconic shapes of many of the glyphs, a pattern superficially similar to [[Egyptian hieroglyphs]]. Fifteen distinct writing systems have been identified in pre-Columbian Mesoamerica, many from a single inscription.&lt;ref&gt;{{cite book |last1=Macri |first1=Martha J. |title=&quot;Maya and Other Mesoamerican Scripts,&quot; in The World's Writing Systems |date=1996 |publisher=Oxford |location=England |pages=172–182}}&lt;/ref&gt; The limits of archaeological dating methods make it difficult to establish which was the earliest and hence the progenitor from which the others developed. The best documented and deciphered Mesoamerican writing system, and the most widely known, is the classic [[Maya script]]. Earlier scripts with poorer and varying levels of decipherment include the [[Olmec hieroglyphs]], the [[Zapotec script]], and the [[Isthmian script]], all of which date back to the 1st millennium BC. 
An extensive [[Mesoamerican literature]] has been conserved, partly in indigenous scripts and partly in postconquest transcriptions in the [[Latin script]]. Whether similar social and historical conditions led each of these regions to develop writing independently remains a matter of debate.<br /> <br /> == Pre-Classic and Classic Period ==<br /> In Mesoamerica, writing emerged during the [[Mesoamerican chronology|Pre-classic Period]], with Zapotec and Maya writing flourishing during the [[Mesoamerican chronology#Classic Period|Classic Period]].<br /> [[File:Cascajal-text.svg|thumb|right|upright|The 62 glyphs of the [[Cascajal block]]]]<br /> <br /> === Olmec writing ===<br /> {{Main|Olmec hieroglyphs}}<br /> <br /> Early [[Olmec]] ceramics show representations of something that may be codices, suggesting that [[amatl]] bark codices, and by extension well-developed writing, existed in Olmec times.{{citation needed|date=July 2008}} It was also long thought that many of the glyphs present on Olmec monumental sculpture, such as those on the so-called &quot;Ambassador Monument&quot; (La Venta Monument 13), represented an early Olmec script. This suspicion was reinforced in 2002 by the announcement of the discovery of [[San Andrés (Mesoamerican site)#Indications of an Olmec writing system|similar glyphs]] at [[San Andres (Mesoamerican site)|San Andres]].&lt;ref&gt;{{Cite journal|url=https://www.science.org/doi/10.1126/science.1078474|title=Olmec Origins of Mesoamerican Writing|journal=Science|date=6 December 2002|last1=Pohl|first1=Mary E. D.|last2=Pope|first2=Kevin O.|last3=Nagy|first3=Christopher von|volume=298|issue=5600|pages=1984–1987|doi=10.1126/science.1078474|pmid=12471256|s2cid=19494498}}&lt;/ref&gt;<br /> <br /> In September 2006, a report published in [[Science (journal)|''Science'' magazine]] announced the discovery of the [[Cascajal block]], a writing-tablet-sized block of [[Serpentinite|serpentine]] with 62 characters unlike any yet seen in Mesoamerica. This block was discovered by locals in the [[Olmec heartland]] and was dated by the archaeologists to approximately 900 [[Common Era|BCE]] based on other debris. If the authenticity and date can be verified, this will prove to be the earliest writing yet found in Mesoamerica.<br /> [[File:Monument 3, San Jose Mogote.JPG|thumb|upright|left|Monument&amp;nbsp;3 at San José Mogote. The two shaded [[glyph]]s between his legs are likely his name, Earthquake&amp;nbsp;1.]]<br /> <br /> === Zapotec writing ===<br /> {{Main|Zapotec writing}}<br /> <br /> Another candidate for earliest writing system in Mesoamerica is the writing system of the [[Zapotec civilization|Zapotec]] culture. Rising in the late [[Mesoamerican chronology|Pre-Classic era]] after the decline of the Olmec civilization, the Zapotecs of present-day Oaxaca built an empire around [[Monte Albán]]. On a few monuments at this archaeological site, archaeologists have found extended text in a glyphic script. Some signs can be recognized as calendric information but the script as such remains undeciphered. Read in columns from top to bottom, its execution is somewhat cruder than that of the later Classic Maya and this has led epigraphers to believe that the script was also less phonetic than the largely syllabic Maya script. These are, however, speculations.<br /> <br /> The earliest known monument with Zapotec writing is a &quot;Danzante&quot; stone, officially known as Monument&amp;nbsp;3, found in [[San José Mogote]], [[Oaxaca]].
It has a relief of what appears to be a dead and bloodied captive with two glyphic signs between his legs, probably representing his name. First dated to 500–600 BCE, this was earlier considered the earliest writing in Mesoamerica. However, doubts have been expressed as to this dating, and the monument may have been reused. The Zapotec script went out of use only in the late Classic period.<br /> [[File:La Mojarra Estela 1 (Escritura superior).jpg|thumb|upright|right|Detail showing glyphs from 2nd century CE [[La Mojarra Stela&amp;nbsp;1]]. The left column gives a Long Count date of 8.5.16.9.9, or 162 CE. The other columns are glyphs from the [[Epi-Olmec script]].]]<br /> <br /> === Epi-Olmec or Isthmian script ===<br /> {{Main|Epi-Olmec script}}<br /> <br /> A small number of artifacts found in the [[Isthmus of Tehuantepec]] show examples of another early Mesoamerican writing system. They can be seen to contain calendric information but are otherwise undeciphered. The longest of these texts are on [[La Mojarra Stela 1]] and the [[Tuxtla Statuette]]. The writing system used is very close to the Maya script, using affixal glyphs and Long Count dates, but is read only in one column at a time as is the Zapotec script. It has been suggested that this Isthmian or [[Epi-Olmec script]] is the direct predecessor of the Maya script, thus giving the Maya script a non-Maya origin. Another artifact with Epi-Olmec script is the Chiapa de Corzo stela, which is the oldest monument of the Americas inscribed with its own date: the [[Maya calendar#Long Count|Long Count]] on the stela dates it to 36 BCE.<br /> <br /> In a 1997 paper, John Justeson and Terrence Kaufman put forward a decipherment of Epi-Olmec. The following year, however, their interpretation was disputed by Stephen Houston and [[Michael D. Coe]], who unsuccessfully applied Justeson and Kaufman's decipherment system against epi-Olmec script from the back of a hitherto unknown mask. The matter remains under dispute.<br /> [[File:Abaj_Takalik_Stela_5.jpg|thumb|upright|left|Stela 5 from Abaj Takalik]]<br /> <br /> === Abaj Takalik and Kaminaljuyú scripts ===<br /> <br /> In the highland Maya archaeological sites of [[Abaj Takalik]] and [[Kaminaljuyú]], writing has been found dating to the [[Izapa]] culture. It is likely that in this area in late Pre-Classic times an ancient form of a [[Mixe–Zoquean languages|Mixe–Zoquean language]] was spoken, and the inscriptions found here may be in such a language rather than a Maya one. Some glyphs in this script are readable, as they are identical to Maya glyphs, but the script remains undeciphered. The advanced decay and destruction of these archaeological sites make it improbable that more monuments with these scripts will come to light and make a decipherment possible.<br /> [[File:Palenque glyphs-edit1.jpg|thumb|Maya glyphs in stucco at the ''Museo de sitio'' in [[Palenque]], Mexico|alt=]]<br /> <br /> === Maya writing ===<br /> {{Main|Maya script}}<br /> <br /> Maya writing is attested from the mid-preclassic period in the center of Petén in the Maya lowlands, and lately scholars have suggested that the earliest Maya inscriptions may in fact be the oldest in Mesoamerica. The earliest inscriptions in an identifiably Maya script date back to 200–300 BCE. Early examples include the painted inscriptions at the caves of [[Naj Tunich]] and La Cobanerita in [[El Petén]], [[Guatemala]].
The most elaborate inscriptions are considered to be those at classic sites like [[Palenque]], [[Copán]] and [[Tikal]].<br /> <br /> The [[Maya script]] is generally considered to be the most fully developed Mesoamerican writing system, mostly because of its extraordinary aesthetics and because it has been partially deciphered. In Maya writing, logograms and syllable signs are combined. Around 700 different glyphs have been documented, with some 75% having been deciphered. Around 7000 texts in Maya script have been documented.<br /> <br /> Maya writing first developed as only utilizing logograms, but later included the use of phonetic complements in order to differentiate between the semantic meanings of the logograms and for context that allows for syllabic spelling of words.&lt;ref&gt;{{cite journal |last1=Campbell |first1=L. |last2=Kaufman |first2=T. |date=1985 |title=Maya Linguistics: Where Are We Now? |journal=Annual Review of Anthropology |volume=14 |issue=1 |pages=187–198 |doi=10.1146/annurev.an.14.100185.001155}}&lt;/ref&gt;<br /> <br /> Post-classic inscriptions are found at the Yucatán peninsula in sites such as [[Chichén Itza]] and [[Uxmal]] but the style is not nearly as accomplished as the classic Maya inscriptions.<br /> <br /> === Other potential Mesoamerican writing systems ===<br /> Two other potential writing systems of the pre-classic period have been found in Mesoamerica: The Tlatilco cylinder seal was found during the time frame of the Olmec occupation of Tlatilco, and appears to contain a non-pictographic script. The Chiapa de Corzo cylinder seal found at that location in Mexico also appears to be an example of an unknown Mesoamerican script.&lt;ref&gt;{{cite journal |last1=Kelley |first1=David H. |title=A Cylinder Seal from Tlatilco |journal=American Antiquity |date=1966 |volume=31 |issue=5 |pages=744–46|doi=10.2307/2694503 |jstor=2694503}}&lt;/ref&gt;<br /> <br /> Certain iconographic elements in [[Teotihuacan]]o art have been considered as a potential script,&lt;ref name=&quot;:0&quot;&gt;Taube, Karl A. (2000), [http://www.mesoweb.com/bearc/caa/AA01.pdf The Writing System of Ancient Teotihuacan.] Center for Ancient American Studies, Barnardsville, NC.&lt;/ref&gt; although it is attested sparsely and in individual glyphs rather than texts. If it indeed is a writing system, it is &quot;one whose usage is non-textual and only restricted to naming people and places&quot;.&lt;ref name=&quot;:1&quot;&gt;{{Cite web|url=http://www.ancientscripts.com/teotihuacan.html|title=Ancient Scripts: Teotihuacan|website=www.ancientscripts.com|access-date=2020-04-23}}&lt;/ref&gt; In this aspect, it resembles later Central Mexican writing systems such as Mixtec and Aztec.&lt;ref name=&quot;:0&quot; /&gt;&lt;ref name=&quot;:1&quot; /&gt;<br /> <br /> == Post-Classic Period ==<br /> During the [[Mesoamerican chronology|post-classic period]], the Maya glyphic system continued to be used, but much less so. Other post-classic cultures such as the [[Aztec codices|Aztec]] did not have fully developed writing systems, but instead used [[semasiographic]] writing.<br /> <br /> === Mixtec writing ===<br /> {{Main|Mixtec writing}}<br /> [[File:Códice_Vindobonensis.jpg|thumb|Line 37 of the Códice Vindobonensis or Yuta Tnoho|alt=]]<br /> Mixtec writing emerged during the 13th century, much later than the systems previously mentioned. Mixtec is a [[Semasiography|semasiographic]] system that was used by the pre-Hispanic [[Mixtec]]s. 
Many of its characteristics were later adopted by the Mexica and Mixteca-Puebla writing systems. The origin of the Mixteca-Puebla is the subject of debate amongst experts. The [[Mixtec writing|Mixtec writing system]] consisted of a set of figurative signs and symbols that served as guides for storytellers as they recounted legends. These storytellers were usually priests and other members of the Mixtec upper class.<br /> <br /> Mixtec writing has been categorized as being a mixture of pictorial and logographic, rather than a complete logogram system.&lt;ref&gt;{{cite journal|last=Kubler|first=George|date=1974|title=Review of Picture Writing from Ancient Southern Mexico: Mixtec Place Signs and Maps|journal=American Anthropologist|volume=76|issue=3|pages=670–672|doi=10.1525/aa.1974.76.3.02a00840|jstor=674740}}&lt;/ref&gt;<br /> <br /> Mixtec writing has been preserved through various archaeological artifacts that have survived the passage of time and the destruction of the [[Spanish colonization of the Americas|Spanish conquest]].&lt;ref&gt;{{cite journal|last=Pohl|first=John M. D.|year=2005|title=The Griffin Fragment: A Mixtec Drinking Vessel Portraying the Pace Sign for 'Hill of the Turkey|journal=Record of the Art Museum, Princeton University|volume=64|pages=81–90}}&lt;/ref&gt; Among these objects are four pre-Hispanic [[Codex|codices]] written on tanned deer skin covered with [[stucco]]. These codices are read in [[boustrophedon]], a zigzag style in which the reader follows red lines that indicate the way to read.&lt;ref&gt;{{cite book|last=Jansen|first=Marten|title=Huisi Tacu. Estudio interpretativo de un libro mixteco antiguo. Codex Vindobonensis Mexicanus I|publisher=Centro de Estudios y Documentación Latinoamericanos|year=1982|location=Amsterdam}}&lt;/ref&gt; Most of the current knowledge about the writing of the Mixtecans is due to the work of [[Alfonso Caso]], who undertook the task of deciphering the code based on a set of pre-Columbian and colonial documents of the Mixtec culture.&lt;ref&gt;{{cite book|last=Fagan|first=Brian|title=The Great Archaeologists|publisher=Thames &amp; Hudson|year=2014|isbn=978-0-500-05181-8|location=New York|pages=110–114}}&lt;/ref&gt;<br /> <br /> Although the Mixtecs had a set of symbols that allowed them to record historical dates, they did not use the [[Mesoamerican Long Count calendar|long count calendar]] characteristic of other southeast Mesoamerican writing systems. Instead, the codices that have been preserved record historical events of this pre-Columbian people, especially those events related to expansionism in the era of [[Eight Deer Jaguar Claw|Ocho Venado]], lord of [[Tilantongo]].&lt;ref&gt;{{cite book|last=López Ramos|first=Juan Arturo|title=Esplendor de la antigua Mixteca|publisher=Editorial Trillas|year=1987|isbn=968-24-2613-8|location=México|pages=99–109}}&lt;/ref&gt;<br /> <br /> === Aztec writing ===<br /> {{main|Aztec writing}}<br /> [[File:aztlan codex boturini.jpg|thumb|Detail of first page from the Aztec ''Boturini Codex'', showing the use of semasiological writing combined with phonetic glyph elements.|alt=|left]]<br /> The Aztec writing system is adopted from writing systems used in Central Mexico. It is related to [[Mixtec writing]] and both are thought to descend from [[Zapotec writing]].&lt;ref&gt;{{cite journal|author=Justeson, John S.|author-link=John S. 
Justeson|date=February 1986|title=The Origin of Writing Systems: Preclassic Mesoamerica|url=http://history.missouristate.edu/chuchiak/template/Justeson.pdf|url-status=dead|format=online facsimile|journal=[[World Archaeology]]|location=London|publisher=[[Routledge &amp; Kegan Paul]]|volume=17|issue=3|pages=449|doi=10.1080/00438243.1986.9979981|issn=0043-8243|oclc=2243103|archive-url=https://web.archive.org/web/20091122144434/http://history.missouristate.edu/chuchiak/template/Justeson.pdf|archive-date=2009-11-22|access-date=2009-06-09}}&lt;/ref&gt; The [[Aztec codices|Aztecs]] used [[semasiographic]] writing, although they have been said to be slowly developing phonetic principles in [[Aztec writing|their writing]] by the use of the [[rebus]] principle. [[Aztec]] name glyphs for example, do combine logographic elements with phonetic readings.<br /> <br /> == Post-Columbian Period ==<br /> When Europeans arrived in the 16th century, they found several writing systems in use that drew from Olmec, Zapotec, and Teotihuacano traditions.&lt;ref&gt;Robert T. Jiménez, &amp; Patrick H. Smith. (2008). Mesoamerican Literacies: Indigenous Writing Systems and Contemporary Possibilities. Reading Research Quarterly, 43(1), 28.&lt;/ref&gt; Books and other written material were commonplace in Mesoamerica when [[Hernán Cortés]] arrived in 1519. Archaeologists have found inside elite Mayan homes personal objects inscribed with the owners' names. In public areas large stone pillars and inscribed monuments have been found clearly meant for the general public.&lt;ref&gt;M.R. Romero. (2003). Los zapotcos, la escritura y la historia [The Zapotecs, writing and history]. In M.A. Romero Frizzi (Ed.), ''Escritura zapoteca: 2,500 anos de historia'' [Zapotec writing:2500 years of history] Mexico, DF: Centro de Investigacion y Estudios Superiores en Antropologia Social.&lt;/ref&gt;<br /> <br /> Early post-Columbian sources preserve and document aspects of indigenous literature (e.g., [[Francisco Ximénez|Ximenez]]'s manuscript of the [[Popol Vuh#Father Ximénez's manuscript|Popol Vuh]]) and writing ([[Diego de Landa]]'s ''[[Relación de las cosas de Yucatán]]'' contained Maya calendar signs and a syllabary). As European Franciscan missionaries arrived they found that the Cholutecans used rebus principles as a way to translate information into Latin as a teaching aid for the Indians to learn Christian prayers.&lt;ref&gt;Mendieta, G.de (1971). Historia Eclesiastica Indiana [A religious History of the Indians]. Mexico, DF: Editorial Porrua (Original work published 1945)&lt;/ref&gt; A number of colonial-era [[Aztec codices]] are preserved, most notably the [[Codex Mendoza]], the [[Florentine Codex]], and the works by [[Diego Durán]]. Codex Mendoza (around 1541) is a mixed pictorial, alphabetic Spanish manuscript.&lt;ref&gt;Berdan, Frances, and Patricia Rieff Anawalt. The Codex Mendoza. 4 vols. Berkeley: University of California Press, 1992.&lt;/ref&gt; The [[Florentine Codex]], compiled 1545-1590 by Franciscan friar [[Bernardino de Sahagún]] includes a history of the [[Spanish conquest of the Aztec Empire]] from the Mexica viewpoint,&lt;ref&gt;Sahagún, Bernardino de. ''El Códice florentino: Manuscrito 218-20 de la Colección Palatina de la Biblioteca Medicea Laurenziana''. Fascimile ed., 3 vols. Florence: Giunti Barbera and México: Secretaría de Gobernación, 1979.&lt;/ref&gt; with bilingual Nahuatl/Spanish alphabetic text and illustrations by native artists.&lt;ref&gt;Sahagún, Bernardino de. 
''General History of the Things of New Spain: Florentine Codex.'' Translated by Arthur J. O. Anderson and Charles E. Dibble. 13 vols. Monographs of the School of American Research 14. Santa Fe: School of American Research; Salt Lake City: University of Utah, 1950–82.&lt;/ref&gt; There are also the works of the Dominican [[Diego Durán]] (before 1581), who drew on indigenous pictorials and living informants to create illustrated texts on history and religion.&lt;ref&gt;Durán, Diego. ''The History of the Indies of New Spain.'' Translated by Doris Heyden. Norman: University of Oklahoma Press, 1994. Durán, Diego. ''Book of the Gods and Rites and the Ancient Calendar''. Translated by Fernando Horcasitas and Doris Heyden. Norman: University of Oklahoma Press, 1971.&lt;/ref&gt; The colonial-era codices often contain Aztec pictograms or other pictorial elements.<br /> <br /> Later indigenous literature employed the Latin script exclusively, e.g., the Books of [[Chilam Balam]], which date from the 17th century onwards. Already by the mid-16th century, use of the Latin script for Mesoamerican languages seems to have been well established.&lt;ref name=&quot;:2&quot; /&gt; For writing Maya, colonial manuscripts conventionally adopt a number of special characters and diacritics thought to have been invented by Francisco de la Parra around 1545.&lt;ref name=&quot;:2&quot;&gt;Joseph DeChicchis (2012), Current Trends in Mayan Literacy, In: John C. Maher, Jelisava Dobovsek-Sethna, and Cary Duval (eds.), [http://lib.mainit.org/153/1/literacy-for-dialogue-in-multilingual-societies-2011.pdf Literacy for Dialogue in Multilingual Societies. Proceedings of Linguapax Asia Symposium 2011], Tokyo 2012, p. 71-82&lt;/ref&gt;&lt;ref&gt;''Francisco de la Parra'' [con [[Pedro de Betanzos]]]: ''Arte, pronunciación y orthographia de'' ... cakchequel. ms.&lt;/ref&gt; The original manuscript of the Popol Vuh is also dated to this period (but only indirectly, by its content). The first major work of Mayan literature known to have been written originally in Latin script is the [[Annals of the Cakchiquels]] (from 1571 onwards).&lt;ref name=&quot;:2&quot; /&gt;<br /> <br /> Since the mid-1990s, Maya intellectuals have attended workshops organized by [[Linda Schele]] to learn about Maya writing,&lt;ref&gt;{{Cite web|url=http://www.ancientscripts.com/ma_ws.html|title=Ancient Scripts: Mesoamerican Writing Systems|website=www.ancientscripts.com|access-date=2020-04-24}}&lt;/ref&gt; and with digital technologies, Maya writing may indeed see a revival.&lt;ref name=&quot;:2&quot; /&gt; Most notably, this includes work on the representation of Maya glyphs in [[Unicode]] since 2016 (not yet concluded by 2020).&lt;ref&gt;{{Cite web|url=https://linguistics.berkeley.edu/sei/progress-overview.html|title=progress overview|website=linguistics.berkeley.edu|access-date=2020-04-23}}&lt;/ref&gt; The goal of encoding Maya hieroglyphs in Unicode is to facilitate the ''modern'' use of the script. For representing the degree of flexibility and variation of ''classical'' Maya, the expressiveness of Unicode is insufficient (e.g., with respect to
the representation of infixes), so, for philological applications, different technologies are required.&lt;ref&gt;{{cite book|first1=Franziska|last1=Diehr|first2=Sven|last2=Gronemeyer|first3=Elisabeth|last3=Wagner|first4=Christian|last4=Prager|first5=Katja|last5=Diederichs|first6=Uwe|last6=Sikora|first7=Maximilian|last7=Brodhun|first8=Nikolai|last8=Grube|chapter-url= https://pdfs.semanticscholar.org/1965/0e9b45f8542d4b459e4a48d47d51d322fa32.pdf|chapter= Modelling vagueness-A criteria-based system for the qualitative assessment of reading proposals for the deciphering of Classic Mayan hieroglyphs|title=Proceedings of the Workshop on Computational Methods in the Humanities 2018 (COMHUM 2018)|date=2018|s2cid=67865187|archive-url=https://web.archive.org/web/20200208125552/https://pdfs.semanticscholar.org/1965/0e9b45f8542d4b459e4a48d47d51d322fa32.pdf|archive-date=2020-02-08|display-authors=1}}&lt;/ref&gt;&lt;ref&gt;{{Cite book|display-authors=1|first1=Christian|last1=Prager|first2=Nikolai|last2=Grube|first3=Maximilian|last3=Brodhun|first4=Katja|last4=Diederichs|first5=Franziska|last5=Diehr|first6=Sven|last6=Gronemeyer|first7=Elisabeth|last7=Wagner|url= https://www.degruyter.com/view/title/539945|title=Crossing Experiences in Digital Epigraphy: From Practice to Discipline|date=2018|publisher=De Gruyter|isbn=978-3-11-060720-8|editor-last=De Santis|editor-first=Annamaria|pages=65–83|language=en|chapter=The Digital Exploration of Maya Hieroglyphic Writing and Language|editor-last2=Rossi|editor-first2=Irene}}&lt;/ref&gt;<br /> <br /> == References ==<br /> {{Reflist}}<br /> * Michael D. Coe and Justin Kerr, ''The Art of the Maya Scribe'', Thames and Hudson. 1997.<br /> * Martinez, Ma. del Carmen Rodríguez; Ponciano Ortíz Ceballos; Michael D. Coe; Richard A. Diehl; Stephen D. Houston; Karl A. Taube; Alfredo Delgado Calderón; &quot;Oldest Writing in the New World&quot;, in ''Science'', 15 September 2006, {{abbr|vol.|volume}} 313, no. 5793, pp.&amp;nbsp;1610–1614.<br /> * Nielsen, Jesper, ''Under slangehimlen'', Aschehoug, Denmark, 2000.<br /> * Sampson, Geoffrey. ''Writing Systems: A Linguistic Introduction''. Hutchinson (London), 1985.<br /> <br /> == External links ==<br /> {{Portal|Indigenous peoples of the Americas|Writing|Society}}<br /> * [http://www.peabody.harvard.edu/node/24 Corpus of Maya Hieroglyphic Inscriptions Program, at the Peabody Museum of Archaeology and Ethnology, Harvard University.]<br /> * [http://www.hup.harvard.edu/results-list.php?collection=1210 Corpus of Maya Hieroglyphic Inscriptions, volumes 1–9. 
Published by the Peabody Museum Press and distributed by the Harvard University Press.]<br /> <br /> {{Pre-columbian cultures and civilizations}}<br /> {{List of writing systems}}<br /> <br /> [[Category:Mesoamerican writing systems| ]]<br /> [[Category:Logographic writing systems]]<br /> [[Category:Proto-writing]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Necho_II&diff=1170909279 Necho II 2023-08-17T22:57:28Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|Egyptian pharaoh}}<br /> {{Infobox pharaoh<br /> | Name=Necho II<br /> | Image=Necho-KnellingStatue_BrooklynMuseum.png<br /> | ImageSize=200px<br /> | Caption=A small kneeling bronze statuette, likely Necho&amp;nbsp;II, now residing in the [[Brooklyn Museum]]<br /> | NomenHiero=&lt;hiero&gt;n:E1-w&lt;/hiero&gt;<br /> | Nomen=''Necho''<br /> | PrenomenHiero=&lt;hiero&gt;ra-wHm-ib&lt;/hiero&gt;<br /> | Prenomen=''Wahemibre''<br /> | Golden=''Merynetjeru''<br /> | Nebty=''Maakheru''<br /> | Horus=''Maaib''<br /> | HorusHiero=&lt;hiero&gt;S32:ib&lt;/hiero&gt;<br /> | Reign=610&amp;ndash;595 BC<br /> | Died=595 BC<br /> | Predecessor=[[Psamtik I]]<br /> | Successor=[[Psamtik II]]<br /> | Alt=Nekau<br /> | Dynasty=[[Twenty-sixth dynasty of Egypt|26th&amp;nbsp;dynasty]]<br /> | Spouse=[[Khedebneithirbinet I]]<br /> }}<br /> '''Necho II'''&lt;ref&gt;Thomas Dobson. Encyclopædia: Or, A Dictionary of Arts, Sciences, and Miscellaneous Literature. Stone house, no. 41, South Second street, 1798. Page [https://books.google.com/books?id=8Pzg8T5RkGQC&amp;pg=PA785 785]&lt;/ref&gt; (sometimes '''Nekau''',&lt;ref&gt;A History of Egypt, from the XIXth to the XXXth Dynasties. By Sir William Matthew Flinders Petrie. [https://books.google.com/books?id=ycsSAAAAYAAJ&amp;pg=PA336 p336].&lt;/ref&gt; '''Neku''',&lt;ref&gt;The Historians' History of the World: Prolegomena; Egypt, Mesopotamia. Edited by Henry Smith Williams. p183.&lt;/ref&gt; '''Nechoh''',&lt;ref&gt;United States Exploring Expedition: Volume 15. By [[Charles Wilkes]], United States. Congress. [https://books.google.com/books?id=fPhKAAAAYAAJ&amp;pg=PA53 p53]&lt;/ref&gt; or '''Nikuu''';&lt;ref&gt;The Bibliotheca Sacra, Volume 45. Dallas Theological Seminary., 1888.&lt;/ref&gt; Greek: Νεκώς Β';&lt;ref&gt;Essay on the Hieroglyphic System of M. Champollion, Jun., and on the Advantages which it Offers to Sacred Criticism. By J. G. Honoré Greppo. [https://books.google.com/books?id=LYkBAAAAMAAJ&amp;pg=PA128 p128]&lt;/ref&gt;&lt;ref&gt;Herodotus 2,152. 2&lt;/ref&gt;&lt;ref&gt;W. Pape, &quot;Wörterbuch der griechischen Eigennamen&quot;, 1911&lt;/ref&gt; {{hebrew Name|נְכוֹ|Nəḵō|Neḵō}}) of [[Ancient Egypt|Egypt]] was a king of the [[26th Dynasty]] (610–595 BC), which ruled from [[Sais, Egypt|Sais]].&lt;ref&gt;{{Citation |title=The Ancient Fragments |editor-last1=Cory|editor-first1=Isaac Preston |publisher=William Pickering |place=London|year=1828|oclc=1000992106 }}, citing [[Manetho]], the high priest and scribe of Egypt, being by birth a Sebennyte, who wrote his history for [[Ptolemy Philadelphus]] (266 BCE – 228 BCE).&lt;/ref&gt; Necho undertook a number of construction projects across his kingdom.&lt;ref&gt;The history of Egypt By Samuel Sharpe. E. Moxon, 1852. Part 640. 
[https://books.google.com/books?id=5IBUAAAAYAAJ&amp;pg=PA138 p138].&lt;/ref&gt; In his reign, according to the [[Greece|Greek]] historian [[Herodotus]], Necho II sent out an expedition of Phoenicians, which in three years sailed from the [[Red Sea]] around [[Africa]] to the Strait of Gibraltar and back to Egypt.&lt;ref&gt;Herodotus (4.42)[https://penelope.uchicago.edu/Thayer/E/Roman/Texts/Herodotus/4B*.html#42]&lt;/ref&gt; His son, [[Psammetichus II]], upon succession may have removed Necho's name from monuments.&lt;ref&gt;The Popular Handbook of Archaeology and the Bible. Edited by Norman L. Geisler, Joseph M. Holden. p287.&lt;/ref&gt;<br /> <br /> This may be a derivation of [[Nebuchadnezzar]]. <br /> <br /> Necho played a significant role in the histories of the [[Neo-Assyrian Empire]], the [[Neo-Babylonian Empire]] and the [[Kingdom of Judah]]. Necho II is most likely the pharaoh Neco mentioned in 2 Kings, 2 Chronicles, and Jeremiah of the [[Bible]].&lt;ref&gt;Encyclopædia Britannica. Edited by [[Colin MacFarquhar]], [[George Gleig]]. [https://books.google.com/books?id=cthTAAAAYAAJ&amp;pg=PA785 p785]&lt;/ref&gt;&lt;ref&gt;The Holy Bible, According to the Authorized Version (A.D. 1611). Edited by [[Frederic Charles Cook]]. [https://books.google.com/books?id=n4UXAAAAYAAJ&amp;pg=PA131 p131]&lt;/ref&gt;&lt;ref&gt;see [[Hebrew Bible]] / [[Old Testament]]&lt;/ref&gt; The aim of the second of Necho's campaigns was Asiatic conquest,&lt;ref&gt;The temple of Mut in Asher. By [[Margaret Benson]], [[Janet Gourlay|Janet A. Gourlay]], [[Percy Edward Newberry]]. [https://books.google.com/books?id=N4RJAAAAMAAJ&amp;pg=PA276 p276]. (''cf''. Nekau's chief ambition lay in Asiatic conquest)&lt;/ref&gt;&lt;ref&gt;Egypt Under the Pharaohs: A History Derived Entirely from the Monuments. By [[Heinrich Brugsch]], [[George Charles Brodrick|Brodrick]]. [https://books.google.com/books?id=tJI5AQAAMAAJ&amp;pg=PA444 p444] (''cf''. Neku then attempted to assert the Egyptian supremacy in Asia.)&lt;/ref&gt; to contain the westward advance of the Neo-Babylonian Empire, and to cut off its trade route across the Euphrates. However, the [[Egyptians]] were defeated by the unexpected attack of the Babylonians and were eventually expelled from Syria.<br /> <br /> The [[Egyptologist]] [[Donald B. Redford]] observed that Necho II was &quot;a man of action from the start, and endowed with an imagination perhaps beyond that of his contemporaries, [who] had the misfortune to foster the impression of being a failure.&quot;&lt;ref&gt;Donald B. Redford, ''Egypt, Canaan, and Israel in Ancient Times'', (Princeton: Princeton University Press, 1992), pp. 447–48.&lt;/ref&gt;<br /> <br /> ==Biography==<br /> {{see also|Twenty-sixth Dynasty of Egypt family tree}}<br /> ===Lineage and early life ===<br /> Necho II was the son of [[Psammetichus I]] by his [[Great Royal Wife]] Mehtenweskhet. His prenomen or royal name Wahem-Ib-Re means &quot;Carrying out [the] Heart (i.e., Wish) [of] [[Ra|Re]].&quot;&lt;ref&gt;Peter Clayton, Chronicle of the Pharaohs, Thames and Hudson, 1994. p.195&lt;/ref&gt; Upon his accession, Necho was faced with the chaos created by the raids of the [[Cimmerians]] and the [[Scythians]], who had not only ravaged Asia west of the Euphrates, but had also helped the Babylonians shatter the Assyrian Empire. That once-mighty empire was now reduced to the troops, officials, and nobles who had gathered around a general holding out at [[Harran]], who had taken the throne name of [[Ashur-uballit II]].
Necho attempted to assist this remnant immediately upon his coronation, but the force he sent proved to be too small, and the combined armies were forced to retreat west across the Euphrates.{{Citation needed|date=July 2020}}<br /> <br /> ===Military campaigns===<br /> <br /> ====First campaign====<br /> In the spring of 609 BC, Necho personally led a sizable force to help the Assyrians. At the head of a large army, consisting mainly of his mercenaries, Necho took the coast route [[Via Maris]] into [[Syria]], supported by his Mediterranean fleet along the shore, and proceeded through the low tracts of Philistia and Sharon. At [[Megiddo (place)|Megiddo]] (according to 2 Kings 23) he met the Judean king, [[Josiah]], who likely was called to swear a vassal’s oath to the new pharaoh, and during the meeting had the Judean king killed (an alternative version, in 2 Chronicles 35, of a battle in Megiddo is accepted as ahistorical by modern scholars).&lt;ref&gt;Bernd Schipper, 2010, ''Egypt and the Kingdom of Judah under Josiah and Jehoiakim, p. 218''&lt;/ref&gt; <br /> <br /> [[Herodotus]] reports the campaign of the pharaoh in his ''[[Histories (Herodotus)|Histories]], Book 2:159'':<br /> <br /> {{cquote|Necos, then, stopped work on the canal and turned to war; some of his triremes were constructed by the northern sea, and some in the Arabian Gulf ([[Red Sea]]), by the coast of the Sea of Erythrias. The windlasses for beaching the ships can still be seen. He deployed these ships as needed, while he also engaged in a pitched battle at Magdolos with the Syrians, and conquered them; and after this he took Cadytis ([[Kadesh (Syria)|Kadesh]]), which is a great city of Syria. He sent the clothes he had worn in these battles to the [[Didyma|Branchidae]] of Miletus and dedicated them to Apollo.}}<br /> <br /> Necho soon captured Kadesh on the Orontes and moved forward, joining forces with Ashur-uballit and together they crossed the Euphrates and laid siege to Harran. Although Necho became the first [[pharaoh]] to cross the Euphrates since [[Thutmose III]], he failed to capture Harran, and retreated back to northern [[Syria]]. At this point, Ashur-uballit vanished from history, and the Assyrian Empire was conquered by the Babylonians.<br /> [[File:Tel megido.JPG|right|thumb| Aerial view of [[Tel Megiddo]] site of the [[battle of Megiddo (609 BC)|battle of Megiddo in 609 BC]].]]<br /> <br /> Leaving a sizable force behind, Necho returned to [[Ancient Egypt|Egypt]]. On his return march, he found that the Judeans had selected [[Jehoahaz of Judah|Jehoahaz]] to succeed his father Josiah, whom Necho deposed and replaced with [[Jehoiakim]].&lt;ref&gt;II. Chronicles by [[Philip Chapman Barker]]. [https://books.google.com/books?id=p84UAAAAYAAJ&amp;pg=PA447 p447]–448&lt;/ref&gt; He brought Jehoahaz back to Egypt as his prisoner, where Jehoahaz ended his days (2 Kings 23:31; 2 Chronicles 36:1–4).<br /> <br /> ====Second campaign====<br /> [[File:Battle of Carchemish.png|thumb|In 605 BC, an Egyptian force fought the Babylonians at [[Battle of Carchemish]], helped by the remnants of the army of the former Assyria, but this was met with defeat.]]<br /> The Babylonian king was planning on reasserting his power in Syria. In 609 BC, King [[Nabopolassar]] captured [[Kummuh|Kumukh]], which cut off the Egyptian army, then based at Carchemish. Necho responded the following year by retaking Kumukh after a [[Siege of Kimuhu|four-month siege]], and executed the Babylonian garrison. 
Nabopolassar gathered another army, which camped at [[Qurumati]] on the Euphrates. However, Nabopolassar's poor health forced him to return to [[Babylon]] in 605 BC. In response, in 606 BC the Egyptians [[Battle of Quramati|attacked]] the leaderless Babylonians (probably then led by the crown prince Nebuchadrezzar) who fled their position.{{Citation needed|date=July 2020}}<br /> <br /> At this point, the aged Nabopolassar passed command of the army to his son [[Nebuchadnezzar II]], who led them to a decisive victory over the Egyptians at [[Battle of Carchemish|Carchemish]] in 605 BC, and pursued the fleeing survivors to [[Hamath]]. Necho's dream of restoring the Egyptian Empire in the Middle East as had occurred under the [[New Kingdom of Egypt|New Kingdom]] was destroyed as Nebuchadnezzar conquered Egyptian territory from the Euphrates to the [[Brook of Egypt]] ([[Book of Jeremiah|Jeremiah]] 46:2; [[Books of Kings|2 Kings]] 23:29) down to [[Judea]]. Although Nebuchadnezzar spent many years in his new conquests on continuous pacification campaigns, Necho was unable to recover any significant part of his lost territories. For example, when [[Ashkalon]] rose in revolt, despite repeated pleas the Egyptians sent no help, and were barely able to repel a [[Battle of Migdol (601 BC)|Babylonian attack]] on their eastern border in 601 BC. When he did repel the Babylonian attack, Necho managed to capture Gaza while pursuing the enemy. Necho turned his attention in his remaining years to forging relationships with new allies: the [[Caria]]ns, and further to the west, the [[ancient Greeks|Greeks]].{{Citation needed|date=July 2020}}<br /> <br /> ===Ambitious projects===<br /> {{See also|Suez Canal#History}}<br /> At some point during his Syrian campaign, Necho II initiated but never completed the ambitious project of cutting a navigable [[canal]] from the [[Pelusium|Pelusiac]] branch of the [[Nile]] to the [[Red Sea]]. [[Canal of the Pharaohs|Necho's Canal]] was the earliest precursor of the [[Suez Canal]].&lt;ref&gt;Redmount, Carol A. &quot;The Wadi Tumilat and the &quot;Canal of the Pharaohs&quot;&quot; ''Journal of Near Eastern Studies'', Vol. 54, No. 2 (April , 1995), pp. 127-135&lt;/ref&gt; It was in connection with a new activity that Necho founded a new city of ''Per-Temu Tjeku'' which translates as 'The House of [[Atum]] of Tjeku' at the site now known as [[Tell el-Maskhuta]],&lt;ref&gt;Shaw, Ian; and Nicholson, Paul. The Dictionary of Ancient Egypt. The British Museum Press, 1995. p.201&lt;/ref&gt; about 15&amp;nbsp;km west of [[Ismailia]]. The waterway was intended to [[Ancient Egyptian trade|facilitate trade between the Mediterranean Sea and the Indian Ocean]].<br /> <br /> Necho also formed an Egyptian navy by recruiting displaced Ionian Greeks. This was an unprecedented act by the pharaoh since most Egyptians had traditionally harboured an inherent distaste for and fear of the sea.&lt;ref&gt;Peter Clayton, Chronicle of the Pharaohs, Thames and Hudson, 1994, p.196&lt;/ref&gt; The navy which Necho created operated along both the Mediterranean and Red Sea coasts.&lt;ref&gt;[[Herodotus]] 2.158; [[Pliny the Elder|Pliny]] N.H. 6.165ff; [[Diodorus Siculus]] 3.43&lt;/ref&gt; Necho II constructed warships,&lt;ref&gt;The Cambridge Ancient History. Edited by John Boardman, N. G. L. Hammond. p49&lt;/ref&gt; including questionably [[triremes]].&lt;ref&gt;''Carthage Must Be Destroyed: The Rise and Fall of an Ancient Civilization''. By Richard Miles. Penguin, Jul 21, 2011. 
p1781&lt;/ref&gt;<br /> <br /> ===Phoenician expedition===<br /> [[File:Herodotus5m1.jpg|thumb|right|The world according to [[Herodotus]], 440 BC]]<br /> [[File:PtolemyWorldMap.jpg|thumb|right|A 15th-century depiction of the Ptolemy world map, reconstituted from Ptolemy's Geographia (c. 150)]]<br /> At some point between 610 and before 594 BC, Necho reputedly commissioned an expedition of [[Phoenicians]],&lt;ref&gt;Unlikely with the intent of circumnavigating Africa, but for finding an alternative route to Asia than through the area near the [[Levant]]. Also, such voyages were undertaken for trading with more southern African cities; thereafter being blown off-course, if not tasked to sail around the lands.&lt;/ref&gt; who it is said in three years sailed from the Red Sea around Africa back to the mouth of the Nile; and would thereby be the first completion of the [[Cape Route]].&lt;ref&gt;''Israel, India, Persia, Phoenicia, Minor Nations of Western Asia''. Edited by [[Henry Smith Williams]]. [https://books.google.com/books?id=vPULAAAAYAAJ&amp;pg=PA118 p118]&lt;/ref&gt;&lt;ref&gt;Anthony Tony Browder, Nile valley contributions to civilization,Volume 1. 1992 (''cf''. In the Twenty Fifth Dynasty, during the reign of Necho II, navigational technology had advanced to the point where sailors from Kemet successfully circumnavigated Africa and drew an extremely accurate map of the continent.)&lt;/ref&gt; Herodotus' account was handed down to him by [[oral tradition]],&lt;ref&gt;M. J. Cary. ''The Ancient Explorers''. Penguin Books, 1963. Page 114&lt;/ref&gt; but is seen as potentially credible because he stated with disbelief that the Phoenicians &quot;as they sailed on a westerly course round the southern end of Libya (Africa), they had the sun on their right&quot;—to northward of them (''The Histories'' 4.42).&lt;ref&gt;As for Libya, we know it to be washed on all sides by the sea, except where it is attached to Asia. This discovery was first made by Necos, the Egyptian king, who on desisting from the canal which he had begun between the Nile and the Arabian gulf (referring to the Red Sea), sent to sea a number of ships manned by Phoenicians, with orders to make for the Pillars of Hercules, and return to Egypt through them, and by the Mediterranean. The Phoenicians took their departure from Egypt by way of the Erythraean sea, and so sailed into the southern ocean. When autumn came, they went ashore, wherever they might happen to be, and having sown a tract of land with corn, waited until the grain was fit to cut. Having reaped it, they again set sail; and thus it came to pass that two whole years went by, and it was not till the third year that they doubled the Pillars of Hercules, and made good their voyage home. On their return, they declared—I for my part do not believe them, but perhaps others may—that in sailing round Libya they had the sun upon their right hand. In this way was the extent of Libya first discovered. {{cite wikisource |chapter=Book 4 |wslink=History of Herodotus |plaintitle=History of Herodotus}}&lt;/ref&gt; Pliny reported that [[Hanno the Navigator|Hanno]] had circumnavigated Africa, which may have been a conflation with Necho's voyage, while [[Strabo]], [[Polybius]], and [[Ptolemy]] doubted the description;&lt;ref&gt;''The Geographical system of Herodotus'' by James Rennel. 
[https://books.google.com/books?id=6C0waiOScrEC&amp;pg=PA348 p348]+&lt;/ref&gt; [[History of geography#Greco-Roman world|at the time it was not generally known that Africa was surrounded by an ocean]] (with the southern part of Africa being thought connected to Asia).&lt;ref&gt;''Die umsegelung Asiens und Europas auf der Vega''. Volume 2. By Adolf Erik Nordenskiöld. [https://books.google.com/books?id=8-SfAAAAMAAJ&amp;pg=PA148 p148]&lt;/ref&gt; [[F. C. H. Wendel]], writing in 1890, concurred with Herodotus&lt;ref&gt;''History of Egypt''. By [[F. C. H. Wendel]]. American Book Co., 1890. [https://books.google.com/books?id=9MsXAAAAIAAJ&amp;pg=PA127 p127] (''cf''. Herodotus relates a story of a great maritime enterprise undertaken at this time which seems quite credible. He states that Nekau sent out Phoenician ships from the Red Sea to circumnavigate Africa, and that in the third year of their journey they returned to the Mediterranean through the Straits of Gibraltar.)&lt;/ref&gt; as did [[James Baikie]].&lt;ref&gt;''The Story of the Pharaohs''. By [[James Baikie]]. [https://books.google.com/books?id=TSswAAAAYAAJ&amp;pg=PA316 p316]&lt;/ref&gt; Egyptologist [[Alan B. Lloyd|A. B. Lloyd]] disputed in 1977 that an Egyptian Pharaoh would authorize such an expedition,&lt;ref&gt;Lloyd, Alan B. &quot;Necho and the Red Sea:Some Considerations&quot; ''Journal of Egyptian Archaeology'', Vol. 63, No. 2 (April , 1995), pp. 142-155 https://www.jstor.org/stable/3856314?read-now=1&amp;seq=1#metadata_info_tab_contents&lt;/ref&gt;&lt;ref&gt;Lloyd is to hold the position that [[History of geography|geographical knowledge at the time]] of [[Herodotus|Herodutus]] was such that Greeks would know that such a voyage would entail the sun being on their right but did not believe Africa could extend far enough for this to happen. He suggests that the Greeks at this time understood that anyone going south far enough and then turning west would have the sun on their right but found it unbelievable that Africa reached so far south. He wrote: &quot;Given the context of [[Ancient Egyptian philosophy|Egyptian thought]], [[Ancient Egyptian trade|economic life]], and [[Military of ancient Egypt|military]] interests, it is impossible for one to imagine what stimulus could have motivated Necho in such a scheme and if we cannot provide a reason which is sound within Egyptian terms of reference, then we have good reason to doubt the historicity of the entire episode.&quot; Alan B. Lloyd, &quot;Necho and the Red Sea: Some Considerations&quot;, ''Journal of Egyptian Archaeology'', 63 (1977) p.149.&lt;/ref&gt; except for the reasons of Asiatic conquest&lt;ref&gt;''Twentieth Century''. Twentieth century, 1908. [https://books.google.com/books?id=bnDME1BzBEoC&amp;pg=PA816 p816]&lt;/ref&gt;&lt;ref&gt;'The Historians' History of the World''. Edited by Henry Smith Williams. [https://books.google.com/books?id=BKQ-AAAAYAAJ&amp;pg=PA286 p286] (''cf''. Syria seems to have submitted to him, as far as the countries bordering the Euphrates. Gaza offered resistance, but was taken. But it was only for a short time that Neku II could feel himself a conqueror.)&lt;/ref&gt; and trade in the [[Ancient maritime history|ancient maritime routes]].&lt;ref&gt;''Cosmos: A Sketch of a Physical Description of the Universe''. By [[Alexander von Humboldt]]. [https://books.google.com/books?id=M_W_AAAAIAAJ&amp;pg=PA489 p489]&lt;/ref&gt;&lt;ref&gt;''The Cambridge History of the British Empire''. CUP Archive, 1963. 
[https://books.google.com/books?id=ISg9AAAAIAAJ&amp;pg=PA56 p56]&lt;/ref&gt;<br /> <br /> ===Death and succession===<br /> Necho II died in 595 BC and was succeeded by his son, [[Psamtik II]], as the next pharaoh of Egypt. Psamtik II, however, apparently removed Necho's name from almost all of his father's monuments for unknown reasons. However, some scholars, such as Roberto Gozzoli, express doubt that this actually happened, arguing that the evidence for this is fragmentary and rather contradictory.&lt;ref&gt;Gozzoli, R. B. (2000), [https://www.academia.edu/353991/The_Statue_BM_EA_37891_and_the_Erasure_of_Necho_IIs_Names ''The Statue BM EA 37891 and the Erasure of Necho II's Names''] Journal of Egyptian Archaeology 86: 67–80&lt;/ref&gt;<br /> <br /> ==Further reading==<br /> ;Pre-1900s<br /> * ''Encyclopædia'', &quot;[https://books.google.com/books?id=8Pzg8T5RkGQC&amp;pg=PA785 Necho]&quot;. Thomas Dobson, at the Stone house, no. 41, South Second street, 1798. p785.<br /> * ''Pantologia'', &quot;[https://books.google.com/books?id=akIKAQAAMAAJ&amp;pg=PT372 Necho]&quot;. J. Walker, 1819. p372.<br /> * ''Journal of the Royal Asiatic Society of Great Britain and Ireland'', Volume 15. [https://books.google.com/books?id=z_kAAAAAYAAJ&amp;pg=PA430 p430].<br /> * ''Essay on the Hieroglyphic System of M. Champollion, Jun., and on the Advantages which it Offers to Sacred Criticism''. By J. G. Honoré Greppo. [https://books.google.com/books?id=LYkBAAAAMAAJ&amp;pg=PA128 p128]–129.<br /> * [https://books.google.com/books?id=9_ULAAAAYAAJ ''Prolegomena; Egypt, Mesopotamia'']. Edited by Henry Smith Williams.<br /> <br /> ;Post-1900s<br /> * Petrie 1905. W.M. Flinders Petrie. [https://babel.hathitrust.org/cgi/pt?id=njp.32101076207016;view=1up;seq=7 ''A History of Egypt''.] From the XIXth to the XXXth Dynasties. London. See: [https://babel.hathitrust.org/cgi/pt?id=njp.32101076207016;view=1up;seq=361 Nekau II, pp. 335–339.]<br /> * [[Max Cary]], [[Eric Herbert Warmington]]. ''The Ancient Explorers''. Methuen &amp; Company, Limited, 1929.<br /> * Peter Clayton (1994). ''Chronicle of the Pharaohs'', Thames and Hudson.<br /> * Arnold 1999. Dieter Arnold. ''Temples of the Last Pharaos''. New York/Oxford<br /> <br /> ==See also==<br /> *[[Necho (crater)]]<br /> *[[Hanno the Navigator]]<br /> *[[List of biblical figures identified in extra-biblical sources]]<br /> *[[Ancient Egyptian trade]]<br /> *[[Bible and history]]<br /> *[[Land of Punt]]<br /> <br /> ==References==<br /> ;General information<br /> *Budge, E. A. W. (1894). [https://books.google.com/books?id=J0tCAAAAIAAJ The mummy: Chapters on Egyptian funereal archaeology]. Cambridge [England]: University Press. [https://books.google.com/books?id=J0tCAAAAIAAJ&amp;pg=PA56 page 56+].<br /> * Budge, E. A. W. (1904). [https://books.google.com/books?id=U9ajVYHEpi4C A history of Egypt from the end of the Neolithic period to the death of Cleopatra VII, B.C. 30]. Books on Egypt and Chaldaea, v. 9-16. London: K. Paul, Trench, Trübner &amp; Co. [https://books.google.com/books?id=U9ajVYHEpi4C&amp;pg=PA218 Page218+].<br /> * [http://www.swan.ac.uk/staff/academic/artshumanities/oth/lloydalan/ Alan B. Lloyd], &quot;Necho and the Red Sea: Some Considerations&quot;, Journal of Egyptian Archaeology, 63 (1977).<br /> *[https://books.google.com/books?id=8DiTX_EsWasC Herodotus] By Alan B. Lloyd. 
BRILL, 1988.<br /> ;Footnotes<br /> {{Reflist|2}}<br /> <br /> ==External articles==<br /> ;Expedition<br /> *[http://www.touregypt.net/featurestories/nechoafrica.htm Necho II's African Circumnavigation] <br /> *[http://www.reshafim.org.il/ad/egypt/economy/ Ancient Egyptian economy]. www.reshafim.org.il (Maritime economy)<br /> ;Other<br /> * [http://formerthings.com/necho.htm Necho Pharaoh of Egypt]. Egyptian History Archaeology and the Bible.<br /> *[http://www.digitalegypt.ucl.ac.uk/chronology/nekauii.html Nekau (II) Wehemibre]., digitalegypt.ucl.ac.uk<br /> <br /> {{Pharaohs}}<br /> <br /> {{Authority control}}<br /> <br /> {{DEFAULTSORT:Necho II}}<br /> [[Category:Necho II| ]]<br /> [[Category:7th-century BC births]]<br /> [[Category:595 BC deaths]]<br /> [[Category:7th-century BC Pharaohs]]<br /> [[Category:6th-century BC Pharaohs]]<br /> [[Category:Pharaohs of the Twenty-sixth Dynasty of Egypt]]<br /> [[Category:Year of birth unknown]]<br /> [[Category:Exploration of Africa]]<br /> [[Category:Pharaohs in the Bible]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Penult&diff=1170908970 Penult 2023-08-17T22:54:29Z <p>205.189.94.9: </p> <hr /> <div>{{short description|Second to last syllable of a word}}<br /> {{wiktionary|penultimate}}<br /> <br /> '''Penult''' is a linguistics term for the second-to-last [[syllable]] of a word. It is an abbreviation of ''penultimate'', which describes the next-to-last item in a [[series]]. The penult follows the antepenult and precedes the [[ultima (linguistics)|ultima]]. For example, the main stress falls on the penult in such English words as ''banána'', and ''Mississíppi'', and just about all words ending in ''-ic'' such as ''músic'', ''frántic'', and ''phonétic''. Occasionally, &quot;penult&quot; refers to the last word but one of a sentence.<br /> <br /> The terms are often used in reference to languages like [[Latin]] and [[Ancient Greek]], whose position of the [[pitch accent]] or [[stress (linguistics)|stress]] of a word falls only on one of the last three syllables, and sometimes in discussing [[poetry|poetic]] [[meter (poetry)|meter]].<br /> <br /> In certain languages, such as [[Welsh language|Welsh]]&lt;ref&gt;[https://wals.info/languoid/lect/wals_code_wel Welsh] {{webarchive|url=https://web.archive.org/web/20151208235237/http://wals.info/languoid/lect/wals_code_wel |date=2015-12-08}} in the World Atlas of Language Structures&lt;/ref&gt; and [[Polish language|Polish]], stress is always on the penult.&lt;ref&gt;[https://wals.info/chapter/14 Chapter 14: Fixed Stress Locations] {{webarchive|url=https://web.archive.org/web/20151207160058/http://wals.info/chapter/14 |date=2015-12-07}} in the World Atlas of Language Structures&lt;/ref&gt;<br /> <br /> ==See also==<br /> * [[Acute accent]]<br /> ** [[Oxytone]]<br /> ** [[Paroxytone]]<br /> ** [[Proparoxytone]]<br /> * [[Ultima (linguistics)]]<br /> * [[Stress (linguistics)]]<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> [[Category:Phonology]]<br /> [[Category:Greek grammar]]<br /> <br /> <br /> {{phonology-stub}}</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Nipissing,_Ontario&diff=1170908686 Nipissing, Ontario 2023-08-17T22:51:56Z <p>205.189.94.9: </p> <hr /> <div>{{See also|Nipissing (disambiguation)}}<br /> {{Use Canadian English|date=January 2023}}<br /> {{Infobox settlement<br /> | name = Nipissing<br /> | official_name = Township of Nipissing<br /> | settlement_type = [[List of township municipalities in Ontario|Township]] ([[List of 
municipalities in Ontario#Single-tier municipalities|single-tier]])<br /> | nickname = <br /> | motto = Life the Way it Should Be.<br /> | image_skyline = Nipissing Twp ON.JPG<br /> | imagesize = <br /> | image_caption = Nipissing Township Museum<br /> | image_flag = <br /> | flag_size = 120x100px<br /> | image_shield = <br /> | shield_size = 100x80px<br /> | image_map = <br /> | mapsize = <br /> | pushpin_map = Canada Southern Ontario<br /> | pushpin_label_position = <br /> | pushpin_map_alt = <br /> | pushpin_map_caption = Location of Nipissing Township in Ontario<br /> | coordinates = {{coord|46|03|N|79|33|W|region:CA-ON|display=inline,title}}<br /> | coor_pinpoint = <br /> | coordinates_footnotes = &lt;ref name=&quot;CGNDB&quot;&gt;{{cite cgndb|id= FELWX|title= Nipissing |accessdate=2013-03-04}}&lt;/ref&gt;<br /> | subdivision_type = Country<br /> | subdivision_name = Canada<br /> | subdivision_type1 = Province<br /> | subdivision_name1 = [[Ontario]]<br /> | subdivision_type2 = [[Census divisions of Ontario|District]]<br /> | subdivision_name2 = [[Parry Sound District, Ontario|Parry Sound]]<br /> | established_title = Settled<br /> | established_date = 1862<br /> | established_title2 = Incorporated<br /> | established_date2 = 1888<br /> | government_type = Township<br /> | leader_title = Mayor<br /> | leader_name = Tom Piper<br /> | leader_title1 = Federal riding<br /> | leader_name1 = [[Nipissing—Timiskaming]]<br /> | leader_title2 = Prov. riding<br /> | leader_name2 = [[Nipissing (provincial electoral district)|Nipissing]]<br /> | area_total_km2 = <br /> | area_land_km2 = 387.95<br /> | area_water_km2 =<br /> | area_footnotes = &lt;ref name=&quot;census2021&quot; /&gt;<br /> | population_as_of = 2021<br /> | population_footnotes = &lt;ref name=&quot;census2021&quot;&gt;{{cite web | url=https://www12.statcan.gc.ca/census-recensement/2021/as-sa/fogs-spg/Page.cfm?Lang=E&amp;Dguid=2021A00053549071&amp;r=1 | title=File not found &amp;#124; Fichier non trouvé }}&lt;/ref&gt;<br /> | population_total = 1769<br /> | population_density_km2 = 4.6<br /> | timezone = [[Eastern Time Zone|EST]]<br /> | utc_offset = -5<br /> | timezone_DST = [[Eastern Time Zone|EDT]]<br /> | utc_offset_DST = -4<br /> | postal_code_type = [[Canadian postal code|Postal Code]]<br /> | postal_code = P0H<br /> | area_code = [[Area code 705|705]], [[Area code 249|249]]<br /> | elevation_footnotes = <br /> | elevation_m = <br /> | website = {{URL|http://nipissingtownship.com/}}<br /> | footnotes =<br /> }}<br /> <br /> '''Nipissing''' is an [[Township (Canada)#Ontario|incorporated (political) township]] in [[Parry Sound District, Ontario|Parry Sound District]] in [[Central Ontario]], [[Canada]].&lt;ref name=&quot;CGNDB&quot; /&gt;&lt;ref name=&quot;OntHistoricMining&quot;&gt;{{cite web|url= http://www.geologyontario.mndmf.gov.on.ca/website/historic_claim_maps/P/Pringle.pdf |title= Pringle|work= Geology Ontario - Historic Claim Maps|publisher= [[Ministry of Northern Development, Mines and Forestry|Ontario Ministry of Northern Development, Mines and Forestry]]|access-date= 2013-03-04}}&lt;/ref&gt; It is on [[Lake Nipissing]] and is part of the [[Almaguin Highlands]] region. Nipissing was surveyed between 1874 and 1881, and was incorporated in 1888. Among the first settlers in the area were the Chapman and Beatty families. Nipissing Township annexed [[Gurd Township, Ontario|Gurd Township]] in 1970.
The township also contains a community named Nipissing, which is located on the South River near Chapman's Landing, on the South Bay of Lake Nipissing. The township administrative offices are located in Nipissing.<br /> <br /> The township includes the communities of Alsace, Christian Valley, Commanda, Hotham, Nipissing and Wade's Landing.<br /> <br /> ==History==<br /> The founder of Nipissing, John Beattie (John Beatty), arrived by canoe from [[Eganville, Ontario|Eganville]] in 1862. He was looking for land suitable for settlement. To lay claim to the property, he made brush piles, and was granted free land by the [[Government of Ontario]]. Around 1869 James Chapman and his wife, Phoebe Edwards, built their first house and barn at the top of the chutes that later took their name. The family farmed the area, and James carried the mail by canoe, dog team and later horse on a route stretching {{convert|200|mi|km}} between the villages of [[Magnetawan, Ontario|Magnetawan]] and [[Mattawa, Ontario|Mattawa]]. The Chapman Valley and Chapman Township near Magnetawan are named after the family. James and Phoebe are among the pioneers buried in the Nipissing village cemetery. The Chapman family donated the land to the municipal government, and the landing became a municipal boat launch, public dock and swimming hole for village children.<br /> <br /> Originally, supplies were brought into Nipissing from [[Pembroke, Ontario|Pembroke]] by canoe over the [[Champlain Trail]], as well as via the South River, and Nipissing village became the main route for shipping supplies. Around 1875 a colonization road was completed that connected tiny Nipissing village to [[Rosseau, Ontario|Rosseau]], near [[Parry Sound, Ontario|Parry Sound]] to the south, opening up road travel and providing another route for shipping supplies. An Ontario Historical Plaque was erected at the Nipissing Township Museum by the province to commemorate the Rosseau-Nipissing Road's role in Ontario's heritage.&lt;ref&gt;{{cite web |title=The Rosseau-Nipissing Road |url=http://www.ontarioplaques.com/Plaques/Plaque_Parry10.html |website=OntarioPlaques.com |publisher=Alan L. Brown |access-date=June 27, 2019}}&lt;/ref&gt; However, in 1886 the [[Northern and Pacific Junction Railway]] connected [[Gravenhurst, Ontario|Gravenhurst]] to [[Callander, Ontario|Callander]], cutting Nipissing village off from its main route, and the life of the village as a key port began to fade.<br /> <br /> ==Etymology==<br /> The township was named in 1879 after the lake, on whose south shore it is located. The community of Nipissing in the township, 25&amp;nbsp;km south of North Bay, was called ''Nipissingan'' in 1870, but its name was changed to Nipissing in 1881.&lt;ref&gt;{{cite book|last=Rayburn|first=Alan|title=Place Names of Ontario|year=1997|publisher=[[University of Toronto Press]]|location=Toronto|isbn=0-8020-7207-0|oclc= 36342881|page=243}}&lt;/ref&gt;<br /> <br /> ==Transportation==<br /> The township is served in its northern part by [[Ontario Highway 534]] and [[Ontario Highway 654]], which connect east to [[Ontario Highway 11]] at the communities of [[Powassan]] and [[Callander, Ontario|Callander]] respectively; Ontario Highway 534 also connects west to [[Restoule Provincial Park]], and via [[Ontario Highway 524]] to [[Ontario Highway 522]].
The township is served across its southern part by Ontario Highway 522, which connects east to Highway 11 at [[Trout Creek, Ontario|Trout Creek]] and west to [[Ontario Highway 69]] at the community of [[Cranberry, Ontario|Cranberry]].<br /> <br /> == Demographics ==<br /> In the [[2021 Canadian census|2021 Census of Population]] conducted by [[Statistics Canada]], Nipissing had a population of {{val|1769|fmt=commas}} living in {{val|746|fmt=commas}} of its {{val|1012|fmt=commas}} total private dwellings, a change of {{percentage|{{#expr:1769-1707}}|1707|1}} from its 2016 population of {{val|1707|fmt=commas}}. With a land area of {{convert|387.95|km2|sqmi|abbr=on}}, it had a population density of {{Pop density|1769|387.95|km2|sqmi|prec=1}} in 2021.&lt;ref name=2021census&gt;{{cite web | url=https://www150.statcan.gc.ca/t1/tbl1/en/tv.action?pid=9810000203&amp;geocode=A000235 | title=Population and dwelling counts: Canada, provinces and territories, census divisions and census subdivisions (municipalities), Ontario | publisher=[[Statistics Canada]] | date=February 9, 2022 | accessdate=April 2, 2022}}&lt;/ref&gt;<br /> {{Canada census<br /> |location = Nipissing<br /> |2021_population=1,769 | 2021_pop_delta=+3.6 | 2021_land_area=387.95 | 2021_pop_density=4.6<br /> |2021_median_age=54.8 | 2021_median_age_m=54.8 | 2021_median_age_f=54.4<br /> |2021_total_pvt_dwell=745 |2021_mean_hh_income= |2021_geocode=2021A00053549071 | 2021_access_date=2022-04-27<br /> |2016_population=1,707 | 2016_pop_delta=+0.2 | 2016_land_area=393.8 | 2016_pop_density=4.3<br /> |2016_median_age=52.0 | 2016_median_age_m=52.7 | 2016_median_age_f=51.2<br /> |2016_total_pvt_dwell=1,051 | 2016_mean_hh_income=70,229 | 2016_access_date=2019-06-27<br /> |2011_population=1,704 | 2011_pop_delta=+3.8 | 2011_land_area=393.6 | 2011_pop_density=4.3<br /> |2011_median_age=49.1 | 2011_median_age_m=49.4 | 2011_median_age_f=49.0<br /> |2011_total_pvt_dwell=993 | 2011_mean_hh_income= | 2011_access_date=2019-06-27<br /> }}<br /> <br /> ==See also==<br /> {{Portal|Ontario}}<br /> *[[List of townships in Ontario]]<br /> <br /> ==References==<br /> {{Reflist}}<br /> {{refbegin}}<br /> Other map sources:<br /> *{{cite map|url= http://www.mto.gov.on.ca/english/traveller/map/images/pdf/southont/sheets/Map8.pdf |format= PDF|title= Map 8|series= Official road map of Ontario|publisher= [[Ministry of Transportation of Ontario]]|scale= 1 : 700,000|date= 2012-01-01|access-date= 2013-03-04}}<br /> *{{cite map|url= http://www.mah.gov.on.ca/AssetFactory.aspx?did=6575 |title= Restructured municipalities - Ontario map #4|year= 2006|series= Restructuring Maps of Ontario|publisher= [[Ministry of Municipal Affairs and Housing (Ontario)|Ontario Ministry of Municipal Affairs and Housing]]|access-date= 2013-03-04}}<br /> *{{cite map|url= http://nipissingtownship.com/web/pdf/Sched_A_11x17.pdf |title= Official Plan - Schedule 'A'|scale= 1 : 40,000|date= 2005-10-28|publisher= Township of Nipissing|access-date= 2013-03-04}}<br /> {{refend}}<br /> <br /> ==External links==<br /> *{{Official website|https://nipissingtownship.com/}}<br /> <br /> {{Geographic location<br /> | Centre = Nipissing<br /> | North = ''[[Lake Nipissing]]''<br /> | Northeast = <br /> | East = [[Callander, Ontario|Callander]]&lt;br /&gt;[[Powassan, Ontario|Powassan]]<br /> | Southeast = [[Unorganized North East Parry Sound District|Unorganized North East Parry Sound]]<br /> | South = [[Machar, Ontario|Machar]]<br /> | Southwest = <br /> | West = [[Unorganized Centre Parry Sound District|Unorganized 
Centre Parry Sound]]<br /> | Northwest = <br /> | image =<br /> }}<br /> <br /> {{Parry Sound District}}<br /> <br /> [[Category:1862 establishments in Canada]]<br /> [[Category:Municipalities in Parry Sound District]]<br /> [[Category:Single-tier municipalities in Ontario]]<br /> [[Category:Township municipalities in Ontario]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Magnetawan&diff=1170908408 Magnetawan 2023-08-17T22:49:39Z <p>205.189.94.9: </p> <hr /> <div>{{for|the nearby First Nation reserve|Magnetawan 1, Ontario}}<br /> {{Use Canadian English|date=January 2023}}<br /> {{Infobox settlement<br /> | name = Magnetawan<br /> | official_name = Municipality of Magnetawan<br /> | settlement_type = [[List of municipalities in Ontario|Municipality]] ([[List of municipalities in Ontario#Single-tier municipalities|single-tier]])<br /> | nickname = <br /> | motto = <br /> | image_skyline = Magnetawan ON 2.JPG<br /> | image_caption = Magnetawan on the Magnetawan River<br /> | image_flag = <br /> | flag_size = 120x100px<br /> | image_shield = <br /> | shield_size = 100x80px<br /> | image_map = <br /> | mapsize = <br /> | pushpin_map = Canada Southern Ontario<br /> | pushpin_mapsize = 200<br /> | pushpin_label_position=<br /> | coordinates = {{coord|45|40|N|79|38|W|region:CA-ON|display=inline,title}}<br /> | subdivision_type = Country<br /> | subdivision_name = Canada<br /> | subdivision_type1 = Province<br /> | subdivision_name1 = [[Ontario]]<br /> | subdivision_type2 = District<br /> | subdivision_name2 = [[Parry Sound District|Parry Sound]]<br /> | established_title = Settled<br /> | established_date = 1870s<br /> | established_title2 = Incorporated<br /> | established_date2 = January 1, 1998<br /> | government_type = Township<br /> | leader_title = Mayor<br /> | leader_name = Sam Dunnett<br /> | leader_title1 = Federal riding<br /> | leader_name1 = [[Parry Sound—Muskoka]]<br /> | leader_title2 = Prov. 
riding<br /> | leader_name2 = [[Parry Sound—Muskoka (provincial electoral district)|Parry Sound—Muskoka]]<br /> | area_total_km2 = <br /> | area_land_km2 = 526.31<br /> | area_water_km2 =<br /> | area_footnotes = {{citation needed|date=February 2022}}<br /> | population_as_of = 2021<br /> | population_footnotes = &lt;ref name=&quot;census2021&quot;&gt;{{cite web |title=Census Profile, 2021 Census Magnetawan, Municipality |url=https://www12.statcan.gc.ca/census-recensement/2021/as-sa/fogs-spg/Page.cfm?Lang=E&amp;Dguid=2021A00053549043&amp;r=1 |publisher=Statistics Canada |access-date=February 9, 2022}}&lt;/ref&gt;<br /> | population_total = 1753<br /> | population_density_km2 = 3.3<br /> | timezone = [[Eastern Time Zone|EST]]<br /> | utc_offset = -5<br /> | timezone_DST = [[Eastern Time Zone|EDT]]<br /> | utc_offset_DST = -4<br /> | postal_code_type = [[Canadian postal code|Postal Code]]<br /> | postal_code = P0A<br /> | area_code = [[Area code 705|705]]<br /> | elevation_footnotes = <br /> | elevation_m = <br /> | website = {{URL|http://www.magnetawan.com/}}<br /> | footnotes =<br /> }}<br /> [[File:Magnetawan ON 1.JPG|right|thumb|250px|Municipal office and library]]<br /> <br /> '''Magnetawan''' is a [[township (Canada)|township]] in the [[Almaguin Highlands]] region of the [[Parry Sound District]] in the [[Canada|Canadian]] province of [[Ontario]], as well as the name of the primary population centre in the township.<br /> <br /> The Township of Magnetawan was formed in 1998 through the amalgamation of the Township of Chapman and the Village of Magnetawan, along with the unincorporated geographic townships of Croft and Spence.<br /> <br /> The word Magnetawan in the [[Algonquin language]] means &quot;swiftly flowing river.&quot;&lt;ref name=ebischof6&gt;{{cite web |title=Magnetawan Ontario |url=https://almaguinhighlands.com/almaguin/communities/magnetawan.html |publisher=Almaguin Highlands Ontario Inc. |access-date=June 27, 2019}}&lt;/ref&gt;<br /> <br /> [[Barbara Hanley]], the first woman ever elected mayor of a community in Canada, was born in Magnetawan in 1882.<br /> <br /> Magnetawan is the setting for ''The Rogue Hunter'', the tenth book in the popular [[urban fantasy]] Argeneau series by Ontario-born author [[Lynsay Sands]].<br /> <br /> ==Communities==<br /> The township comprises the communities of [[Ahmic Harbour]], Ahmic Lake, Cecebe, Cedar Croft, Chikopi, [[Dufferin Bridge]], Magnetawan, North Seguin, Oranmore, Pearceley, Port Anson and Port Carmen, as well as the [[ghost town]] of [[Spence, Ontario|Spence]].<br /> <br /> The community is twinned with the city of [[Baltimore, Maryland]].{{citation needed|date=October 2022}}<br /> <br /> ==History==<br /> The first people to inhabit the region were the [[Hurons]], [[Ojibway]] and Algonquins, who would visit the area in the summer for hunting and fishing but sheltered on [[Georgian Bay]] in the winter.
While some Europeans explored the region in the early 19th century, settlement and colonization by Europeans were barely taking place, so much so that the government considered turning the entire region into an [[Indian reserve]].&lt;ref name=MoM8&gt;{{cite web |url=http://www.magnetawan.com/index.php?option=com_content&amp;view=article&amp;id=8:aborginals&amp;catid=1:history&amp;Itemid=13 |title={{sic|Aborgi|nals|nolink=y}} Hurons, Ojibway and Algonquins |publisher=Municipality of Magnetawan |access-date=2010-08-24 |url-status=dead |archive-url=https://web.archive.org/web/20110714020805if_/http://www.magnetawan.com/index.php?option=com_content&amp;view=article&amp;id=8:aborginals&amp;catid=1:history&amp;Itemid=13 |archive-date=2011-07-14 }}&lt;/ref&gt;<br /> <br /> But when [[pine]] stands in [[southern Ontario]] became depleted, the area attracted loggers, and the government changed its mind and encouraged settlement through free land grants, first offered in 1853. Settlement happened slowly but accelerated when the [[List of Ontario Historic Colonization Roads|colonization road]] from [[Rosseau, Ontario|Rosseau]] to [[Nipissing, Ontario|Nipissing]] began being built in 1866.&lt;ref name=MoM8/&gt; In 1868, the government passed the Free Grant Land and Homestead Act and began advertising it extensively in European countries to attract new immigrants.&lt;ref name=MoM9&gt;{{cite web |url=http://www.magnetawan.com/index.php?option=com_content&amp;view=article&amp;id=9:land-grants&amp;catid=1:history&amp;Itemid=13 |title=The Land Grants |publisher=Municipality of Magnetawan |access-date=2010-08-24}}&lt;/ref&gt; Croft Township was surveyed in 1869, Chapman Township in 1870, and the village of Magnetawan was mapped out in 1873.&lt;ref&gt;[http://ontarioplaques.com/Plaques/Plaque_Parry02.html Elise von Koerber], a native of [[Baden-Baden]], was appointed immigration agent by the federal government of Canada in 1872. She brought several hundred German-speaking Swiss citizens to Magnetawan. French-speaking persons mostly went to [[Doe Lake (Parry Sound District)|Doe Lake]]. In 1881, about 200 Swiss remained in this region.&lt;/ref&gt;<br /> <br /> The [[Great North Road (Ontario)|Great North Road]], from [[Parry Sound]] to Nipissing, reached the Magnetawan at [[Ahmic Harbour]] in 1870.&lt;ref name=AstridTaimAlmaguin/&gt;<br /> <br /> The {{convert|76|mi|km}} stretch of the colonization road from Rosseau to Nipissing reached the Magnetawan in 1874, where the historic village of Magnetawan was built.&lt;ref name=AstridTaimAlmaguin/&gt; In 1879 a pair of small steamships started carrying cargo and passengers over the {{convert|20|mi|km}} reach from the rapids at the village upstream to [[Burk's Falls]].<br /> <br /> Burk's Falls was linked to the south by a railroad in 1885.&lt;ref name=AstridTaimAlmaguin/&gt; In 1886 a lock was completed, enabling navigation west of the rapids at the historic village of Magnetawan. Following its completion, larger steamships started carrying cargo and passengers, and towing log booms, over the reach from Burk's Falls to Ahmic Harbour. During its first 25 years of operation, the lock-keeper recorded that steamships transited the lock 17,590 times.
The last steamship, a tugboat used to tow log-booms, used the river in 1934.<br /> <br /> ==Geography ==<br /> Core rock samples done by Walfried Schwerdtner in the surrounding area, show mostly foliated Grenville Gneiss.&lt;ref name=ebischof8&gt;{{cite journal | last1 = Schwerdtner | first1 = W | year = 2008 | title = Structure of Ahmic domain and its vicinity, southwestern [[Central Gneiss Belt]], [[Grenville]] Province of Ontario (Canada) | journal = Precambrian Research | volume = 167 | issue = 1–2| pages = 16–34 | doi=10.1016/j.precamres.2008.07.002| bibcode = 2008PreR..167...16S }}&lt;/ref&gt;<br /> <br /> == Demographics ==<br /> In the [[2021 Canadian census|2021 Census of Population]] conducted by [[Statistics Canada]], Magnetawan had a population of {{val|1753|fmt=commas}} living in {{val|825|fmt=commas}} of its {{val|1717|fmt=commas}} total private dwellings, a change of {{percentage|{{#expr:1753-1390}}|1390|1}} from its 2016 population of {{val|1390|fmt=commas}}. With a land area of {{convert|526.31|km2|sqmi|abbr=on}}, it had a population density of {{Pop density|1753|526.31|km2|sqmi|prec=1}} in 2021.&lt;ref name=2021census&gt;{{cite web | url=https://www150.statcan.gc.ca/t1/tbl1/en/tv.action?pid=9810000203&amp;geocode=A000235 | title=Population and dwelling counts: Canada, [[provinces]] and territories, [[census]] divisions and census subdivisions (municipalities), Ontario | publisher=[[Statistics Canada]] | date=February 9, 2022 | accessdate=March 31, 2022}}&lt;/ref&gt;<br /> {{Canada census<br /> |location = Magnetawan<br /> |2021_population=1,753 | 2021_pop_delta=+26.1 | 2021_land_area=526.31 | 2021_pop_density=3.3<br /> |2021_median_age=59.2 | 2021_median_age_m=59.2 | 2021_median_age_f=59.2<br /> |2021_total_pvt_dwell=825 |2021_mean_hh_income= |2021_geocode=2021A00053549043 | 2021_access_date=2022-04-27<br /> |2016_population=1,390 | 2016_pop_delta=-4.4 | 2016_land_area=531.53 | 2016_pop_density=2.6<br /> |2016_median_age=57.4 | 2016_median_age_m=56.6 | 2016_median_age_f=58.1<br /> |2016_total_pvt_dwell=1,698 | 2016_mean_hh_income=54,336 | 2016_access_date=2018-02-18<br /> |2011_population=1,454 | 2011_pop_delta=-9.7 | 2011_land_area=531.83 | 2011_pop_density=2.7<br /> |2011_median_age=54.3 | 2011_median_age_m=54.6 | 2011_median_age_f=54.1<br /> |2011_total_pvt_dwell=1,782 | 2011_mean_hh_income= | 2011_access_date=2014-02-25<br /> |2006_population=1,610 | 2006_pop_delta=+20.0 | 2006_land_area=523.07 | 2006_pop_density=3.1<br /> |2006_median_age=51.3 | 2006_median_age_m=51.6 | 2006_median_age_f=50.9<br /> |2006_total_pvt_dwell=1,901 | 2006_mean_hh_income=43,551 | 2006_access_date=2014-02-25<br /> |2001_population=1,342 | 2001_pop_delta=+1.4 | 2001_land_area=523.07 | 2001_pop_density=2.6<br /> |2001_median_age=48.5 | 2001_median_age_m=49.2 | 2001_median_age_f=47.9<br /> |2001_total_pvt_dwell=1,837 | 2001_mean_hh_income=35,017 | 2001_access_date=2014-02-25<br /> }}<br /> {{Historical populations<br /> |title = Magnetawan historical populations<br /> |type = Canada<br /> |align = centre<br /> |width = <br /> |state = <br /> |shading = <br /> |percentages = <br /> |footnote =&lt;ref&gt;Statistics Canada: [[Canada 1996 Census|1996]], [[Canada 2001 Census|2001]], [[Canada 2006 Census|2006]], [[Canada 2011 Census|2011]], [[Canada 2016 Census|2016]], [[Canada 2021 Census|2021]]&lt;/ref&gt;<br /> |[[Canada 1996 Census|1996]]| 1324<br /> |[[Canada 2001 Census|2001]]| 1342<br /> |[[Canada 2006 Census|2006]]| 1610<br /> |[[Canada 2011 Census|2011]]| 1454<br /> |[[Canada 2016 
Census|2016]]| 1390<br /> |[[Canada 2021 Census|2021]]| 1753<br /> }}<br /> <br /> Prior to amalgamation (1998):<br /> * Population total in 1996: 1,324<br /> ** Magnetawan (village): 241<br /> ** Chapman (township): 645<br /> * Population in 1991:<br /> ** Magnetawan (village): 267<br /> ** Chapman (township): 605<br /> <br /> Mother tongue:&lt;ref&gt;{{cite web|url=http://www12.statcan.gc.ca/census-recensement/2006/dp-pd/prof/92-591/details/page.cfm?Lang=E&amp;Geo1=CSD&amp;Code1=3549043&amp;Geo2=PR&amp;Code2=35&amp;Data=Count&amp;SearchText=magnetawan&amp;SearchType=Begins&amp;SearchPR=01&amp;B1=All&amp;Custom=|title=2006 Magnetawan community profile|date=13 March 2007 }}&lt;/ref&gt;<br /> * English as first language: 87.2%<br /> * French as first language: 2.5%<br /> * English and French as first language: 0%<br /> * Other as first language: 10.3%<br /> <br /> ==Local lakes and rivers==<br /> [[File:Ahmic Harbour ON.jpg|right|thumb|230px|Ahmic Harbour]]<br /> * [[Old Man's Lake]]<br /> * [[Ahmic Lake]] – filled with these species of fish: Smallmouth Bass, Largemouth Bass, Pickerel (Walleye), stocked yearly,&lt;ref name=ebischof4&gt;{{cite journal | last1 = Fox | first1 = M. G. | year = 1993 | title = A comparison of zygote survival of native and non-native walleye stocks in two Georgian Bay rivers | doi = 10.1007/BF00007532 | journal = Environmental Biology of Fishes | volume = 38 | issue = 4| pages = 379–383 | s2cid = 21652235 }}&lt;/ref&gt; Crappie, Sunfish, Rock Bass, Whitefish, Catfish, Perch, and Northern Pike.<br /> [[File:Ahmic Lake.jpg|thumbnail|Calm waters by Ahmic Harbor]]<br /> * [[Lake Cecebe]]<br /> * [[Magnetawan River]]<br /> * Beaver Lake<br /> * Horn Lake<br /> <br /> ==Attractions==<br /> Magnetawan is a historic village whose surrounding municipality offers a variety of attractions, from the museum to the picturesque waters. The downtown formerly consisted of a restaurant named The Magnetawan Inn, also known as June's Inn, as well as a small hotel/bar and a General Store. On July 30, 2011, the General Store burned down, taking part of the Magnetawan Inn with it.&lt;ref&gt;{{cite news |last1=Learn |first1=Rob |title=Magnetawan mourns fiery loss |url=https://www.muskokaregion.com/news-story/3568530-magnetawan-mourns-fiery-loss/ |access-date=June 27, 2019 |work=Huntsville Forester |publisher=Metroland Media Group |date=August 1, 2011}}&lt;/ref&gt;<br /> [[File:Downtown General Store.jpg|thumbnail|General Store in town until 2011]]<br /> <br /> The downtown now has a new general store and restaurant, built between 2012 and 2013, as well as a museum, Lions Pavilion Park, a farmers' market, small shops, the locks and dams, and an LCBO store. The village also has a school, churches, a golf course (Ahmic Lake Golf Club), a post office, a library, the municipal offices, and the municipal pavilion.&lt;ref name=&quot;ebischof3&quot;&gt;{{cite web |url=http://www.magnetawanarea.com |publisher=Magnetawan and Area Businesses |title=Let's get started |access-date=June 27, 2019}}&lt;/ref&gt; Magnetawan is also home to many resorts and rentable cottages. 
Two of the biggest resorts are Woodland Echoes and Ahmic Lake Resort, where the Swiss Country House Restaurant is located.<br /> <br /> Following is a list of unique characteristics about this town.&lt;ref&gt;{{cite web |title=Local Attractions |url=http://magnetawan.com/local-attractions/ |publisher=Municipality of Magnetawan |access-date=June 27, 2019}}&lt;/ref&gt;<br /> <br /> *Echo Rock on Lake Cecebe<br /> *The shipwreck of the steamboat called the Wenoah in Lake Cecebe<br /> *The Trans-Canada Trail<br /> *Hand Operated Dam and Locks<br /> *Knoepfli and Fagans Falls<br /> *The Lighthouse on the Magnetawan River<br /> *Echo Beach Cottage Resort&lt;ref&gt;{{cite web |title=Echo Beach Cottage Resort |url=http://echobeachcottageresort.ca/ |access-date=June 27, 2019}}&lt;/ref&gt;<br /> *Camp Kahquah &lt;ref name=ebischof7&gt;{{cite web |title=Camp Kahquah|access-date=February 24, 2014 |url=http://www.campkahquah.com/}}&lt;/ref&gt;<br /> *Golfing<br /> *Multipurpose Pavilion<br /> *Horseback riding<br /> *Fishing and boating in the local lakes<br /> *Hunting<br /> *Snowmobiling and dog sledding<br /> <br /> ==See also==<br /> *[[List of townships in Ontario]]<br /> <br /> ==References==<br /> {{reflist|refs=<br /> &lt;ref name=AstridTaimAlmaguin&gt;<br /> {{cite news <br /> | url = https://books.google.com/books?id=l2a6CwAAQBAJ&amp;q=Astrid+Taim<br /> | title = Astrid Taim's Almaguin Chronicles 2-Book Bundle: Almaguin / Almaguin Chronicles<br /> | author = Astrid Taim<br /> | publisher = [[Dundurn Press]]<br /> | year = 2016<br /> | isbn = 9781459737006<br /> | access-date = 2018-09-08<br /> | url-status = live <br /> }}<br /> &lt;/ref&gt;<br /> }}<br /> <br /> ==External links==<br /> {{Commons category|Magnetawan, Ontario}}<br /> {{Wikivoyage}}<br /> *{{official|http://www.magnetawan.com}}<br /> <br /> {{Geographic location<br /> | Centre = Magnetawan<br /> | North = [[Unorganized Centre Parry Sound District|Unorganized Centre Parry Sound]]<br /> | Northeast = [[Machar, Ontario|Machar]]<br /> | East = [[Strong, Ontario|Strong]]<br /> | Southeast = [[Ryerson, Ontario|Ryerson]]<br /> | South = [[McMurrich/Monteith]]<br /> | Southwest = [[Seguin, Ontario|Seguin]]<br /> | West = [[McKellar, Ontario|McKellar]]<br /> | Northwest = [[Whitestone, Ontario|Whitestone]]<br /> }}<br /> {{Parry Sound District}}<br /> <br /> [[Category:Designated places in Ontario]]<br /> [[Category:Municipalities in Parry Sound District]]<br /> [[Category:Single-tier municipalities in Ontario]]<br /> [[Category:Magnetawan River]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Darling_River&diff=1170908040 Darling River 2023-08-17T22:46:47Z <p>205.189.94.9: </p> <hr /> <div>{{distinguish|Great Darling Anabranch}}<br /> {{Use dmy dates|date=March 2020}}<br /> {{Use Australian English|date=June 2011}}<br /> {{short description|Major river in Australia}}<br /> {{Infobox river<br /> | name = Darling River<br /> | native_name ={{native name|drl|Barka}}<br /> | name_other = <br /> | name_etymology = <br /> &lt;!---------------------- IMAGE &amp; MAP --&gt;<br /> | image = Aerial view of the Darling River.jpg<br /> | image_size = 320<br /> | image_caption = Aerial view of the Darling River near [[Menindee]]<br /> | map = Murray-catchment-map MJC02.png<br /> | map_size = <br /> | map_caption = The Darling is a major tributary of the Murray-Darling system<br /> | pushpin_map = <br /> | pushpin_map_size = <br /> | pushpin_map_caption= <br /> &lt;!---------------------- LOCATION --&gt;<br /> | subdivision_type1 = 
Country<br /> | subdivision_name1 = [[Australia]]<br /> | subdivision_type2 = State<br /> | subdivision_name2 = [[New South Wales]]<br /> | subdivision_type3 = <br /> | subdivision_name3 = <br /> | subdivision_type4 = <br /> | subdivision_name4 = <br /> | subdivision_type5 = Cities<br /> | subdivision_name5 = [[Bourke, New South Wales|Bourke]], [[Wilcannia, New South Wales|Wilcannia]], [[Menindee, New South Wales|Menindee]], [[Wentworth, New South Wales|Wentworth]]<br /> &lt;!---------------------- PHYSICAL CHARACTERISTICS --&gt;<br /> | length = {{convert|1472|km|mi|abbr=on}} <br /> | width_min = <br /> | width_avg = <br /> | width_max = <br /> | depth_min = <br /> | depth_avg = <br /> | depth_max = <br /> | discharge1_location= <br /> | discharge1_min = <br /> | discharge1_avg = {{convert|100|m3/s|cuft/s|abbr=on}} approx.<br /> | discharge1_max = <br /> &lt;!---------------------- BASIN FEATURES --&gt;<br /> | source1 = confluence of [[Barwon River (New South Wales)|Barwon]] and [[Culgoa River|Culgoa]] Rivers<br /> | source1_location = near [[Brewarrina]], [[New South Wales|NSW]]<br /> | source1_coordinates= {{coord|29|57|31|S|146|18|28|E|display=inline}}<br /> | source1_elevation = {{convert|119|m|abbr=on}}<br /> | mouth = confluence with [[Murray River]]<br /> | mouth_location = [[Wentworth, New South Wales|Wentworth]], [[New South Wales|NSW]]<br /> | mouth_coordinates = {{coord|34|6|47|S|141|54|43|E|display=inline,title}}<br /> | mouth_elevation = {{convert|35|m|abbr=on}}<br /> | progression = <br /> | river_system = [[Murray River]], [[Murray-Darling basin]]<br /> | basin_size = {{convert|609283|km2|abbr=on}}<br /> | tributaries_left = [[Barwon River (New South Wales)|Barwon River]], [[Bogan River|Little Bogan River]]<br /> | tributaries_right = [[Culgoa River]], [[Warrego River]], [[Paroo River]]<br /> | custom_label = <br /> | custom_data = <br /> | extra = <br /> }}<br /> The '''Darling River''' ([[Paakantyi (Darling language)|Paakantyi]]: ''Baaka'' or ''Barka'') is the third-longest river in [[Australia]], measuring {{convert|1472|km|mi|0}} from its source in northern [[New South Wales]] to its confluence with the [[Murray River]] at [[Wentworth, New South Wales|Wentworth]], New South Wales. 
Including its longest contiguous tributaries it is {{convert|2844|km|mi|0|abbr=on}} long, making it the longest river system in Australia.&lt;ref&gt;{{cite web|url=http://www.ga.gov.au/education/geoscience-basics/landforms/longest-rivers.html|title=(Australia's) Longest Rivers|publisher=[[Geoscience Australia]]|date=16 October 2008|access-date=2009-02-16}}&lt;/ref&gt; The Darling River is the [[outback]]'s most famous waterway.&lt;ref name=&quot;drthr&quot;&gt;{{cite news |url=http://www.couriermail.com.au/travel/australia/an-historic-route-darling/story-e6freqxf-1111118648230 |title=Darling River townships offer historic route |author=Sally Macmillan |date=24 January 2009 |newspaper=[[The Courier-Mail]] |publisher=Queensland Newspapers |access-date=30 October 2010 |archive-date=12 June 2012 |archive-url=https://web.archive.org/web/20120612033509/http://www.couriermail.com.au/travel/australia/an-historic-route-darling/story-e6freqxf-1111118648230 |url-status=dead }}&lt;/ref&gt;<br /> <br /> The Darling is in poor [[ecological health|health]],&lt;ref&gt;{{cite web |title=Challenges facing the Murray–Darling Basin |url=https://www.mdba.gov.au/issues-murray-darling-basin |website=Murray-Darling Basin Authority |date=24 September 2020 |access-date=8 September 2021}}&lt;/ref&gt; suffering from over-allocation of its waters to [[irrigation]],&lt;ref&gt;{{cite news |last1=DAVIES |first1=Anne |title=NSW exceeds Barwon-Darling water allocations in first year of compliance after regime overhaul |url=https://www.theguardian.com/australia-news/2021/aug/03/nsw-exceeds-barwon-darling-water-allocations-in-first-year-of-compliance-after-regime-overhaul |access-date=8 September 2021 |work=The Guardian |date=3 August 2021}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last1=McCORMICK |first1=Bill |title=Murray-Darling Basin water issues |url=https://www.aph.gov.au/About_Parliament/Parliamentary_Departments/Parliamentary_Library/pubs/BriefingBook43p/murraydarlingissues |website=Parliamentary Library |publisher=Commonwealth of Australia |access-date=8 September 2021}}&lt;/ref&gt; [[pollution]] from [[pesticide]] runoff,&lt;ref&gt;{{cite web |title=Two thirds of farmland at risk of pesticide pollution |url=https://www.sydney.edu.au/news-opinion/news/2021/03/30/two-thirds-of-farmland-at-risk-of-pesticide-pollution.html |website=University of Sydney |access-date=8 September 2021 |date=30 March 2021}}&lt;/ref&gt;&lt;ref&gt;{{cite news |last1=Nearmy |first1=Tracey |title=Thirst turns to anger as Australia's mighty river runs dry |url=https://www.reuters.com/article/us-australia-drought-widerimage/thirst-turns-to-anger-as-australias-mighty-river-runs-dry-idUSKBN1X22TT |access-date=15 April 2020 |work=Reuters |date=24 October 2019}}&lt;/ref&gt; and prolonged [[drought in Australia|drought]]. During drought periods in 2019 it barely flowed at all. The river has a high salt content and declining [[water quality]]. 
Increased rainfall in its catchment in 2010 improved its flow, but the health of the river will depend on long-term management.&lt;ref&gt;{{cite web|url=https://www.mercurynews.com/2019/10/23/anger-grows-in-australia-as-the-darling-river-dries-up/|website=mercurynews.com|title=Anger grows in Australia as the Darling River dries up|date=23 October 2019}}&lt;/ref&gt;<br /> <br /> The [[Division of Darling]], [[Division of Riverina-Darling]], [[Electoral district of Darling]] and [[Electoral district of Lachlan and Lower Darling]] were named after the river.<br /> <br /> ==History==<br /> [[File:William Piguenit - The Flood in the Darling 1890.jpg|thumb|left|230px|''The flood in the Darling'', 1890, oil on canvas by [[William Charles Piguenit]]]]<br /> Aboriginal peoples have lived along the Darling River for tens of thousands of years. The [[Barkindji]] people called it ''Baaka''&lt;ref&gt;{{cite web |url=https://www.abc.net.au/news/2020-05-13/families-set-up-tent-town-on-darling-river-to-avoid-covid-19/12237976 |date=13 May 2020 |access-date=9 July 2020 |title=Indigenous community sets up camp on Darling River to avoid coronavirus risk in overcrowded homes |first=Aimee |last=Volkofsky |work=[[ABC News and Current Affairs|ABC News]] |publisher=[[Australian Broadcasting Corporation]] |quote=The Darling River, known locally as the Baaka, is central to Barkindji culture }}&lt;/ref&gt; or ''Barka'', &quot;Barkindji&quot; meaning &quot;people of the Barka&quot;.<br /> <br /> The Queensland headwaters of the Darling (the area now known as the [[Darling Downs]]) were gradually colonized from 1815 onward. In 1828 the explorers [[Charles Sturt]] and [[Hamilton Hume]] were sent by the Governor of New South Wales, [[Ralph Darling|Sir Ralph Darling]], to investigate the course of the [[Macquarie River]]. He visited the Bogan River and then, early in 1829, the upper Darling, which he named after the Governor. In 1835, Major [[Thomas Mitchell (explorer)|Thomas Mitchell]] travelled a {{convert|483|km|adj=on}} portion of the Darling River.&lt;ref name=&quot;mstl&quot;&gt;{{cite book |url=http://www.adb.online.anu.edu.au/biogs/A020206b.htm |title=Mitchell, Sir Thomas Livingstone (1792–1855) |author=Baker, D. W. A. 
|date=1967 |work=[[Australian Dictionary of Biography]] |publisher=Melbourne University Publishing |access-date=17 March 2011 }}&lt;/ref&gt; Although his party never reached the junction with the Murray River he correctly assumed the rivers joined.<br /> <br /> In 1856, the [[Blandowski Expedition]] set off for the junction of the Darling and Murray Rivers to discover and collect fish species for the National Museum.&lt;ref name=&quot;adbwb&quot;&gt;{{cite book |url=http://adbonline.anu.edu.au/biogs/A030174b.htm |title=Blandowski, William (1822–1878) |work=[[Australian Dictionary of Biography]] |publisher=[[Australian National University]] |access-date=31 January 2011 }}&lt;/ref&gt; The expedition was a success with 17,400 specimens arriving in Adelaide the next year.<br /> <br /> Although its flow is extraordinarily irregular (the river dried up forty-five times between 1885 and 1960), in the later 19th century the Darling became a major transportation route, the [[pastoral farming|pastoralist]]s of western New South Wales using it to send their wool by shallow-draft [[paddle steamer]] from busy river ports such as [[Bourke, New South Wales|Bourke]] and [[Wilcannia, New South Wales|Wilcannia]] to the South Australian railheads at [[Morgan, South Australia|Morgan]] and [[Murray Bridge, South Australia|Murray Bridge]]. But over the past century the river's importance as a transportation route has declined.<br /> <br /> In 1992, the Darling River suffered from severe [[Cylindrospermopsin|cyanobacterial bloom]] that stretched the length of the river.&lt;ref name=&quot;ab&quot;&gt;{{Cite news |url=http://www.clw.csiro.au/issues/water/rivers_estuaries/algal.html |title=Algal Blooms |access-date=15 March 2011 |date=28 January 2011 |publisher=CSIRO Land and Water |url-status=dead |archive-url=https://web.archive.org/web/20110402183918/http://www.clw.csiro.au/issues/water/rivers_estuaries/algal.html |archive-date=2 April 2011 |df=dmy }}&lt;/ref&gt; The presence of phosphorus was essential for the toxic algae to flourish. Flow rates, turbulence, turbidity and temperature were other contributing factors.<br /> <br /> In 2008, the Federal government purchased [[Toorale Station]] in northern New South Wales for $23&amp;nbsp;million. The purchase allowed the government to return {{convert|11|GL|e9impgal+e9USgal|spell=in|lk=on}} of [[environmental flow]]s back into the Darling.&lt;ref name=&quot;wsd&quot;&gt;{{Cite news |url=http://www.theaustralian.com.au/news/nation/wong-slaps-down-critics-of-23bn-water-purchase/story-e6frg6nf-1225817491470 |title=Wong slaps down critics of $23m Darling River water purchase |author=Franklin, Matthew |access-date=30 October 2010 |date=9 January 2010 |newspaper=[[The Australian]] |publisher=News Limited }}&lt;/ref&gt;<br /> <br /> In 2019, a crisis on the Lower Darling saw up to 1 million fish die. A report by the [[Australia Institute]] said this was largely due to the decisions by the Murray-Darling Basin Authority on instructions from the New South Wales government. It said the reasons for those decisions appeared to be about building the case for the new [[Broken Hill]] pipeline and the [[Menindee Lakes]] project. 
Maryanne Slattery, senior water researcher with the Australia Institute, said: &quot;To blame the fish kill on the drought is a cop-out; it is because water releases were made from the lakes when this simply shouldn't have happened.&quot;&lt;ref&gt;{{cite web |title=New South Wales government largely culpable for fish kill, report finds |date=2019-01-18 |website=[[The Guardian]] |archive-url=https://web.archive.org/web/20230328085125/https://www.theguardian.com/australia-news/2019/jan/19/murray-darling-basin-authority-and-nsw-largely-culpable-for-fish-kill-report-finds |archive-date=2023-03-28 |url-status=live |url=https://www.theguardian.com/australia-news/2019/jan/19/murray-darling-basin-authority-and-nsw-largely-culpable-for-fish-kill-report-finds}}&lt;/ref&gt;<br /> <br /> ==Course==<br /> The whole [[Murray–Darling basin|Murray–Darling river system]], one of the largest in the world, drains all of New South Wales west of the [[Great Dividing Range]], much of northern [[Victoria (Australia)|Victoria]] and southern Queensland and parts of [[South Australia]]. Its meandering course is three times longer than the direct distance it traverses.&lt;ref name=&quot;swr&quot;&gt;{{cite web|url=http://www2.mdbc.gov.au/nrm/water_issues/surface_water.html |title=Surface Water Resources |date=29 October 2006 |publisher=Murray Darling Basin Commission |access-date=31 January 2011 |url-status=dead |archive-url=https://web.archive.org/web/20110219013719/http://www2.mdbc.gov.au/nrm/water_issues/surface_water.html |archive-date=19 February 2011 |df=dmy }}&lt;/ref&gt;<br /> <br /> Much of the land that the Darling flows through consists of plains and is therefore relatively flat, having an average gradient of just 16&amp;nbsp;mm per kilometre.&lt;ref name=&quot;tdr&quot;/&gt; Officially the Darling begins between [[Brewarrina]] and [[Bourke, New South Wales|Bourke]] at the [[confluence]] of the [[Culgoa River|Culgoa]] and [[Barwon River (New South Wales)|Barwon]] rivers; streams whose tributaries rise in the ranges of southern [[Queensland]] and northern [[New South Wales]] west of the [[Great Dividing Range]]. These tributaries include the [[Balonne River]] (of which the Culgoa is one of three main branches) and its tributaries; the Condamine [which rises in the Main Range about 100&amp;nbsp;km inland from Pt. Danger, on the Queensland/New South Wales border], the [[Macintyre River]] and its tributaries such as the [[Dumaresq River]] and the [[Severn River (New South Wales)|Severn River]]s (there are two – one either side of the state border); the [[Gwydir River]]; the [[Namoi River]]; the [[Castlereagh River]]; and the [[Macquarie River]]. Other rivers join the Darling near Bourke or below – the [[Bogan River]], the [[Warrego River]] and [[Paroo River]].<br /> [[Image:Darling River Louth.JPG|thumb|Darling River at [[Louth, New South Wales|Louth]]]]<br /> South east of [[Broken Hill, New South Wales|Broken Hill]], the [[Menindee Lakes]] are a series of lakes that were once connected to the Darling River by short creeks.&lt;ref name=&quot;dtdml&quot;&gt;{{cite web |url=http://www.discoveringthedarling.com.au/index.php?pgid=62 |title=Menindee Lakes |work=Discovering the Darling |publisher=Murray Darling Environmental Foundation |access-date=16 January 2012 |archive-date=3 April 2011 |archive-url=https://web.archive.org/web/20110403030121/http://www.discoveringthedarling.com.au/index.php?pgid=62 |url-status=dead }}&lt;/ref&gt; The Menindee Lake Scheme has reduced the frequency of flooding in the Menindee Lakes. 
As a result, about 13,800 hectares of [[Muehlenbeckia florulenta|lignum]] and 8,700 hectares of [[Eucalyptus largiflorens|Black box]] have been destroyed.&lt;ref name=&quot;dtdml&quot;/&gt; Weirs and constant low flows have fragmented the river system and blocked fish passage.<br /> <br /> The Darling River runs south-south-west, leaving the [[Far West (New South Wales)|Far West]] region of New South Wales, to join the [[Murray River]] on the New South Wales – Victoria border at [[Wentworth, New South Wales]].<br /> <br /> The [[Barrier Highway]] at Wilcannia, the [[Silver City Highway]] at Wentworth and the [[Broken Hill railway line]] at Menindee, all cross the Darling River. Part of the river north of Menindee marks the border of [[Kinchega National Park]]. In response to the [[1956 Murray River flood]], a weir was constructed at [[Menindee, New South Wales|Menindee]] to mitigate flows from the Darling River.<br /> <br /> The north of the Darling River is in the [[Southeast Australia temperate savanna]] [[ecoregion]] and the southwest of the Darling is part of the [[Murray Darling Depression]] ecoregion.<br /> <br /> ===Population centres===<br /> <br /> Major settlements along the river include Brewarrina, [[Bourke, New South Wales|Bourke]], [[Louth, New South Wales|Louth]], [[Tilpa, New South Wales|Tilpa]], [[Wilcannia, New South Wales|Wilcannia]], [[Menindee, New South Wales|Menindee]], [[Pooncarie, New South Wales|Pooncarie]] and [[Wentworth, New South Wales|Wentworth]]. Wentworth was Australia's busiest inland port in the late 1880s.&lt;ref name=&quot;drthr&quot;/&gt;<br /> <br /> Navigation by [[steamboat]] to Brewarrina was first achieved in 1859.&lt;ref name=&quot;tdr&quot;&gt;{{cite web|url=http://www.centraldarling.nsw.gov.au/about/1001.html |title=The Darling River |publisher=Central Darling Shire Council |access-date=30 October 2010 |url-status=dead |archive-url=https://web.archive.org/web/20110215213644/http://centraldarling.nsw.gov.au/about/1001.html |archive-date=15 February 2011 |df=dmy }}&lt;/ref&gt; Brewarrina was also the location of intertribal meetings for [[Indigenous Australians]] who speak [[Darling language|Darling]] and live in the river basin. Ancient [[fish trap]]s in the river provided food for feasts. These [[Brewarrina Aboriginal Fish Traps|heritage listed rock formations]] have been estimated at more than 40,000 years old making them the oldest man-made structure on the planet.&lt;ref name=&quot;drthr&quot;/&gt;<br /> <br /> ==In popular culture==<br /> Australian poet [[Henry Lawson]] wrote a well-known ironic tribute to the Darling River.&lt;ref&gt;{{cite web | last = Lawson | first = Henry | title = The Darling River | publisher = Classic Reader | url = http://www.classicreader.com/read.php/bookid.741/sec./ | access-date = 2008-05-28}}&lt;/ref&gt; To quote another Henry Lawson poem:<br /> <br /> {{quote|The skies are brass and the plains are bare, &lt;br /&gt;Death and ruin are everywhere;&lt;br /&gt;And all that is left of the last year's flood&lt;br /&gt;Is a sickly stream on the grey-black mud;&lt;br /&gt;The salt-springs bubble and the quagmires quiver,&lt;br /&gt;And this is the dirge of the Darling River.|Henry Lawson}}<br /> <br /> He also wrote about the river in ''[[The Union Buries Its Dead]]'' and &quot;Andy's Gone With Cattle&quot;. Other [[bush poet]]s who have written about the river include Scots-Australian [[William Henry Ogilvie|Will H. 
Ogilvie]] (1869–1963) and [[Breaker Morant]] (1864–1902).&lt;ref name=&quot;bsct&quot;&gt;{{cite web|url=http://www.bourke.local-e.nsw.gov.au/tourism/13731/1018.html |title=The Darling River |publisher=Bourke Shire Council |access-date=31 January 2011 |url-status=dead |archive-url=https://web.archive.org/web/20110216033255/http://www.bourke.local-e.nsw.gov.au/tourism/13731/1018.html |archive-date=16 February 2011 |df=dmy }}&lt;/ref&gt;<br /> <br /> The Australian band [[Midnight Oil]] wrote a song called &quot;The Barka-Darling River&quot; for their album [[Resist (Midnight Oil album)|''Resist'']], drawing attention to the negative effects of cotton farming on the environment and people connected to the river.<br /> <br /> == Gallery ==<br /> &lt;gallery caption=&quot;&quot; class=&quot;center&quot;&gt;<br /> Image:Bourke_Darling_River.jpg|The Darling River from Bourke wharf (2010)<br /> Image:Old bridge over Darling in Bourke.JPG|Old North Bourke Bridge, opened in 1883 (2014)<br /> Image:Bridge over the Darling at North Bourke-1 (5141753186).jpg| Lifting span of the old North Bourke Bridge<br /> File:AU-NSW-North Bourke-Old North Bourke bridge northside-2021.jpg|Old North Bourke bridge, in flood, northern side, North Bourke (2021)<br /> File:AU-NSW-North Bourke-Old North Bourke bridge southside-2021.jpg|Old North Bourke bridge, in flood, southern side, North Bourke (2021)<br /> &lt;/gallery&gt;<br /> <br /> ==See also==<br /> {{stack|{{Portal|New South Wales|Environment|Water}}}}<br /> *[[Darling River hardyhead]]<br /> *[[Great Darling Anabranch]]<br /> *{{Section link|List of rivers of Australia|New South Wales}}<br /> *[[List of Darling River distances]]<br /> *[[Water security in Australia]]<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==External links==<br /> {{Commons category|Darling River|&lt;br/&gt;Darling River}}<br /> * {{cite web|url=http://www.environment.nsw.gov.au/ieo/MacquarieBogan/maplg.htm|title=Macquarie-Bogan River catchment|format=map|work=Office of Environment and Heritage|publisher=[[Government of New South Wales]]}}<br /> * {{cite web|url=http://www.environment.nsw.gov.au/ieo/FarWest/maplg.htm|title=Barwon, Darling and Far Western catchments|format=map|work=Office of Environment and Heritage|publisher=[[Government of New South Wales]]}}<br /> * [http://www.news.com.au/dailytelegraph/story/0,22049,21854160-5006009,00.html &quot;A river runs through it&quot;] Daily Telegraph article – 6 June 2007<br /> * [https://www.flickr.com/photos/19959143@N00/sets/72157594202630154/detail Photos of the Darling/Barwon river between Brewarrina and Bourke, taken over 2003–2006.] Flickr<br /> <br /> {{Rivers of the Darling River catchment |state=autocollapse}}<br /> {{Rivers of New South Wales |state=autocollapse}}<br /> {{Rivers of the Murray–Darling basin|state=collapsed}}<br /> <br /> {{Authority control}}<br /> <br /> [[Category:Darling River| ]]<br /> [[Category:Rivers in the Riverina]]<br /> [[Category:Far West (New South Wales)]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Dispilio_Tablet&diff=1170907766 Dispilio Tablet 2023-08-17T22:44:38Z <p>205.189.94.9: </p> <hr /> <div>{{short description|Wooden tablet bearing inscribed markings, discovered in Dispilio, Greece}}<br /> [[File:Dispilio signs.jpg|thumb|right|A: samples of carved &quot;signs&quot; on the wooden Dispilio tablet and clay finds from Dispilio, Greece. B: samples of Linear A signs. 
C: samples of signs on Paleo-European clay tablets.]]<br /> <br /> The '''Dispilio tablet''' is a wooden tablet bearing inscribed markings, unearthed during [[George Hourmouziadis]]'s excavations of [[Dispilio]] in [[Greece]], and [[carbon 14]]-dated to 5202 (± 123) BC.&lt;ref&gt;{{cite journal |last1=Facorellis |first1=Yorgos |last2=Sofronidou |first2=Marina |last3=Hourmouziadis |first3=Giorgos |year=2014 |title=Radiocarbon dating of the Neolithic lakeside settlement of Dispilio, Kastoria, Northern Greece |journal=Radiocarbon |doi=10.2458/56.17456 |volume=56 |number=2 |pages=511–528 |s2cid=128879693 |url=https://journals.uair.arizona.edu/index.php/radiocarbon/article/view/17456/pdf}}&lt;/ref&gt; It was discovered in 1993 in a [[Neolithic]] lakeshore settlement that occupied an artificial island&lt;ref&gt;Whitley, James. &quot;Archaeology in Greece 2003–2004&quot;. ''Archaeological Reports'', No. 50 (2003, pp. 1–92), p. 43.&lt;/ref&gt; near the modern village of Dispilio on Lake Kastoria in [[Kastoria (regional unit)|Kastoria]], [[Western Macedonia]], [[Greece]].<br /> <br /> ==Discovery==<br /> {{further|Dispilio}}<br /> The lake settlement itself was discovered during the dry winter of 1932, which lowered the lake level and revealed traces of the settlement. A preliminary survey was made in 1935 by [[Antonios Keramopoulos]]. Excavations began in 1992, led by George Hourmouziadis, professor of prehistoric archaeology at the [[Aristotle University of Thessaloniki]]. The site appears to have been occupied over a long period, from the final stages of the [[Neolithic|Middle Neolithic]] (5600–5000 BC) to the Final Neolithic (3000 BC). A number of items were found, including ceramics, wooden structural elements, the remains of wooden walkways,&lt;ref&gt;Similar walkways have been found on the [[Somerset Levels]] (Whitley 2003:43).&lt;/ref&gt; seeds, bones, figurines, personal ornaments, flutes and a tablet with marks on it.<br /> <br /> The tablet's discovery was announced at a symposium in February 1994 at the [[University of Thessaloniki]].&lt;ref&gt;Owens, Gareth A. &quot;Balkan Neolithic Scripts&quot;, vol. 38, no. 1-2, 1999, pp. 114-120&lt;/ref&gt; The site's paleoenvironment, botany, fishing techniques, tools and ceramics were described informally in a magazine article in 2000,&lt;ref&gt;''Eptakyklos: literary and archaeological magazine'', June 2000&lt;/ref&gt; and by Hourmouziadis in 2002.{{citation needed|date=February 2022}}<br /> <br /> The tablet itself was partially damaged when it was exposed to the oxygen-rich environment outside of the mud and water in which it had been immersed for a long period of time, and so it was placed under conservation. {{Asof|2023}}, the full academic publication of the tablet apparently awaits the completion of conservation work.<br /> <br /> ==See also==<br /> {{Portal|Greece}}<br /> {{columnslist|colwidth=22em|<br /> *[[Phaistos Disc]]<br /> *[[Arkalochori Axe]]<br /> *[[Vinča culture]]<br /> *[[Vinča symbols]] (sometimes referred to as the Old European script)<br /> *[[Tărtăria tablets]]<br /> *[[Neolithic Europe]]<br /> *[[Sitovo inscription]]<br /> }}<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==Sources==<br /> *G. H. Hourmouziadis, ed., ''Dispilio, 7500 Years After''. Thessaloniki, 2002.<br /> *G. H. Hourmouziadis, ''Ανασκαφής Εγκόλπιον''. 
Athens, 2006.<br /> <br /> ==External links==<br /> *[http://anaskamma.wordpress.com/ Anaskamma, an academic journal of the excavational team]<br /> <br /> [[Category:6th-millennium BC works]]<br /> [[Category:1993 archaeological discoveries]]<br /> [[Category:1993 in Greece]]<br /> [[Category:Archaeology of Greece]]<br /> [[Category:Inscriptions in undeciphered writing systems]]<br /> [[Category:Inscriptions in unknown languages]]<br /> [[Category:Neolithic]]<br /> [[Category:Neolithic Macedonia (region)]]<br /> [[Category:Pre-Indo-Europeans]]<br /> [[Category:Proto-writing]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=T%C4%83rt%C4%83ria_tablets&diff=1170907611 Tărtăria tablets 2023-08-17T22:43:25Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|Neolithic artefacts purported to contain writing}}<br /> {{more footnotes needed|date=February 2022}}<br /> [[File:Tartaria amulet retouched.PNG|thumb|[[Neolithic]] clay [[amulet]] (retouched), part of the Tărtăria tablets set, supposedly dated to {{circa|5500–2750 BC}} and associated with the [[Turdaş-Vinča culture]].|248x248px]]<br /> <br /> The '''Tărtăria tablets''' ({{IPA-ro|tərtəˈri.a}}) are three [[clay tablet|tablets]], reportedly discovered in 1961 at a [[Neolithic]] site in the village of [[Tărtăria |Tărtăria (Alsótárlaka)]] (about {{convert|30|km|mi|0|abbr=on}} from [[Alba Iulia]]), in Transylvania.{{sfn|Merlini|Lazarovici|2008|p=111}}<br /> <br /> The tablets bear incised symbols associated with the [[Text corpus|corpus]] of the [[Vinča symbols]] and have been the subject of considerable controversy among [[archaeologist]]s, some of whom have argued that the symbols represent the earliest known form of [[writing]] in the world. Accurately dating the tablets is difficult as the [[stratigraphy]] pertaining to their discovery is disputed, and a heat treatment performed after their discovery has prevented the possibility of directly [[radiocarbon dating]] the tablets.{{sfn|Merlini|Lazarovici|2008|pp=115-124}}<br /> <br /> Based on the account of their discovery which associates the tablets with the [[Vinča culture]] and on indirect radiocarbon evidence, some scientists propose that the tablets date to around {{circa|5300 BC}}, predating Mesopotamian [[pictograph]]ic [[proto-writing]].{{sfn|Merlini|Lazarovici|2008|pp=111, 131}} Some scholars have disputed the authenticity of the account of their discovery, suggesting the tablets are an intrusion from the upper strata of the site.{{sfn|Merlini|Lazarovici|2008|pp=132–134}} Other scholars, contesting the radiocarbon dates for Neolithic Southeastern Europe, have suggested that Tărtăria signs are in some way related to Mesopotamian proto-writing, particularly Sumerian [[proto-cuneiform]], which they argued was contemporary.{{sfn|Merlini|Lazarovici|2008|pp=125–130}}<br /> <br /> == Discovery ==<br /> [[File:Tartaria2.jpg|thumb|One of the Tărtăria tablets]]<br /> In 1961, members of a team led by Nicolae Vlassa (an archaeologist at the [[National Museum of Transylvanian History]], [[Cluj-Napoca]]) reportedly unearthed three inscribed but [[Pottery#Firing|unfired]] [[clay tablet]]s, twenty-six clay and stone [[figurine]]s, a shell bracelet, and the burnt,{{dubious|See discussion page: NOT burnt, wrong and outdated claim.|date=October 2018}} broken, and disarticulated bones of an adult female sometimes referred to as &quot;Milady 
Tărtăria&quot;.{{sfn|Merlini|Lazarovici|2008|pp=111–117, 166-180}}<br /> <br /> There is no consensus on the interpretation of the burial, but it has been suggested that the body was likely that of a respected local wise-person, [[shaman]], or spirit-medium.&lt;ref name=&quot;Whittle&quot; /&gt;<br /> <br /> === Disputed authenticity ===<br /> It is disputed whether the tablets were actually found at the reported site, and Vlassa never discussed the circumstances of the find of the stratigraphy.{{sfn|Merlini|Lazarovici|2008|pp=111–117}}<br /> <br /> The authenticity of the engravings has also been disputed. A recent claim of [[forgery]] is based on the similarity between some of the symbols and reproductions of Sumerian symbols in popular Romanian literature available at the time of the discovery.&lt;ref&gt;Qasim, Erika: ''Die Tărtăria-Täfelchen – eine Neubewertung''. In: ''Das Altertum'', {{ISSN|0002-6646}}, vol. 58, 4 (2013), p. 307–318&lt;/ref&gt;<br /> <br /> == Description ==<br /> [[File:Tatárlaka.PNG|thumb|upright=1.2|Illustrations of each of three tablets]]<br /> Two of the tablets are rectangular and the third is round.{{sfn|Merlini|Lazarovici|2008|p=116}} They are all small, the round one being only {{convert|6|cm|in|frac=2|abbr=on}} across, and two—the round one and one rectangular tablet—have holes drilled through them. All three have symbols inscribed on only one face.{{sfn|Merlini|Lazarovici|2008|p=116}} The unpierced rectangular tablet depicts a horned animal, an unclear figure, and a vegetal motif such as a branch or tree. The others consist of a variety of mainly abstract symbols.&lt;ref name=&quot;Whittle&quot;&gt;[[Alasdair W. R. Whittle]], ''Europe in the Neolithic: The Creation of New Worlds'', p. 101. Cambridge University Press, 1996.&lt;/ref&gt;<br /> <br /> === Dating ===<br /> Workers at the conservation department of the [[Cluj]] museum baked the originally unbaked clay tablets in order to preserve them, making it impossible to directly date the tablets with the [[radiocarbon dating|carbon 14 method]].{{sfn|Merlini|Lazarovici|2008|pp=118–119}}<br /> <br /> The tablets are generally believed to have belonged to the [[Vinča culture|Vinča-Turdaș culture]], which was originally thought to have originated around 2700 BCE by Serbian and Romanian archaeologists. The discovery garnered attention from the archeological world because it predates the first [[Minoan civilisation|Minoan]] writing, the oldest known writing in Europe.<br /> <br /> Subsequent [[radiocarbon dating]] of the other Tărtăria finds, extended by association also to the tablets, pushed the date of the site (and therefore of the whole Vinča culture) to approximately 5500 BCE, the time of the early [[Eridu]] phase of the [[Sumer]]ian civilization in [[Mesopotamia]].&lt;ref name=&quot;Becker&quot;&gt;Carl J. Becker, ''A Modern Theory Of Language Evolution'', p. 346 (iUniverse, 2004).&lt;/ref&gt; Still, this is disputed in light of apparently contradictory [[Stratification (archeology)|stratigraphic]] evidence.&lt;ref&gt;[[H. W. F. Saggs]], ''Civilization Before Greece and Rome'', p. 75 (Yale University Press, 1998).&lt;/ref&gt;<br /> <br /> It has been controversially claimed that if the symbols are indeed a form of writing, then writing in the [[Danubian culture]] would far predate the earliest Sumerian [[cuneiform script]] or [[Egyptian hieroglyphs]]. 
Thus, they would be the world's earliest known form of writing.<br /> <br /> == Historical context ==<br /> <br /> === Hypothesis of Danubian culture ===<br /> The term [[Danubian culture]] was proposed by [[V. Gordon Childe]] to describe the first agrarian society in central and eastern Europe. This hypothesis and the appearance of writing in this space is supported by Marco Merlini,&lt;ref&gt;Marco Merlini &quot;La scrittura è natta in Europa&quot;, Avverbi, Roma, 2004&lt;/ref&gt; [[Harald Haarmann]], Joan Marler,&lt;ref&gt;Harald Haarmann, Joan Marler, An introduction to the study of the Danube Script, Journal of Archeomythology, Vol.4, 2008&lt;/ref&gt; Gheorghe Lazarovici,&lt;ref&gt;Gheorghe Lazarovici, Cornelia-Magda Lazarovici, Marco Merlini. TĂRTĂRIA and the sacred tablets, Editura Mega, Cluj-Napoca, 2011 {{ISBN|978-606-543-160-7}}&lt;/ref&gt; and many others.<br /> <br /> === Proposed links to Sumerian culture ===<br /> [[Colin Renfrew]] argues that the apparent similarities with Sumerian symbols are deceptive: {{blockquote|&quot;To me, the comparison made between the signs on the Tărtăria tablets and those of proto-literate Sumeria carry very little weight. They are all simple pictographs, and a sign for a goat in one culture is bound to look much like the sign for a goat in another. To call these Balkan signs 'writing' is perhaps to imply that they had an independent significance of their own communicable to another person without oral contact. This I doubt.&quot;&lt;ref&gt;[[Colin Renfrew]], ''Before civilization: The radiocarbon revolution and prehistoric Europe'', p. 186 (Jonathan Cape, 1973)&lt;/ref&gt;}}<br /> <br /> === Possibly related finds in the region ===<br /> <br /> ==== Artifacts bearing Vinča symbols ====<br /> {{main|Vinča symbols}}<br /> The Vinča symbols have been known since the late 19th century excavation by [[Zsófia Torma]] (1832–1899){{sfn|Gimbutas|2001|p=50}} at the Neolithic site of [[Turdaș]] (Hungarian: ''Tordos'') in [[Transylvania]], at the time part of [[Austria-Hungary]], the [[type site]] of the [[Tordos culture]], a late, regional variation of the Vinča culture.<br /> <br /> ==== Other artifacts ====<br /> This group of artifacts, including the tablets, have some relation with the culture developed in the [[Black Sea]] – [[Aegean Sea]] area. Similar artefacts have been found in [[Bulgaria]] (e.g. the [[Gradeshnitsa tablets]]) and northern [[Macedonia (Greece)|Greece]] (the [[Dispilio Tablet]]). 
The material and the style used for the Tartaria artefacts show some similarities to those used in the [[Cyclades]] area, as two of the statuettes are made of alabaster.{{original research inline|date=February 2022}}{{Citation needed|date=December 2011}}<br /> <br /> == Purpose and meaning ==<br /> The meaning (if any) of the symbols is unknown, and their nature has been the subject of much debate.<br /> <br /> === Writing system ===<br /> Scholars who conclude that the inscribed symbols are writing are basing their assessment on a few assumptions that are not universally endorsed:<br /> *The existence of similar signs on other artifacts of the Danube civilization suggest that there was an inventory of standard shapes used by [[scribe]]s.<br /> *The symbols are highly standardised and have a rectilinear shape comparable to that manifested by archaic [[writing system]]s.<br /> *The information communicated by each character was specific, with an unequivocal meaning.<br /> *The inscriptions are sequenced in rows, whether horizontal, vertical, or circular.<br /> <br /> If they do comprise a script, it is not known what kind of writing system they represent. Vlassa interpreted one of the Tărtăria tablets as a hunting scene and the other two with signs as a kind of primitive writing similar to the early pictograms of the [[Sumer]]ians.{{citation needed|date=October 2018}} Some archaeologists who support the idea that they do represent writing, notably [[Marija Gimbutas]], have proposed that they are fragments of a system dubbed the [[Old European Script]].<br /> <br /> === Non-linguistic signs ===<br /> One problem is the lack of independent indications of literacy existing in the Balkans at this period. Sarunas Milisauskas comments that &quot;it is extremely difficult to demonstrate archaeologically whether a corpus of symbols constitutes a writing system&quot; and notes that the first known writing systems were all developed by early states to facilitate record-keeping in complex organised societies in the Middle East and Mediterranean. There is no evidence of organised states in the European Neolithic, thus it is unlikely they would have needed the administrative systems facilitated by writing. David Anthony notes that [[Chinese character]]s were first used for ritual and commemorative purposes associated with the 'sacred power' of kings; it is possible that a similar usage accounts for the Tărtăria symbols.&lt;ref&gt;Sarunas Milisauskas, ''European Prehistory: A Survey'', pp. 236–37 (Kluwer Academic / Plenum Publishers, 2002)&lt;/ref&gt; Some scholars have suggested that the symbols may have been used as [[House mark|marks of ownership]] or as the focus of religious rituals.&lt;ref name=&quot;Becker&quot;/&gt;<br /> <br /> An alternative suggestion is that they may have been merely uncomprehending imitations of more advanced cultures, although this explanation is made rather unlikely by the great antiquity of the tablets—there were no known literate cultures at the time from which the symbols could have been adopted.&lt;ref name=&quot;Becker&quot;/&gt;<br /> <br /> Others consider the [[pictogram]]s to be accompanied by random scribbles.{{dubious|So: there are indeed pictograms, but they are &quot;accompanied by random scribbles&quot;? 
Clearer pls.!|date=October 2018}}&lt;ref name=&quot;Becker&quot;/&gt;<br /> <br /> == See also ==<br /> {{columnslist|colwidth=30em|<br /> * [[Gradeshnitsa tablets]]<br /> * [[Dispilio Tablet]]<br /> * [[Sitovo inscription]]<br /> * [[Prehistoric Romania]]<br /> * [[Prehistory of Transylvania]]<br /> * [[Cucuteni–Trypillian culture]]<br /> }}<br /> <br /> == References ==<br /> {{reflist}}<br /> <br /> == Sources ==<br /> {{refbegin}}<br /> * {{Citation | last = Evans | first = A | year = 1895 | title = Cretan pictographs and prae-Phoenician script. With an account of a sepulchral deposit at Hagios Onuphrios near Phaestos in its relation primitive Cretan and Aegean culture | publisher= G.P.Putnams sons| pages = 166}}<br /> * Falkenstein, A. (1965) Zu den Tontafeln aus Târtària, Germania 43 : 269–273.<br /> * {{cite book |author-last=Gimbutas |author-first=Marija |author-link=Marija Gimbutas |editor-last=Dexter |editor-first=Miriam Robbins |title=The Living Goddesses |year=2001 |orig-year=1999 |publisher=University of California Press |location=Berkeley |isbn=9780520229150}}<br /> * {{Citation | last = Haarmann | first = H | year = 1990 | contribution = Writing from Old Europe | title = The Journal of Indo-European Studies | issue = 17}}<br /> * {{Citation | last = Jongbloed | first = Dominique | year = 2011 | title = Civilisations antédiluviennes | publisher = Alphée ed | language = French}}<br /> * Kenanidis, I.; Papakitsos, G. (2015) A Comparative Linguistic Study about the Sumerian Influence on the Creation of the Aegean Scripts.<br /> * Klára, Friedrich The Mystery of Tatárlaka (Dobogó-Historical journal, 2004/9.-2005/6.)<br /> * Klára, Friedrich (2005) - Szakács Gábor: Graved in stone, carved in wood...<br /> * {{Citation | last = Makkay | first = J | year = 1969 | contribution = The Late Neolithic Tordos Group of Signs | title = Alba Regia | issue = 10 | pages = 9–50}}.<br /> * {{Citation | last = Makkay | first = J | year = 1984 | title = Early Stamp Seals in South-East Europe | place = Budapest}}.<br /> * Mandics, Gy., Záhonyi, A.: The message oh Tartaria and Tordos. Fríg (Pilisvörösvár, Hungary), 2018.<br /> * {{cite journal|last1 = Merlini|first1 = Marco|last2 = Lazarovici|first2 = Gheorghe|title = Settling discovery circumstances, dating and utilization of the Tărtăria tablets|journal = Acta Terrae Septemcastrensis|url = http://arheologie.ulbsibiu.ro/publicatii/ats/ats8/acta%207.pdf|volume = VII|year = 2008|publisher = [[Lucian Blaga University of Sibiu]]|location = Sibiu, Romania|issn = 1583-1817}}<br /> * {{cite journal |last1=Paliga |first1=Sorin |title=The tablets of Tǎrtǎria. An enigma? A reconsideration and further perspectives |journal=Dialogues d'histoire ancienne |year=1993 |volume=19 |issue=1 |pages=9–43 |doi=10.3406/dha.1993.2073 |url=http://www.persee.fr/doc/dha_0755-7256_1993_num_19_1_2073}}<br /> * Schmandt-Besserat, Denise (1992) Before Writing, University of Texas Press, Austin. Volume I: From Counting to Cuneiform.<br /> * Vaiman, A. A. (1994) On the Quasi-Sumerian tablets from Tartaria. Археологические вести. Спб, 1994. Вып. 3. Аннотации. 
— ИИМК РАН<br /> * {{Citation | last = Winn | first = Sham MM | year = 1973 | title = The Signs of the Vinca Culture}}.<br /> * {{Citation | last = Winn | first = Sham MM | year = 1981 | title = Pre-writing in Southeast Europe: The Sign System of the Vinca culture | publisher = BAR}}.<br /> {{refend}}<br /> <br /> == External links ==<br /> {{Sister links|auto=yes}}<br /> * {{Citation | url = http://www.prehistory.it/ftp/arta_populara01.htm | title = Signs on Tărtăria Tablets found in the Romanian folkloric art | first = Ioana | last = Crişan | publisher = PreHistory | place = IT}}.<br /> * {{Citation | url = http://www.prehistory.it/ftp/tartaria_tablets/merlinitartaria.htm | title = Milady Tărtăria and the discovery of the Tărtăria Tablets | first = Marco | last = Merlini | publisher = PreHistory | place = IT}}.<br /> <br /> {{DEFAULTSORT:Tartaria tablets}}<br /> [[Category:Neolithic]]<br /> [[Category:Archaeological sites in Romania]]<br /> [[Category:Prehistory of Southeastern Europe]]<br /> [[Category:Inscriptions in undeciphered writing systems]]<br /> [[Category:Clay tablets]]<br /> [[Category:Pre-Indo-Europeans]]<br /> [[Category:Proto-writing]]<br /> [[Category:Vinča culture]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Jiahu_symbols&diff=1170907481 Jiahu symbols 2023-08-17T22:42:20Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|Ancient carvings on artifacts in China}}<br /> {{Use American English|date = February 2019}}<br /> [[Image:Jiahu writing.svg|right|thumb|100px|Example of Jiahu symbols.]]<br /> The '''Jiahu symbols''' ({{zh|s=贾湖契刻符号|t=賈湖契刻符號|p=Jiǎhú qìkè fúhào}}) consist of 16 distinct markings on [[Prehistory|prehistoric]] artifacts found in [[Jiahu]], a [[List of Neolithic cultures of China|neolithic]] [[Peiligang culture]] site found in [[Henan]], [[People's Republic of China|China]], and excavated in 1989. The Jiahu symbols are dated to around 6000&amp;nbsp;BC.&lt;ref&gt;{{cite book |last1=Underhill |first1=Anne P. |title=A Companion to Chinese Archaeology |date=2013 |publisher=John Wiley &amp; Sons |isbn=978-1-118-32578-0 |page=248 |url=https://books.google.com/books?id=I3XG3H_WlM8C&amp;pg=PT248 |language=en}}&lt;/ref&gt; The archaeologists who made the original finds believed the markings to be similar in form to some characters used in the much later [[oracle bone script]] (e.g. similar markings of {{lang|zh-Hani|目}} &quot;eye&quot;, {{lang|zh-Hani|日}} &quot;sun; day&quot;), but most doubt that the markings represent systematic writing.&lt;ref&gt;{{cite news |title='Earliest writing' found in China |first=Paul |last=Rincon |date=17 April 2003 |url=http://news.bbc.co.uk/2/hi/science/nature/2956925.stm |work=BBC News }}&lt;/ref&gt; A 2003 report in ''[[Antiquity (journal)|Antiquity]]'' interpreted them &quot;not as writing itself, but as features of a lengthy period of sign-use which led eventually to a fully-fledged system of writing.&quot;&lt;ref&gt;{{cite journal |last1=Li|first1=X |first2=Garman|last2=Harbottle |author3=Zhang Juzhong |author4=Wang Changsui |url=https://www.researchgate.net/publication/273292851 |title=The earliest writing? Sign use in the seventh millennium BC at Jiahu, Henan Province, China |year=2003 |journal=Antiquity |volume=77 |issue=295 |pages=31–44|doi=10.1017/S0003598X00061329 |s2cid=162602307 }}&lt;/ref&gt; The earliest known body of writing in the oracle bone script dates much later to the reign of the late [[Shang dynasty]] king [[Wu Ding]], which started in about c. 
1250 BC&lt;ref&gt;{{cite book |title=The Origin and Early Development of the Chinese Writing System |first=William G. |last=Boltz |orig-year=1994 |year=2003 |series=American Oriental Series |volume=78 |publisher=American Oriental Society |location=New Haven, Connecticut, USA |isbn=978-0-940490-18-5 |page=31}}&lt;/ref&gt; or 1200 BC, BCE being before common era, about 3200-3275 years before current day.&lt;ref name=&quot;Keightley&quot;&gt;{{cite book |last1=Keightley |first1=David N. |title=Sources of Shang History: The Oracle-bone Inscriptions of Bronze Age China |date=1985 |publisher=University of California Press |isbn=978-0-520-05455-4 |page=3 |url=https://books.google.com/books?id=8j3pPZqFQVkC&amp;q=1200&amp;pg=PP1 |access-date=31 May 2020 |language=en}}&lt;/ref&gt;<br /> <br /> ==See also==<br /> *[[Gudi (instrument)]]<br /> *[[Neolithic signs in China]]<br /> *[[Undeciphered writing systems]]<br /> <br /> ==References==<br /> {{Reflist}}<br /> <br /> [[Category:Prehistoric China]]<br /> [[Category:Proto-writing]]<br /> [[Category:Undeciphered writing systems]]<br /> <br /> <br /> {{china-hist-stub}}<br /> {{writingsystem-stub}}</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Deneb&diff=1170907020 Deneb 2023-08-17T22:38:51Z <p>205.189.94.9: </p> <hr /> <div>{{short description|Star in the constellation Cygnus}}<br /> {{About|the star}}<br /> {{Starbox begin}}<br /> {{Starbox image<br /> | image=<br /> {{Location mark<br /> | image=Cygnus constellation map.svg<br /> | float=center | width=250 | position=right<br /> | mark=Red circle.svg | mark_width=10 | mark_link=Deneb (star)<br /> | x%=32.5 | y%=36.9<br /> }}<br /> | caption=Location of Deneb (circled)<br /> }}<br /> {{Starbox observe<br /> | epoch=J2000<br /> | constell=[[Cygnus (constellation)|Cygnus]]<br /> | pronounce={{IPAc-en|'|d|ɛ|n|ɛ|b|}}, {{IPAc-en|'|d|ɛ|n|ə|b|}}&lt;ref name=merriam&gt;{{cite book|author=Merriam-Webster, Inc|title=Merriam-Webster's Collegiate Dictionary|url=https://books.google.com/books?id=53-PQgAACAAJ|year=1998|publisher=Merriam-Webster|isbn=978-0-87779-714-2}}&lt;/ref&gt;<br /> | ra={{RA|20|41|25.9}}&lt;ref name=hipparcos/&gt;|dec={{DEC|+45|16|49}}&lt;ref name=hipparcos/&gt;<br /> | appmag_v=1.25&lt;ref name=ducati&gt;{{cite journal|bibcode=2002yCat.2237....0D|title=VizieR On-Line Data Catalog: Catalogue of Stellar Photometry in Johnson's 11-color system|journal=CDS/ADC Collection of Electronic Catalogues|volume=2237|pages=0|last1=Ducati|first1=J. R.|year=2002}}&lt;/ref&gt; {{nowrap|(1.21–1.29&lt;ref name=gcvs&gt;{{cite journal|bibcode=2009yCat....102025S|title=VizieR Online Data Catalog: General Catalogue of Variable Stars (Samus+ 2007–2013)|journal=VizieR On-Line Data Catalog: B/GCVS. Originally Published in: 2009yCat....102025S|volume=1|pages=02025|last1=Samus|first1=N. N.|last2=Durlevich|first2=O. V.|year=2009|display-authors=etal}}&lt;/ref&gt;)}}<br /> }}<br /> {{Starbox character<br /> | class=A2&amp;nbsp;Ia&lt;ref name=baas25_1319&gt;{{Cite journal |last1=Garrison |first1=R. F. 
|title=Anchor Points for the MK System of Spectral Classification |journal=[[Bulletin of the American Astronomical Society]] |volume=25 |page=1319 |year=1993 |bibcode=1993AAS...183.1710G |url=http://www.astro.utoronto.ca/~garrison/mkstds.html |access-date=2012-02-04 |archive-date=2019-06-25 |archive-url=https://web.archive.org/web/20190625094716/http://www.astro.utoronto.ca/~garrison/mkstds.html |url-status=dead }}&lt;/ref&gt;<br /> | b-v=+0.09&lt;ref name=ducati/&gt;<br /> | u-b=&amp;minus;0.23&lt;ref name=ducati/&gt;<br /> | variable=[[Alpha Cygni variable|Alpha Cygni]]&lt;ref name=gcvs/&gt;<br /> }}<br /> {{Starbox astrometry<br /> | radial_v=&amp;minus;4.5&lt;ref name=pulkovo&gt;{{cite journal|bibcode=2006AstL...32..759G|title=Pulkovo Compilation of Radial Velocities for 35 495 Hipparcos stars in a common system|journal=Astronomy Letters|volume=32|issue=11|pages=759–771|last1=Gontcharov|first1=G. A.|year=2006|doi=10.1134/S1063773706110065|arxiv = 1606.08053 |s2cid=119231169}}&lt;/ref&gt;<br /> | prop_mo_ra=1.99&lt;ref name=hipparcos/&gt;<br /> | prop_mo_dec=1.95&lt;ref name=hipparcos/&gt;<br /> | parallax=2.29<br /> | p_error=0.32<br /> | parallax_footnote=&lt;ref name=hipparcos/&gt;<br /> | dist_ly={{val|2,615|215|fmt=commas}}<br /> | dist_pc={{val|802|66}}&lt;ref name=schiller/&gt;<br /> | absmag_v=&amp;minus;8.38&lt;ref name=schiller/&gt;<br /> }}<br /> {{Starbox detail<br /> |source=&lt;ref name=schiller&gt;{{cite journal<br /> | last1=Schiller | first1=F.<br /> | last2=Przybilla | first2=N.<br /> | date=2008<br /> | title=Quantitative spectroscopy of Deneb<br /> | journal=[[Astronomy &amp; Astrophysics]]<br /> | volume=479 | issue=3 | pages=849–858<br /> | arxiv=0712.0040<br /> | bibcode=2008A&amp;A...479..849S<br /> | doi=10.1051/0004-6361:20078590<br /> | s2cid=119225384<br /> }}&lt;/ref&gt;<br /> | mass = {{Val|19|4}}<br /> | radius = {{Val|203|17}}<br /> | luminosity = {{Val|196000|32000|fmt=commas}}<br /> | temperature = {{Val|8525|75|fmt=commas}}<br /> | rotational_velocity = {{Val|20|2}}<br /> | gravity = {{Val|1.10|0.05}}<br /> | metal_fe = −0.25<br /> | age_myr=<br /> }}<br /> {{Starbox catalog<br /> | names={{odlist | name=Arided | name2=Aridif | name3=Gallina | name4=Arrioph | B=α Cygni | F=50 Cygni | BD=+44°3541 | FK5=777 | HD=197345 | HIP=102098 | HR=7924 | SAO=49941 }}<br /> }}<br /> {{Starbox reference<br /> | Simbad=Deneb<br /> }}<br /> {{Starbox end}}<br /> <br /> '''Deneb''' ({{IPAc-en|ˈ|d|ɛ|n|ɛ|b}}) is a [[first-magnitude star]] in the [[constellation]] of [[Cygnus (constellation)|Cygnus]]. Deneb is one of the vertices of the [[Asterism (astronomy)|asterism]] known as the [[Summer Triangle]] and the &quot;head&quot; of the [[Northern Cross (asterism)|Northern Cross]]. It is the brightest [[star]] in Cygnus and the 19th [[List of brightest stars|brightest star]] in the [[night sky]], with an average [[apparent magnitude]] of +1.25. A blue-white [[supergiant]], Deneb rivals [[Rigel]] as the most luminous [[first-magnitude star]]. However, its distance, and hence luminosity, is poorly known; its [[luminosity]] is somewhere between 55,000 and 196,000 times [[Solar luminosity|that of the Sun]]. Its [[Bayer designation]] is '''α Cygni''', which is [[Latinisation of names|Latinised]] to '''Alpha Cygni''', abbreviated to '''Alpha Cyg''' or '''α Cyg'''. 
Multiple independent methods of observation confirm Deneb to be a highly luminous star.<br /> <br /> As methods of cataloguing and organising stellar data improve, the degree to which quantitative studies of this class of stars as a whole can be relied upon will also improve. A central goal of star catalogues, atlases and lists of benchmark objects is to establish a stable, verifiable database with as little bias and error as possible. Long-term observational records show Deneb to have been broadly stable, apart from the small pulsations described below, and its brightness and well-studied spectrum make it a convenient reference object for further work in astrophysics.<br /> <br /> ==Nomenclature==<br /> <br /> [[File:CygnusCC.jpg|thumb|left|upright|Deneb is the brightest star in the constellation of Cygnus (top)]]<br /> ''α Cygni'' (Latinised to ''Alpha Cygni'') is the star's [[Bayer designation|designation]] given by [[Johann Bayer]] in 1603. The traditional name ''Deneb'' is derived from the [[Arabic]] word for &quot;tail&quot;, from the phrase ذنب الدجاجة ''Dhanab al-Dajājah'', or &quot;tail of the hen&quot;.&lt;ref name=allen/&gt; The [[IAU Working Group on Star Names]] has recognised the name ''Deneb'' for this star, and it is entered in their Catalog of Star Names.&lt;ref name=&quot;IAU-CSN&quot;&gt;{{cite web |url=http://www.pas.rochester.edu/~emamajek/WGSN/IAU-CSN.txt |title=IAU Catalog of Star Names |website=University of Rochester |access-date=28 July 2016}}&lt;/ref&gt;<br /> <br /> ''Denebadigege'' was used in the ''[[Alfonsine Tables]]'',&lt;ref name=Kunitzsch86&gt;<br /> {{cite journal<br /> |last=Kunitzsch |first=Paul<br /> |date=1986<br /> |title=The Star Catalogue Commonly Appended to the Alfonsine Tables<br /> |journal=[[Journal for the History of Astronomy]]<br /> |volume=17 |issue=49 |pages=89–98<br /> |bibcode=1986JHA....17...89K<br /> |doi=10.1177/002182868601700202<br /> |s2cid=118597258<br /> }}&lt;/ref&gt; other variants include ''Deneb Adige'', ''Denebedigege'' and ''Arided''. This latter name was derived from ''Al Ridhādh'', a name for the constellation. [[Johann Bayer]] called it ''Arrioph'', derived from ''Aridf'' and ''Al Ridf'', 'the hindmost' or ''Gallina''.
German poet and author [[Philipp von Zesen|Philippus Caesius]] termed it ''Os rosae'', or ''Rosemund'' in German, or ''Uropygium'' – the parson's nose.&lt;ref name=allen&gt;<br /> {{cite book<br /> |last=Allen<br /> |first=Richard Hinckley<br /> |date=1963<br /> |title=Star Names: Their Lore and Meaning<br /> |page=[https://archive.org/details/starnamestheirlo00alle/page/195 195]<br /> |edition=Reprint<br /> |publisher=[[Dover Publications]]<br /> |isbn=978-0-486-21079-7<br /> |url-access=registration<br /> |url=https://archive.org/details/starnamestheirlo00alle/page/195<br /> }}&lt;/ref&gt; The names ''Arided'' and ''Aridif'' have fallen out of use.<br /> <br /> An older traditional name is '''Arided''' {{IPAc-en|'|ær|ɪ|d|E|d}}, from the Arabic ''ar-ridf'' 'the one sitting behind the rider' (or just 'the follower'), perhaps referring to the other major stars of Cygnus, which were called ''al-fawāris'' 'the riders'.&lt;ref name=Kunitzsch&gt;{{cite book<br /> |last1=Kunitzsch |first1=Paul<br /> |last2=Smart |first2=Tim<br /> |date = 2006 |edition = 2nd rev.<br /> |title = A Dictionary of Modern star Names: A Short Guide to 254 Star Names and Their Derivations<br /> |publisher = Sky Pub |location = Cambridge, Massachusetts<br /> |isbn = 978-1-931559-44-7<br /> }}&lt;/ref&gt;<br /> <br /> ==Observation==<br /> [[File:Summer triangle.png|left|thumb|upright=1.2|The [[Summer Triangle]]]]<br /> The 19th [[List of brightest stars|brightest star]] in the night sky, Deneb [[culmination|culminates]] each year on October 23 at 6 PM and September 7 at 9 PM,&lt;ref name=&quot;south2015&quot;&gt;{{cite web |title=The Constellations : Part 3 Culmination Times|url=http://www.southastrodel.com/Page20502.htm|website=Southern Astronomical Delights|first=Andrew|last=James|date=2015-06-17 |access-date=2019-04-02}}&lt;/ref&gt; corresponding to [[summer]] evenings in the [[northern hemisphere]].&lt;ref name=&quot;summer&quot; /&gt; It never dips below the horizon at or above 45° north latitude, just grazing the northern horizon at its lowest point at such locations as [[Minneapolis]], [[Montreal|Montréal]] and [[Turin]]. In the [[Southern Hemisphere|southern hemisphere]], Deneb is not visible south of [[45th parallel south|45° parallel south]], so it just barely rises above the horizon in [[South Africa]], southern [[Australia]], and northern [[New Zealand]] during the southern winter.<br /> <br /> Deneb is located at the tip of the [[Northern Cross (asterism)|Northern Cross]] asterism made up of the brightest stars in Cygnus, the others being [[Albireo]] (Beta Cygni), [[Gamma Cygni]], [[Delta Cygni]], and [[Epsilon Cygni]].&lt;ref name=&quot;summer&quot;&gt;{{cite journal|bibcode=1937ASPL....3...23S|title=Stars of the Summer Sky|journal=Astronomical Society of the Pacific Leaflets|volume=3|issue=102|pages=23|last1=Smith|first1=C. E.|year=1937}}&lt;/ref&gt; It also lies at one [[Vertex (geometry)|vertex]] of the prominent and widely spaced [[Asterism (astronomy)|asterism]] called the [[Summer Triangle]], shared with the first-[[apparent magnitude|magnitude]] stars [[Vega]] in the constellation [[Lyra]] and [[Altair]] in [[Aquila (constellation)|Aquila]].&lt;ref name=&quot;pasachoff2000&quot;&gt;<br /> {{Cite book<br /> |last1=Pasachoff |first1=J. M.<br /> |date=2000<br /> |title=A Field Guide to Stars and Planets<br /> |edition=4th<br /> |publisher=[[Houghton Mifflin]]<br /> |isbn=978-0-395-93431-9<br /> }}&lt;/ref&gt;&lt;ref name=upgren1998&gt;<br /> {{Cite book<br /> |last=Upgren |first=A. 
R.<br /> |date=1998<br /> |title=Night Has a Thousand Eyes: A Naked-Eye Guide to the Sky, Its Science, and Lore<br /> |publisher=[[Basic Books]]<br /> |isbn=978-0-306-45790-6<br /> }}&lt;/ref&gt; This outline of stars is the approximate shape of a [[right triangle]],&lt;!--Image on side shows this is self-evident--&gt; with Deneb located at one of the acute angles.<br /> <br /> The [[stellar spectrum|spectra]] of Alpha Cygni has been observed by astronomers since at least 1888, and by 1910 the variable [[radial velocity]] had become apparent. This led to the early suggestion by [[Edwin Brant Frost|E. B. Frost]] that this is a [[binary star]] system.&lt;ref name=Lee1910&gt;{{cite journal<br /> | title=Four stars having variable radial velocities<br /> | last=Lee | first=O. J.<br /> | journal=Astrophysical Journal<br /> | volume=31 | pages=176–179 | date=March 1910<br /> | doi=10.1086/141741 | bibcode=1910ApJ....31..176L<br /> }}&lt;/ref&gt; In 1935, the work of [[George Frederic Paddock|G. F. Paddock]] and others had established that this star was [[variable star|variable]] in luminosity with a dominant period of 11.7&amp;nbsp;days and possibly with other, lower amplitude periods.&lt;ref name=Abt1957&gt;{{cite journal<br /> | title=The Variability of Supergiants<br /> | last=Abt | first=Helmut A.<br /> | journal=Astrophysical Journal<br /> | volume=126 | page=138 | date=July 1957<br /> | doi=10.1086/146379 | bibcode=1957ApJ...126..138A<br /> }}&lt;/ref&gt; By 1954, closer examination of the star's [[Calcium K line|calcium H and K lines]] showed a stationary core, which indicated the variable velocity was instead being caused by motion of the [[Stellar atmosphere|star's atmosphere]]. This variation ranged from +6 to −9&amp;nbsp;km/s around the star's mean radial velocity.&lt;ref&gt;{{cite journal<br /> | title=The Stationary Calcium Lines of Alpha Cygni<br /> | last1=Struve | first1=Otto | last2=Huang | first2=S. S.<br /> | journal=Publications of the Astronomical Society of the Pacific<br /> | volume=66 | issue=392 | page=251 | date=October 1954<br /> | doi=10.1086/126710 | bibcode=1954PASP...66..251S<br /> | s2cid=121714858 | doi-access=free }}&lt;/ref&gt; Other, similar supergiants were found to have variable velocities, with this star being a typical member.&lt;ref name=Abt1957/&gt;<br /> <br /> ===Pole star===<br /> Due to the [[Earth|Earth's]] [[axial precession]], Deneb will be an approximate [[pole star]] (7° off of the north celestial pole) at around [[10th millennium#Astronomical events|9800 AD]].&lt;ref&gt;{{cite web |title=Deneb |url=http://stars.astro.illinois.edu/sow/deneb.html |website=[[University of Illinois]] |first=James B. |last=Kaler |date=1998-06-19 |access-date=2018-04-25}}&lt;/ref&gt; The north pole of [[Mars]] points to the midpoint of the line connecting Deneb and the star [[Alderamin]].&lt;ref name=&quot;Barlow&quot;&gt;<br /> {{cite book |last=Barlow |first=N. G. |url=https://archive.org/details/marsintroduction00barl_258 |title=Mars: An introduction to its interior, surface and atmosphere |date=2008 |publisher=[[Cambridge University Press]] |isbn=978-0-521-85226-5 |page=[https://archive.org/details/marsintroduction00barl_258/page/n30 21] |url-access=limited}}&lt;/ref&gt;<br /> {| class=&quot;wikitable&quot; style=&quot;margin: 1em auto 1em auto;&quot;<br /> ! width=&quot;120&quot; align=&quot;center&quot;|Preceded by<br /> ! width=&quot;160&quot; align=&quot;center&quot;|[[Pole Star]]<br /> ! 
width=&quot;120&quot; align=&quot;center&quot;|Succeeded by<br /> |-<br /> |align=&quot;center&quot;|'''[[Alderamin]]'''<br /> |align=&quot;center&quot;|8700 AD to 11000 AD<br /> |align=&quot;center&quot;|'''[[Delta Cygni]]'''<br /> |}<br /> <br /> ==Physical characteristics==<br /> Deneb's adopted distance from the Earth is around {{convert|802|pc|ly}}.&lt;ref name=schiller/&gt; This is derived by a variety of different methods, including spectral luminosity classes, atmospheric modelling, stellar evolution models, assumed membership of the [[Cygnus OB7]] association, and direct measurement of angular diameter. These methods give different distances, and all have significant margins of error. The original derivation of a [[parallax]] using measurements from the astrometric satellite [[Hipparcos]] gave an uncertain result of 1.01 ± 0.57 mas&lt;ref name=aaa323_L49&gt;{{Cite journal |last1=Perryman |first1=M. A. C. |last2=Lindegren |first2=L. |year=1997 |title=The Hipparcos Catalogue |journal=[[Astronomy and Astrophysics]] |volume=323 |pages=L49–L52 |bibcode=1997A&amp;A...323L..49P |last3=Kovalevsky |first3=J. |last4=Hoeg |first4=E. |last5=Bastian |first5=U. |last6=Bernacca |first6=P. L. |last7=Crézé |first7=M. |last8=Donati |first8=F. |last9=Grenon |first9=M. |last10=Grewing |first10=M. |last11=Van Leeuwen |first11=F. |last12=Van Der Marel |first12=H. |last13=Mignard |first13=F. |last14=Murray |first14=C. A. |last15=Le Poole |first15=R. S. |last16=Schrijver |first16=H. |last17=Turon |first17=C. |last18=Arenou |first18=F. |last19=Froeschlé |first19=M. |last20=Petersen |first20=C. S. }}&lt;/ref&gt;&lt;ref name=GSM&gt;{{Cite book |last=Perryman |first=M. |date=2010 |title=The Making of History's Greatest Star Map |publisher=[[Springer-Verlag]] |doi=10.1007/978-3-642-11602-5 |isbn=978-3-642-11601-8 |series=Astronomers' Universe |url=https://cds.cern.ch/record/1338896 |type=Submitted manuscript |bibcode=2010mhgs.book.....P }}&lt;/ref&gt; that was consistent with this distance. However, a more recent reanalysis gives the much larger parallax whose distance is barely half the current accepted value.&lt;ref name=hipparcos/&gt; One 2008 calculation using the Hipparcos data puts the most likely distance at {{convert|475|pc|ly}}, with an uncertainty of around 15%.&lt;ref name=maiz&gt;{{cite arXiv |title=Accurate distances to nearby massive stars with the new reduction of the Hipparcos raw data |pages=2553 |last1=Maíz Apellániz |first1=J. |last2=Alfaro |first2=E. J. |last3=Sota |first3=A. |year=2008 |class=astro-ph |eprint=0804.2553}}&lt;/ref&gt; The controversy over whether the direct Hipparcos measurements can be ignored in favour of a wide range of indirect stellar models and interstellar distance scales is similar to the better known situation with the [[Pleiades]].&lt;ref name=hipparcos&gt;{{cite journal |arxiv=0708.1752 |bibcode=2007A&amp;A...474..653V |doi=10.1051/0004-6361:20078357 |title=Validation of the new Hipparcos reduction |journal=Astronomy and Astrophysics |volume=474 |issue=2 |pages=653–664 |year=2007 |last1=Van Leeuwen |first1=F. |s2cid=18759600}}&lt;/ref&gt;<br /> <br /> Deneb's [[absolute magnitude]] is estimated as &amp;minus;8.4, placing it among the visually brightest stars known, with an estimated luminosity nearly {{solar luminosity|200,000|link=y}}. 
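The relationship between the parallax, the adopted distance and the absolute magnitude quoted above can be made explicit with a short calculation. The minimal Python sketch below is illustrative only: it uses the rounded figures quoted above and ignores interstellar extinction, which is why its result comes out slightly fainter than the published &amp;minus;8.4.<br /> <br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;<br /> import math<br /> <br /> def distance_pc(parallax_mas):<br />     # Distance in parsecs is the reciprocal of the parallax in arcseconds.<br />     return 1000.0 / parallax_mas<br /> <br /> def absolute_magnitude(apparent_mag, dist_pc):<br />     # Distance modulus: M = m - 5 * log10(d / 10 pc); extinction is ignored.<br />     return apparent_mag - 5.0 * math.log10(dist_pc / 10.0)<br /> <br /> print(distance_pc(1.01))                # ~990 pc from the original Hipparcos parallax<br /> print(distance_pc(2.29))                # ~437 pc from the revised reduction<br /> print(absolute_magnitude(1.25, 802.0))  # ~-8.3 at the adopted 802 pc distance<br /> &lt;/syntaxhighlight&gt;<br /> <br /> 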
This luminosity estimate is towards the upper end of values published over the past few decades, which vary between {{solar luminosity|55,000}} and {{solar luminosity|196,000}}.&lt;ref name=chesneau&gt;{{cite journal |bibcode=2010A&amp;A...521A...5C |title=Time, spatial, and spectral resolution of the Hα line-formation region of Deneb and Rigel with the VEGA/CHARA interferometer |journal=Astronomy and Astrophysics |volume=521 |pages=A5 |last1=Chesneau |first1=O. |last2=Dessart |first2=L. |last3=Mourard |first3=D. |last4=Bério |first4=Ph. |last5=Buil |first5=Ch. |last6=Bonneau |first6=D. |last7=Borges Fernandes |first7=M. |last8=Clausse |first8=J. M. |last9=Delaa |first9=O. |last10=Marcotto |first10=A. |last11=Meilland |first11=A. |last12=Millour |first12=F. |last13=Nardetto |first13=N. |last14=Perraut |first14=K. |last15=Roussel |first15=A. |last16=Spang |first16=A. |last17=Stee |first17=P. |last18=Tallon-Bosc |first18=I. |last19=McAlister |first19=H. |last20=Ten Brummelaar |first20=T. |last21=Sturmann |first21=J. |last22=Sturmann |first22=L. |last23=Turner |first23=N. |last24=Farrington |first24=C. |last25=Goldfinger |first25=P. J. |year=2010 |doi=10.1051/0004-6361/201014509 |arxiv=1007.2095 |s2cid=10340205 |url=https://hal.archives-ouvertes.fr/hal-00501515}}&lt;/ref&gt;&lt;ref&gt;<br /> {{Cite journal <br /> |last1=van de Kamp |first1=P.<br /> |date=1953<br /> |title=The Twenty Brightest Stars<br /> |journal=[[Publications of the Astronomical Society of the Pacific]]<br /> |volume=65 |issue=382<br /> |pages=30<br /> |bibcode=1953PASP...65...30V<br /> |doi=10.1086/126523 <br /> |doi-access=free<br /> }}&lt;/ref&gt;&lt;ref&gt;<br /> {{Cite journal<br /> |last1=Lamers |first1=H. J. G. L. M.<br /> |last2=Stalio |first2=R.<br /> |last3=Kondo |first3=Y.<br /> |date=1978<br /> |title=A study of mass loss from the mid-ultraviolet spectrum of α Cygni (A2 Ia), β Orionis (B8 Ia), and η Leonis (A0 Ib)<br /> |journal=[[The Astrophysical Journal]]<br /> |volume=223 |pages=207<br /> |bibcode=1978ApJ...223..207L<br /> |doi=10.1086/156252<br /> }}&lt;/ref&gt;<br /> <br /> Deneb is the most luminous of the first-magnitude stars, that is, those with an apparent magnitude brighter than 1.5. Deneb is also the most distant of the 30 [[List of brightest stars|brightest stars]] by a factor of almost 2.&lt;ref&gt;{{cite web<br /> | title=The 172 Brightest Stars | work=STARS<br /> | first=James B. | last=Kaler | date=2017<br /> | url=http://stars.astro.illinois.edu/sow/bright.html | access-date=2021-09-17<br /> }}&lt;/ref&gt; Based on its temperature and luminosity, and also on direct measurements of its tiny [[angular diameter]] (a mere 0.002 seconds of arc), Deneb appears to have a diameter of about 200 times [[Solar radius|that of the Sun]];&lt;ref name=chesneau/&gt; if placed at the center of the [[Solar System]], Deneb would extend out to the [[Earth's orbit|orbit of the Earth]]. It is one of the [[List of largest stars|largest white 'A' spectral type stars known]].<br /> <br /> Deneb is a bluish-white star of [[stellar classification|spectral type]] A2Ia, with a surface temperature of 8,500 [[kelvin]]. Since 1943, its [[stellar spectrum|spectrum]] has served as one of the stable references by which other stars are classified.&lt;ref name=baas25_1319/&gt; Its mass is estimated at 19 {{Solar mass|link=y}}.
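The quoted radius can likewise be recovered from the angular diameter and the adopted distance, and cross-checked against the luminosity with the Stefan–Boltzmann law. The minimal Python sketch below is again illustrative only; the 2.3-milliarcsecond angular diameter it assumes is an example value consistent with the roughly 0.002 arcseconds quoted above, not the precise published measurement.<br /> <br /> &lt;syntaxhighlight lang=&quot;python&quot;&gt;<br /> ARCSEC_PER_RADIAN = 206265.0<br /> METRES_PER_PARSEC = 3.086e16<br /> SOLAR_RADIUS_M = 6.957e8<br /> ASTRONOMICAL_UNIT_M = 1.496e11<br /> SOLAR_TEMPERATURE_K = 5772.0<br /> <br /> def radius_from_angular_diameter(theta_arcsec, dist_pc):<br />     # Physical diameter = angular diameter (in radians) times distance.<br />     diameter_m = (theta_arcsec / ARCSEC_PER_RADIAN) * dist_pc * METRES_PER_PARSEC<br />     return diameter_m / 2.0<br /> <br /> radius_m = radius_from_angular_diameter(0.0023, 802.0)<br /> radius_solar = radius_m / SOLAR_RADIUS_M<br /> print(radius_solar)                    # roughly 200 solar radii<br /> print(radius_m / ASTRONOMICAL_UNIT_M)  # roughly 0.9 au, i.e. close to the Earth's orbit<br /> <br /> # Stefan-Boltzmann check: L / Lsun = (R / Rsun)**2 * (T / Tsun)**4<br /> print(radius_solar**2 * (8500.0 / SOLAR_TEMPERATURE_K)**4)  # about 185,000 Lsun, within the published range<br /> &lt;/syntaxhighlight&gt;<br /> <br /> 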
Deneb's [[stellar wind]] causes matter to be lost at an average rate of {{Solar mass|8±3{{e|-7}}}} per year, 100,000 times the Sun's rate of mass loss or equivalent to about one [[Earth mass]] per 500 years.&lt;ref&gt;{{cite journal |bibcode=2002ApJ...570..344A |title=The Spectral Energy Distribution and Mass-Loss Rate of the A-Type Supergiant Deneb |journal=The Astrophysical Journal |volume=570 |issue=1 |pages=344 |last1=Aufdenberg |first1=J. P. |last2=Hauschildt |first2=P. H. |last3=Baron |first3=E. |last4=Nordgren |first4=T. E. |last5=Burnley |first5=A. W. |last6=Howarth |first6=I. D. |last7=Gordon |first7=K. D. |last8=Stansberry |first8=J. A. |year=2002 |doi=10.1086/339740 |arxiv=astro-ph/0201218 |s2cid=13260314}}&lt;/ref&gt;<br /> <br /> ===Evolutionary state===<br /> Deneb spent much of its early life as an [[O-type main-sequence star]] of about {{solar mass|23}}, but it has now exhausted the [[hydrogen]] in its core and expanded to become a supergiant.&lt;ref name=schiller/&gt;&lt;ref name=georgy/&gt; Stars in the mass range of Deneb eventually expand to become the most luminous [[red supergiants]], and within a few million years their cores will collapse producing a [[supernova]] explosion. It is now known that red supergiants up to a certain mass explode as the commonly seen [[type II supernova|type II-P supernova]]e, but more massive ones lose their outer layers to become hotter again. Depending on their initial masses and the rate of mass loss, they may explode as [[yellow hypergiant]]s or [[luminous blue variable]]s, or they may become [[Wolf-Rayet star]]s before exploding in a [[Type Ib and Ic supernovae|type Ib or Ic supernova]]. Identifying whether Deneb is currently evolving towards a red supergiant or is currently evolving bluewards again would place valuable constraints on the classes of stars that explode as red supergiants and those that explode as hotter stars.&lt;ref name=georgy&gt;{{cite journal |bibcode=2014MNRAS.439L...6G |title=The puzzle of the CNO abundances of α Cygni variables resolved by the Ledoux criterion |journal=Monthly Notices of the Royal Astronomical Society: Letters |volume=439 |issue=1 |pages=L6–L10 |last1=Georgy |first1=Cyril |last2=Saio |first2=Hideyuki |last3=Meynet |first3=Georges |year=2014 |doi=10.1093/mnrasl/slt165 |arxiv=1311.4744 |s2cid=118557550}}&lt;/ref&gt;<br /> <br /> Stars evolving red-wards for the first time are most likely fusing hydrogen in a shell around a [[helium]] core that has not yet grown hot enough to start fusion to [[carbon]] and [[oxygen]]. Convection has begun [[Stellar evolution#Mature stars#Mid-sized stars#Red-giant-branch phase|dredging]] up fusion products but these do not reach the surface. Post-red supergiant stars are expected to show those fusion products at the surface due to stronger convection during the red supergiant phase and due to loss of the obscuring outer layers of the star.
Deneb is thought to be increasing its temperature after a period as a red supergiant, although current models do not exactly reproduce the surface elements showing in its spectrum.&lt;ref name=georgy/&gt;<br /> <br /> ===Variable star===<br /> [[File:AlphaCygLightCurve.png|thumb|left|A [[Photometric_system#Photometric_letters|visual band]] [[light curve]] for Deneb, adapted from Yüce and Adelman (2019)&lt;ref name=&quot;Yuca2019&quot;/&gt;]]<br /> Deneb is the prototype of the [[Alpha Cygni variable|Alpha Cygni]] (α Cygni) [[variable star]]s,&lt;ref name=&quot;Richardson2011&quot; /&gt;&lt;ref name=&quot;Yuca2019&quot;&gt;{{cite journal<br /> |last1=Yüce|first1=K.<br /> |last2=Adelman |first2=S. J.<br /> |title=On the variability of the A0 supergiants 9 Per, HR 1035, 13 Mon, Deneb, and HR 8020 as seen in FCAPT Strömgren photometry<br /> |date=2019<br /> |journal=New Astronomy<br /> |volume=66<br /> |pages=88–99<br /> |doi=10.1016/j.newast.2018.07.002<br /> |bibcode = 2019NewA...66...88Y|s2cid=126285732<br /> }}&lt;/ref&gt; whose small irregular amplitudes and rapid pulsations can cause its magnitude to vary anywhere between 1.21 and 1.29.&lt;ref name=&quot;gscvquery&quot;&gt;{{Cite web |url=http://www.sai.msu.su/gcvs/cgi-bin/search.cgi?search=alf+Cyg|title=GCVS Query forms|website=Sternberg Astronomical Institute|access-date=2019-01-07<br /> }}&lt;/ref&gt; Its variable velocity was discovered by Lee in 1910,&lt;ref name=Lee1910/&gt; but it was not formally placed in a unique class of variable stars until the 1985 4th edition of the General Catalogue of Variable Stars.&lt;ref name=&quot;GCVS4&quot;&gt;{{cite journal |bibcode=1996yCat.2139....0K |title=VizieR Online Data Catalog: General Catalog of Variable Stars, 4th Ed. (GCVS4) (/gcvs4Kholopov+ 1988) |journal=VizieR On-Line Data Catalog: II/139B. Originally Published in: Moscow: Nauka Publishing House (1985-1988) |volume=2139 |pages=0 |last1=Kholopov |first1=P. N. |last2=Samus' |first2=N. N. |last3=Frolov |first3=M. S. |last4=Goranskij |first4=V. P. |last5=Gorynya |first5=N. A. |last6=Kireeva |first6=N. N. |last7=Kukarkina |first7=N. P. |last8=Kurochkin |first8=N. E. |last9=Medvedeva |first9=G. I. |last10=Perova |first10=N. B.|date=1996}}&lt;/ref&gt; The cause of the pulsations of Alpha Cygni variable stars is not fully understood, but their [[irregular variable|irregular nature]] seems to be due to [[Beat (acoustics)|beat]]ing of multiple pulsation periods. Analysis of radial velocities determined 16 different harmonic pulsation modes with periods ranging between 6.9 and 100.8 days.&lt;ref name=&quot;Lucy1976&quot;/&gt; A longer period of about 800 days probably also exists.&lt;ref name=&quot;Yuca2019&quot; /&gt;<br /> <br /> ===Possible spectroscopic companion===<br /> Deneb has been reported as a possible single line spectroscopic [[Binary star|binary]] with a period of about 850 days, where the spectral lines from the star suggest cyclical radial velocity changes.&lt;ref name=&quot;Lucy1976&quot;&gt;{{cite journal|bibcode=1976ApJ...206..499L|title=An analysis of the variable radial velocity of alpha Cygni|journal=Astrophysical Journal|volume=206|pages=499|last1=Lucy|first1=L.
B.|year=1976|doi=10.1086/154405|doi-access=free}}&lt;/ref&gt; Later investigations have found no evidence supporting the existence of a companion.&lt;ref name=&quot;Richardson2011&quot;&gt;{{cite journal |bibcode=2011AJ....141...17R |title=A Five-year Spectroscopic and Photometric Campaign on the Prototypical α Cygni Variable and A-type Supergiant Star Deneb |journal=The Astronomical Journal |volume=141 |issue=1 |pages=17 |last1=Richardson |first1=N. D. |last2=Morrison |first2=N. D. |last3=Kryukova |first3=E. E. |last4=Adelman |first4=S. J. |year=2011 |doi=10.1088/0004-6256/141/1/17 |arxiv=1009.5994 |s2cid=118300333}}&lt;/ref&gt;<br /> <br /> ==Etymology and cultural significance==<br /> [[File:Wide-field view of the Summer Triangle.jpg|thumb|upright=1.2|Wide-field view of the [[Summer Triangle]] and the [[Milky Way]]. Deneb is at the left-centre of the picture.&lt;!-- Not sure if you can find Deneb unless you have experience. --&gt;]]Names similar to Deneb have been given to at least seven different stars, most notably [[Beta Ceti|Deneb Kaitos]], the brightest star in the constellation of [[Cetus]]; [[Delta Capricorni|Deneb Algedi]], the brightest star in [[Capricornus]]; and [[Denebola]], the second brightest star in [[Leo (constellation)|Leo]]. All these names refer to the tail of the animal that each respective constellation represents.<br /> <br /> In Chinese, {{lang|zh|天津}} ({{lang|zh-Latn|Tiān Jīn}}), meaning ''[[Girl (Chinese constellation)|Celestial Ford]]'', refers to an asterism consisting of Deneb, [[Gamma Cygni]], [[Delta Cygni]], [[30 Cygni]], [[Nu Cygni]], [[Tau Cygni]], [[Upsilon Cygni]], [[Zeta Cygni]] and [[Epsilon Cygni]].&lt;ref&gt;{{cite book |author=陳久金|title=中國星座神話|url=https://books.google.com/books?id=0Vex0rYzdu8C|year=2005|publisher=五南圖書出版股份有限公司|isbn=978-986-7332-25-7}}&lt;/ref&gt; Consequently, the [[Chinese star names|Chinese name]] for Deneb itself is {{lang|zh|天津四}} ({{lang|zh-Latn|Tiān Jīn sì}}, {{lang-en|the Fourth Star of the Celestial Ford}}).&lt;ref&gt;{{cite web|language=zh |url=http://www.lcsd.gov.hk/CE/Museum/Space/Research/StarName/c_research_chinengstars_c_d.htm |title=香港太空館 - 研究資源 - 亮星中英對照表] |archive-url=https://web.archive.org/web/20081025110153/http://www.lcsd.gov.hk/CE/Museum/Space/Research/StarName/c_research_chinengstars_c_d.htm |archive-date=2008-10-25 | access-date=2019-01-09 | website=Hong Kong Space Museum}}&lt;/ref&gt;<br /> <br /> In the Chinese love story of [[Qi Xi]], Deneb marks the [[magpie]] bridge across the [[Milky Way]], which allows the separated lovers Niu Lang ([[Altair]]) and Zhi Nü ([[Vega]]) to be reunited on one special night of the year in late summer. In other versions of the story, Deneb is a fairy who acts as chaperone when the lovers meet.<br /> <br /> ===Namesakes===<br /> [[USS Arided (AK-73)|USS ''Arided'']] was a [[United States Navy]] [[Crater class cargo ship|''Crater''-class cargo ship]] named after the star. [[SS Deneb|SS ''Deneb'']] was an Italian merchant vessel that bore this name from 1951 until she was scrapped in 1966.<br /> <br /> ===In fiction===<br /> {{main|Deneb in fiction}}<br /> The star Deneb, and hypothetical planets orbiting it, have been used many times in [[literature]], [[film]], [[electronic game]]s, and [[music]].
Examples include several episodes of the ''[[Star Trek]]'' [[TV series]], the ''[[Silver Surfer]]'' comic book, the [[Rush (band)|Rush]] [[album]]s ''[[A Farewell to Kings]]'' and ''[[Hemispheres (Rush album)|Hemispheres]]'', the ''[[Descent: FreeSpace – The Great War]]'' [[computer game]], ''[[Stellaris (video game)|Stellaris]]'', and the [[science fiction]] [[novel]] ''[[Hyperion (Simmons novel)|Hyperion]]''.<br /> <br /> ==See also==<br /> * [[List of bright stars]]<br /> <br /> ==References==<br /> {{Reflist|30em|refs=}}<br /> <br /> {{Sky|20|41|25.9|+|45|16|49|1400}}<br /> {{Stars of Cygnus}}<br /> {{Portal bar|Astronomy|Stars|Outer space}}<br /> &lt;!-- Properties --&gt;<br /> <br /> [[Category:A-type supergiants]]<br /> [[Category:Alpha Cygni variables]]<br /> [[Category:Emission-line stars]]<br /> &lt;!-- Other --&gt;<br /> [[Category:Northern pole stars]]<br /> [[Category:Cygnus (constellation)]]<br /> [[Category:Bayer objects|Cygni, Alpha]]<br /> [[Category:Durchmusterung objects|BD+44 3541]]<br /> [[Category:Flamsteed objects|Cygni, 50]]<br /> [[Category:Henry Draper Catalogue objects|197345]]<br /> [[Category:Hipparcos objects|102098]]<br /> [[Category:Bright Star Catalogue objects|7924]]<br /> [[Category:Arabic words and phrases]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Deneb&diff=1170906860 Deneb 2023-08-17T22:37:17Z <p>205.189.94.9: </p> <hr /> <div>{{short description|Star in the constellation Cygnus}}<br /> {{About|the star}}<br /> {{Starbox begin}}<br /> {{Starbox image<br /> | image=<br /> {{Location mark<br /> | image=Cygnus constellation map.svg<br /> | float=center | width=250 | position=right<br /> | mark=Red circle.svg | mark_width=10 | mark_link=Deneb (star)<br /> | x%=32.5 | y%=36.9<br /> }}<br /> | caption=Location of Deneb (circled)<br /> }}<br /> {{Starbox observe<br /> | epoch=J2000<br /> | constell=[[Cygnus (constellation)|Cygnus]]<br /> | pronounce={{IPAc-en|'|d|ɛ|n|ɛ|b|}}, {{IPAc-en|'|d|ɛ|n|ə|b|}}&lt;ref name=merriam&gt;{{cite book|author=Merriam-Webster, Inc|title=Merriam-Webster's Collegiate Dictionary|url=https://books.google.com/books?id=53-PQgAACAAJ|year=1998|publisher=Merriam-Webster|isbn=978-0-87779-714-2}}&lt;/ref&gt;<br /> | ra={{RA|20|41|25.9}}&lt;ref name=hipparcos/&gt;|dec={{DEC|+45|16|49}}&lt;ref name=hipparcos/&gt;<br /> | appmag_v=1.25&lt;ref name=ducati&gt;{{cite journal|bibcode=2002yCat.2237....0D|title=VizieR On-Line Data Catalog: Catalogue of Stellar Photometry in Johnson's 11-color system|journal=CDS/ADC Collection of Electronic Catalogues|volume=2237|pages=0|last1=Ducati|first1=J. R.|year=2002}}&lt;/ref&gt; {{nowrap|(1.21–1.29&lt;ref name=gcvs&gt;{{cite journal|bibcode=2009yCat....102025S|title=VizieR Online Data Catalog: General Catalogue of Variable Stars (Samus+ 2007–2013)|journal=VizieR On-Line Data Catalog: B/GCVS. Originally Published in: 2009yCat....102025S|volume=1|pages=02025|last1=Samus|first1=N. N.|last2=Durlevich|first2=O. V.|year=2009|display-authors=etal}}&lt;/ref&gt;)}}<br /> }}<br /> {{Starbox character<br /> | class=A2&amp;nbsp;Ia&lt;ref name=baas25_1319&gt;{{Cite journal |last1=Garrison |first1=R. F. 
|title=Anchor Points for the MK System of Spectral Classification |journal=[[Bulletin of the American Astronomical Society]] |volume=25 |page=1319 |year=1993 |bibcode=1993AAS...183.1710G |url=http://www.astro.utoronto.ca/~garrison/mkstds.html |access-date=2012-02-04 |archive-date=2019-06-25 |archive-url=https://web.archive.org/web/20190625094716/http://www.astro.utoronto.ca/~garrison/mkstds.html |url-status=dead }}&lt;/ref&gt;<br /> | b-v=+0.09&lt;ref name=ducati/&gt;<br /> | u-b=&amp;minus;0.23&lt;ref name=ducati/&gt;<br /> | variable=[[Alpha Cygni variable|Alpha Cygni]]&lt;ref name=gcvs/&gt;<br /> }}<br /> {{Starbox astrometry<br /> | radial_v=&amp;minus;4.5&lt;ref name=pulkovo&gt;{{cite journal|bibcode=2006AstL...32..759G|title=Pulkovo Compilation of Radial Velocities for 35 495 Hipparcos stars in a common system|journal=Astronomy Letters|volume=32|issue=11|pages=759–771|last1=Gontcharov|first1=G. A.|year=2006|doi=10.1134/S1063773706110065|arxiv = 1606.08053 |s2cid=119231169}}&lt;/ref&gt;<br /> | prop_mo_ra=1.99&lt;ref name=hipparcos/&gt;<br /> | prop_mo_dec=1.95&lt;ref name=hipparcos/&gt;<br /> | parallax=2.29<br /> | p_error=0.32<br /> | parallax_footnote=&lt;ref name=hipparcos/&gt;<br /> | dist_ly={{val|2,615|215|fmt=commas}}<br /> | dist_pc={{val|802|66}}&lt;ref name=schiller/&gt;<br /> | absmag_v=&amp;minus;8.38&lt;ref name=schiller/&gt;<br /> }}<br /> {{Starbox detail<br /> |source=&lt;ref name=schiller&gt;{{cite journal<br /> | last1=Schiller | first1=F.<br /> | last2=Przybilla | first2=N.<br /> | date=2008<br /> | title=Quantitative spectroscopy of Deneb<br /> | journal=[[Astronomy &amp; Astrophysics]]<br /> | volume=479 | issue=3 | pages=849–858<br /> | arxiv=0712.0040<br /> | bibcode=2008A&amp;A...479..849S<br /> | doi=10.1051/0004-6361:20078590<br /> | s2cid=119225384<br /> }}&lt;/ref&gt;<br /> | mass = {{Val|19|4}}<br /> | radius = {{Val|203|17}}<br /> | luminosity = {{Val|196000|32000|fmt=commas}}<br /> | temperature = {{Val|8525|75|fmt=commas}}<br /> | rotational_velocity = {{Val|20|2}}<br /> | gravity = {{Val|1.10|0.05}}<br /> | metal_fe = −0.25<br /> | age_myr=<br /> }}<br /> {{Starbox catalog<br /> | names={{odlist | name=Arided | name2=Aridif | name3=Gallina | name4=Arrioph | B=α Cygni | F=50 Cygni | BD=+44°3541 | FK5=777 | HD=197345 | HIP=102098 | HR=7924 | SAO=49941 }}<br /> }}<br /> {{Starbox reference<br /> | Simbad=Deneb<br /> }}<br /> {{Starbox end}}<br /> <br /> '''Deneb''' ({{IPAc-en|ˈ|d|ɛ|n|ɛ|b}}) is a [[first-magnitude star]] in the [[constellation]] of [[Cygnus (constellation)|Cygnus]]. Deneb is one of the vertices of the [[Asterism (astronomy)|asterism]] known as the [[Summer Triangle]] and the &quot;head&quot; of the [[Northern Cross (asterism)|Northern Cross]]. It is the brightest [[star]] in Cygnus and the 19th [[List of brightest stars|brightest star]] in the [[night sky]], with an average [[apparent magnitude]] of +1.25. A blue-white [[supergiant]], Deneb rivals [[Rigel]] as the most luminous [[first-magnitude star]]. However, its distance, and hence luminosity, is poorly known; its luminosity is somewhere between 55,000 and 196,000 times [[Solar luminosity|that of the Sun]]. Its [[Bayer designation]] is '''α Cygni''', which is [[Latinisation of names|Latinised]] to '''Alpha Cygni''', abbreviated to '''Alpha Cyg''' or '''α Cyg'''. 
Multiple independent methods of observation confirm Deneb to be a highly luminous star.<br /> <br /> As methods of cataloguing and organising stellar data improve, the degree to which quantitative studies of this class of stars as a whole can be relied upon will also improve. A central goal of star catalogues, atlases and lists of benchmark objects is to establish a stable, verifiable database with as little bias and error as possible. Long-term observational records show Deneb to have been broadly stable, apart from the small pulsations described below, and its brightness and well-studied spectrum make it a convenient reference object for further work in astrophysics.<br /> <br /> ==Nomenclature==<br /> <br /> [[File:CygnusCC.jpg|thumb|left|upright|Deneb is the brightest star in the constellation of Cygnus (top)]]<br /> ''α Cygni'' (Latinised to ''Alpha Cygni'') is the star's [[Bayer designation|designation]] given by [[Johann Bayer]] in 1603. The traditional name ''Deneb'' is derived from the [[Arabic]] word for &quot;tail&quot;, from the phrase ذنب الدجاجة ''Dhanab al-Dajājah'', or &quot;tail of the hen&quot;.&lt;ref name=allen/&gt; The [[IAU Working Group on Star Names]] has recognised the name ''Deneb'' for this star, and it is entered in their Catalog of Star Names.&lt;ref name=&quot;IAU-CSN&quot;&gt;{{cite web |url=http://www.pas.rochester.edu/~emamajek/WGSN/IAU-CSN.txt |title=IAU Catalog of Star Names |website=University of Rochester |access-date=28 July 2016}}&lt;/ref&gt;<br /> <br /> ''Denebadigege'' was used in the ''[[Alfonsine Tables]]'',&lt;ref name=Kunitzsch86&gt;<br /> {{cite journal<br /> |last=Kunitzsch |first=Paul<br /> |date=1986<br /> |title=The Star Catalogue Commonly Appended to the Alfonsine Tables<br /> |journal=[[Journal for the History of Astronomy]]<br /> |volume=17 |issue=49 |pages=89–98<br /> |bibcode=1986JHA....17...89K<br /> |doi=10.1177/002182868601700202<br /> |s2cid=118597258<br /> }}&lt;/ref&gt; other variants include ''Deneb Adige'', ''Denebedigege'' and ''Arided''. This latter name was derived from ''Al Ridhādh'', a name for the constellation. [[Johann Bayer]] called it ''Arrioph'', derived from ''Aridf'' and ''Al Ridf'', 'the hindmost' or ''Gallina''.
German poet and author [[Philipp von Zesen|Philippus Caesius]] termed it ''Os rosae'', or ''Rosemund'' in German, or ''Uropygium'' – the parson's nose.&lt;ref name=allen&gt;<br /> {{cite book<br /> |last=Allen<br /> |first=Richard Hinckley<br /> |date=1963<br /> |title=Star Names: Their Lore and Meaning<br /> |page=[https://archive.org/details/starnamestheirlo00alle/page/195 195]<br /> |edition=Reprint<br /> |publisher=[[Dover Publications]]<br /> |isbn=978-0-486-21079-7<br /> |url-access=registration<br /> |url=https://archive.org/details/starnamestheirlo00alle/page/195<br /> }}&lt;/ref&gt; The names ''Arided'' and ''Aridif'' have fallen out of use.<br /> <br /> An older traditional name is '''Arided''' {{IPAc-en|'|ær|ɪ|d|E|d}}, from the Arabic ''ar-ridf'' 'the one sitting behind the rider' (or just 'the follower'), perhaps referring to the other major stars of Cygnus, which were called ''al-fawāris'' 'the riders'.&lt;ref name=Kunitzsch&gt;{{cite book<br /> |last1=Kunitzsch |first1=Paul<br /> |last2=Smart |first2=Tim<br /> |date = 2006 |edition = 2nd rev.<br /> |title = A Dictionary of Modern star Names: A Short Guide to 254 Star Names and Their Derivations<br /> |publisher = Sky Pub |location = Cambridge, Massachusetts<br /> |isbn = 978-1-931559-44-7<br /> }}&lt;/ref&gt;<br /> <br /> ==Observation==<br /> [[File:Summer triangle.png|left|thumb|upright=1.2|The [[Summer Triangle]]]]<br /> The 19th [[List of brightest stars|brightest star]] in the night sky, Deneb [[culmination|culminates]] each year on October 23 at 6 PM and September 7 at 9 PM,&lt;ref name=&quot;south2015&quot;&gt;{{cite web |title=The Constellations : Part 3 Culmination Times|url=http://www.southastrodel.com/Page20502.htm|website=Southern Astronomical Delights|first=Andrew|last=James|date=2015-06-17 |access-date=2019-04-02}}&lt;/ref&gt; corresponding to [[summer]] evenings in the [[northern hemisphere]].&lt;ref name=&quot;summer&quot; /&gt; It never dips below the horizon at or above 45° north latitude, just grazing the northern horizon at its lowest point at such locations as [[Minneapolis]], [[Montreal|Montréal]] and [[Turin]]. In the [[Southern Hemisphere|southern hemisphere]], Deneb is not visible south of [[45th parallel south|45° parallel south]], so it just barely rises above the horizon in [[South Africa]], southern [[Australia]], and northern [[New Zealand]] during the southern winter.<br /> <br /> Deneb is located at the tip of the [[Northern Cross (asterism)|Northern Cross]] asterism made up of the brightest stars in Cygnus, the others being [[Albireo]] (Beta Cygni), [[Gamma Cygni]], [[Delta Cygni]], and [[Epsilon Cygni]].&lt;ref name=&quot;summer&quot;&gt;{{cite journal|bibcode=1937ASPL....3...23S|title=Stars of the Summer Sky|journal=Astronomical Society of the Pacific Leaflets|volume=3|issue=102|pages=23|last1=Smith|first1=C. E.|year=1937}}&lt;/ref&gt; It also lies at one [[Vertex (geometry)|vertex]] of the prominent and widely spaced [[Asterism (astronomy)|asterism]] called the [[Summer Triangle]], shared with the first-[[apparent magnitude|magnitude]] stars [[Vega]] in the constellation [[Lyra]] and [[Altair]] in [[Aquila (constellation)|Aquila]].&lt;ref name=&quot;pasachoff2000&quot;&gt;<br /> {{Cite book<br /> |last1=Pasachoff |first1=J. M.<br /> |date=2000<br /> |title=A Field Guide to Stars and Planets<br /> |edition=4th<br /> |publisher=[[Houghton Mifflin]]<br /> |isbn=978-0-395-93431-9<br /> }}&lt;/ref&gt;&lt;ref name=upgren1998&gt;<br /> {{Cite book<br /> |last=Upgren |first=A. 
R.<br /> |date=1998<br /> |title=Night Has a Thousand Eyes: A Naked-Eye Guide to the Sky, Its Science, and Lore<br /> |publisher=[[Basic Books]]<br /> |isbn=978-0-306-45790-6<br /> }}&lt;/ref&gt; This outline of stars is the approximate shape of a [[right triangle]],&lt;!--Image on side shows this is self-evident--&gt; with Deneb located at one of the acute angles.<br /> <br /> The [[stellar spectrum|spectra]] of Alpha Cygni has been observed by astronomers since at least 1888, and by 1910 the variable [[radial velocity]] had become apparent. This led to the early suggestion by [[Edwin Brant Frost|E. B. Frost]] that this is a [[binary star]] system.&lt;ref name=Lee1910&gt;{{cite journal<br /> | title=Four stars having variable radial velocities<br /> | last=Lee | first=O. J.<br /> | journal=Astrophysical Journal<br /> | volume=31 | pages=176–179 | date=March 1910<br /> | doi=10.1086/141741 | bibcode=1910ApJ....31..176L<br /> }}&lt;/ref&gt; In 1935, the work of [[George Frederic Paddock|G. F. Paddock]] and others had established that this star was [[variable star|variable]] in luminosity with a dominant period of 11.7&amp;nbsp;days and possibly with other, lower amplitude periods.&lt;ref name=Abt1957&gt;{{cite journal<br /> | title=The Variability of Supergiants<br /> | last=Abt | first=Helmut A.<br /> | journal=Astrophysical Journal<br /> | volume=126 | page=138 | date=July 1957<br /> | doi=10.1086/146379 | bibcode=1957ApJ...126..138A<br /> }}&lt;/ref&gt; By 1954, closer examination of the star's [[Calcium K line|calcium H and K lines]] showed a stationary core, which indicated the variable velocity was instead being caused by motion of the [[Stellar atmosphere|star's atmosphere]]. This variation ranged from +6 to −9&amp;nbsp;km/s around the star's mean radial velocity.&lt;ref&gt;{{cite journal<br /> | title=The Stationary Calcium Lines of Alpha Cygni<br /> | last1=Struve | first1=Otto | last2=Huang | first2=S. S.<br /> | journal=Publications of the Astronomical Society of the Pacific<br /> | volume=66 | issue=392 | page=251 | date=October 1954<br /> | doi=10.1086/126710 | bibcode=1954PASP...66..251S<br /> | s2cid=121714858 | doi-access=free }}&lt;/ref&gt; Other, similar supergiants were found to have variable velocities, with this star being a typical member.&lt;ref name=Abt1957/&gt;<br /> <br /> ===Pole star===<br /> Due to the [[Earth|Earth's]] [[axial precession]], Deneb will be an approximate [[pole star]] (7° off of the north celestial pole) at around [[10th millennium#Astronomical events|9800 AD]].&lt;ref&gt;{{cite web |title=Deneb |url=http://stars.astro.illinois.edu/sow/deneb.html |website=[[University of Illinois]] |first=James B. |last=Kaler |date=1998-06-19 |access-date=2018-04-25}}&lt;/ref&gt; The north pole of [[Mars]] points to the midpoint of the line connecting Deneb and the star [[Alderamin]].&lt;ref name=&quot;Barlow&quot;&gt;<br /> {{cite book |last=Barlow |first=N. G. |url=https://archive.org/details/marsintroduction00barl_258 |title=Mars: An introduction to its interior, surface and atmosphere |date=2008 |publisher=[[Cambridge University Press]] |isbn=978-0-521-85226-5 |page=[https://archive.org/details/marsintroduction00barl_258/page/n30 21] |url-access=limited}}&lt;/ref&gt;<br /> {| class=&quot;wikitable&quot; style=&quot;margin: 1em auto 1em auto;&quot;<br /> ! width=&quot;120&quot; align=&quot;center&quot;|Preceded by<br /> ! width=&quot;160&quot; align=&quot;center&quot;|[[Pole Star]]<br /> ! 
width=&quot;120&quot; align=&quot;center&quot;|Succeeded by<br /> |-<br /> |align=&quot;center&quot;|'''[[Alderamin]]'''<br /> |align=&quot;center&quot;|8700 AD to 11000 AD<br /> |align=&quot;center&quot;|'''[[Delta Cygni]]'''<br /> |}<br /> <br /> ==Physical characteristics==<br /> Deneb's adopted distance from the Earth is around {{convert|802|pc|ly}}.&lt;ref name=schiller/&gt; This is derived by a variety of different methods, including spectral luminosity classes, atmospheric modelling, stellar evolution models, assumed membership of the [[Cygnus OB7]] association, and direct measurement of angular diameter. These methods give different distances, and all have significant margins of error. The original derivation of a [[parallax]] using measurements from the astrometric satellite [[Hipparcos]] gave an uncertain result of 1.01 ± 0.57 mas&lt;ref name=aaa323_L49&gt;{{Cite journal |last1=Perryman |first1=M. A. C. |last2=Lindegren |first2=L. |year=1997 |title=The Hipparcos Catalogue |journal=[[Astronomy and Astrophysics]] |volume=323 |pages=L49–L52 |bibcode=1997A&amp;A...323L..49P |last3=Kovalevsky |first3=J. |last4=Hoeg |first4=E. |last5=Bastian |first5=U. |last6=Bernacca |first6=P. L. |last7=Crézé |first7=M. |last8=Donati |first8=F. |last9=Grenon |first9=M. |last10=Grewing |first10=M. |last11=Van Leeuwen |first11=F. |last12=Van Der Marel |first12=H. |last13=Mignard |first13=F. |last14=Murray |first14=C. A. |last15=Le Poole |first15=R. S. |last16=Schrijver |first16=H. |last17=Turon |first17=C. |last18=Arenou |first18=F. |last19=Froeschlé |first19=M. |last20=Petersen |first20=C. S. }}&lt;/ref&gt;&lt;ref name=GSM&gt;{{Cite book |last=Perryman |first=M. |date=2010 |title=The Making of History's Greatest Star Map |publisher=[[Springer-Verlag]] |doi=10.1007/978-3-642-11602-5 |isbn=978-3-642-11601-8 |series=Astronomers' Universe |url=https://cds.cern.ch/record/1338896 |type=Submitted manuscript |bibcode=2010mhgs.book.....P }}&lt;/ref&gt; that was consistent with this distance. However, a more recent reanalysis gives the much larger parallax whose distance is barely half the current accepted value.&lt;ref name=hipparcos/&gt; One 2008 calculation using the Hipparcos data puts the most likely distance at {{convert|475|pc|ly}}, with an uncertainty of around 15%.&lt;ref name=maiz&gt;{{cite arXiv |title=Accurate distances to nearby massive stars with the new reduction of the Hipparcos raw data |pages=2553 |last1=Maíz Apellániz |first1=J. |last2=Alfaro |first2=E. J. |last3=Sota |first3=A. |year=2008 |class=astro-ph |eprint=0804.2553}}&lt;/ref&gt; The controversy over whether the direct Hipparcos measurements can be ignored in favour of a wide range of indirect stellar models and interstellar distance scales is similar to the better known situation with the [[Pleiades]].&lt;ref name=hipparcos&gt;{{cite journal |arxiv=0708.1752 |bibcode=2007A&amp;A...474..653V |doi=10.1051/0004-6361:20078357 |title=Validation of the new Hipparcos reduction |journal=Astronomy and Astrophysics |volume=474 |issue=2 |pages=653–664 |year=2007 |last1=Van Leeuwen |first1=F. |s2cid=18759600}}&lt;/ref&gt;<br /> <br /> Deneb's [[absolute magnitude]] is estimated as &amp;minus;8.4, placing it among the visually brightest stars known, with an estimated luminosity nearly {{solar luminosity|200,000|link=y}}. 
This is towards the upper end of values published over the past few decades, which vary between {{solar luminosity|55,000}} and {{solar luminosity|196,000}}.&lt;ref name=chesneau&gt;{{cite journal |bibcode=2010A&amp;A...521A...5C |title=Time, spatial, and spectral resolution of the Hα line-formation region of Deneb and Rigel with the VEGA/CHARA interferometer |journal=Astronomy and Astrophysics |volume=521 |pages=A5 |last1=Chesneau |first1=O. |last2=Dessart |first2=L. |last3=Mourard |first3=D. |last4=Bério |first4=Ph. |last5=Buil |first5=Ch. |last6=Bonneau |first6=D. |last7=Borges Fernandes |first7=M. |last8=Clausse |first8=J. M. |last9=Delaa |first9=O. |last10=Marcotto |first10=A. |last11=Meilland |first11=A. |last12=Millour |first12=F. |last13=Nardetto |first13=N. |last14=Perraut |first14=K. |last15=Roussel |first15=A. |last16=Spang |first16=A. |last17=Stee |first17=P. |last18=Tallon-Bosc |first18=I. |last19=McAlister |first19=H. |last20=Ten Brummelaar |first20=T. |last21=Sturmann |first21=J. |last22=Sturmann |first22=L. |last23=Turner |first23=N. |last24=Farrington |first24=C. |last25=Goldfinger |first25=P. J. |year=2010 |doi=10.1051/0004-6361/201014509 |arxiv=1007.2095 |s2cid=10340205 |url=https://hal.archives-ouvertes.fr/hal-00501515}}&lt;/ref&gt;&lt;ref&gt;<br /> {{Cite journal <br /> |last1=van de Kamp |first1=P.<br /> |date=1953<br /> |title=The Twenty Brightest Stars<br /> |journal=[[Publications of the Astronomical Society of the Pacific]]<br /> |volume=65 |issue=382<br /> |pages=30<br /> |bibcode=1953PASP...65...30V<br /> |doi=10.1086/126523 <br /> |doi-access=free<br /> }}&lt;/ref&gt;&lt;ref&gt;<br /> {{Cite journal<br /> |last1=Lamers |first1=H. J. G. L. M.<br /> |last2=Stalio |first2=R.<br /> |last3=Kondo |first3=Y.<br /> |date=1978<br /> |title=A study of mass loss from the mid-ultraviolet spectrum of α Cygni (A2 Ia), β Orionis (B8 Ia), and η Leonis (A0 Ib)<br /> |journal=[[The Astrophysical Journal]]<br /> |volume=223 |pages=207<br /> |bibcode=1978ApJ...223..207L<br /> |doi=10.1086/156252<br /> }}&lt;/ref&gt;<br /> <br /> Deneb is the most luminous of the first-magnitude stars, that is, those with an apparent magnitude brighter than 1.5. Deneb is also the most distant of the 30 [[List of brightest stars|brightest stars]] by a factor of almost 2.&lt;ref&gt;{{cite web<br /> | title=The 172 Brightest Stars | work=STARS<br /> | first=James B. | last=Kaler | date=2017<br /> | url=http://stars.astro.illinois.edu/sow/bright.html | access-date=2021-09-17<br /> }}&lt;/ref&gt; Based on its temperature and luminosity, and also on direct measurements of its tiny [[angular diameter]] (a mere 0.002 seconds of arc), Deneb appears to have a diameter of about 200 times [[Solar radius|that of the Sun]];&lt;ref name=chesneau/&gt; if placed at the center of the [[Solar System]], Deneb would extend out to the [[Earth's orbit|orbit of the Earth]]. It is one of the [[List of largest stars|largest white 'A' spectral type stars known]].<br /> <br /> Deneb is a bluish-white star of [[stellar classification|spectral type]] A2Ia, with a surface temperature of 8,500 [[kelvin]]. Since 1943, its [[stellar spectrum|spectrum]] has served as one of the stable references by which other stars are classified.&lt;ref name=baas25_1319/&gt; Its mass is estimated at 19 {{Solar mass|link=y}}.
[[Stellar wind]]s cause matter to be lost at an average rate of {{Solar mass|8±3{{e|-7}}}} per year, 100,000 times the Sun's rate of mass loss or equivalent to about one [[Earth mass]] per 500 years.&lt;ref&gt;{{cite journal |bibcode=2002ApJ...570..344A |title=The Spectral Energy Distribution and Mass-Loss Rate of the A-Type Supergiant Deneb |journal=The Astrophysical Journal |volume=570 |issue=1 |pages=344 |last1=Aufdenberg |first1=J. P. |last2=Hauschildt |first2=P. H. |last3=Baron |first3=E. |last4=Nordgren |first4=T. E. |last5=Burnley |first5=A. W. |last6=Howarth |first6=I. D. |last7=Gordon |first7=K. D. |last8=Stansberry |first8=J. A. |year=2002 |doi=10.1086/339740 |arxiv=astro-ph/0201218 |s2cid=13260314}}&lt;/ref&gt;<br /> <br /> ===Evolutionary state===<br /> Deneb spent much of its early life as an [[O-type main-sequence star]] of about {{solar mass|23}}, but it has now exhausted the [[hydrogen]] in its core and expanded to become a supergiant.&lt;ref name=schiller/&gt;&lt;ref name=georgy/&gt; Stars in the mass range of Deneb eventually expand to become the most luminous [[red supergiants]], and within a few million years their cores will collapse producing a [[supernova]] explosion. It is now known that red supergiants up to a certain mass explode as the commonly seen [[type II supernova|type II-P supernova]]e, but more massive ones lose their outer layers to become hotter again. Depending on their initial masses and the rate of mass loss, they may explode as [[yellow hypergiant]]s or [[luminous blue variable]]s, or they may become [[Wolf-Rayet star]]s before exploding in a [[Type Ib and Ic supernovae|type Ib or Ic supernova]]. Identifying whether Deneb is currently evolving towards a red supergiant or is currently evolving bluewards again would place valuable constraints on the classes of stars that explode as red supergiants and those that explode as hotter stars.&lt;ref name=georgy&gt;{{cite journal |bibcode=2014MNRAS.439L...6G |title=The puzzle of the CNO abundances of α Cygni variables resolved by the Ledoux criterion |journal=Monthly Notices of the Royal Astronomical Society: Letters |volume=439 |issue=1 |pages=L6–L10 |last1=Georgy |first1=Cyril |last2=Saio |first2=Hideyuki |last3=Meynet |first3=Georges |year=2014 |doi=10.1093/mnrasl/slt165 |arxiv=1311.4744 |s2cid=118557550}}&lt;/ref&gt;<br /> <br /> Stars evolving red-wards for the first time are most likely fusing hydrogen in a shell around a [[helium]] core that has not yet grown hot enough to start fusion to [[carbon]] and [[oxygen]]. Convection has begun [[Stellar evolution#Mature stars#Mid-sized stars#Red-giant-branch phase|dredging]] up fusion products but these do not reach the surface. Post-red supergiant stars are expected to show those fusion products at the surface due to stronger convection during the red supergiant phase and due to loss of the obscuring outer layers of the star.
Deneb is thought to be increasing its temperature after a period as a red supergiant, although current models do not exactly reproduce the surface elements showing in its spectrum.&lt;ref name=georgy/&gt;<br /> <br /> ===Variable star===<br /> [[File:AlphaCygLightCurve.png|thumb|left|A [[Photometric_system#Photometric_letters|visual band]] [[light curve]] for Deneb, adapted from Yüce and Adelman (2019)&lt;ref name=&quot;Yuca2019&quot;/&gt;]]<br /> Deneb is the prototype of the [[Alpha Cygni variable|Alpha Cygni]] (α Cygni) [[variable star]]s,&lt;ref name=&quot;Richardson2011&quot; /&gt;&lt;ref name=&quot;Yuca2019&quot;&gt;{{cite journal<br /> |last1=Yüce|first1=K.<br /> |last2=Adelman |first2=S. J.<br /> |title=On the variability of the A0 supergiants 9 Per, HR 1035, 13 Mon, Deneb, and HR 8020 as seen in FCAPT Strömgren photometry<br /> |date=2019<br /> |journal=New Astronomy<br /> |volume=66<br /> |pages=88–99<br /> |doi=10.1016/j.newast.2018.07.002<br /> |bibcode = 2019NewA...66...88Y|s2cid=126285732<br /> }}&lt;/ref&gt; whose small irregular amplitudes and rapid pulsations can cause its magnitude to vary anywhere between 1.21 and 1.29.&lt;ref name=&quot;gscvquery&quot;&gt;{{Cite web |url=http://www.sai.msu.su/gcvs/cgi-bin/search.cgi?search=alf+Cyg|title=GCVS Query forms|website=Sternberg Astronomical Institute|access-date=2019-01-07<br /> }}&lt;/ref&gt; Its variable velocity was discovered by Lee in 1910,&lt;ref name=Lee1910/&gt; but it was not formally placed in a unique class of variable stars until the 1985 4th edition of the General Catalogue of Variable Stars.&lt;ref name=&quot;GCVS4&quot;&gt;{{cite journal |bibcode=1996yCat.2139....0K |title=VizieR Online Data Catalog: General Catalog of Variable Stars, 4th Ed. (GCVS4) (/gcvs4Kholopov+ 1988) |journal=VizieR On-Line Data Catalog: II/139B. Originally Published in: Moscow: Nauka Publishing House (1985-1988) |volume=2139 |pages=0 |last1=Kholopov |first1=P. N. |last2=Samus' |first2=N. N. |last3=Frolov |first3=M. S. |last4=Goranskij |first4=V. P. |last5=Gorynya |first5=N. A. |last6=Kireeva |first6=N. N. |last7=Kukarkina |first7=N. P. |last8=Kurochkin |first8=N. E. |last9=Medvedeva |first9=G. I. |last10=Perova |first10=N. B.|date=1996}}&lt;/ref&gt; The cause of the pulsations of Alpha Cygni variable stars is not fully understood, but their [[irregular variable|irregular nature]] seems to be due to [[Beat (acoustics)|beat]]ing of multiple pulsation periods. Analysis of radial velocities determined 16 different harmonic pulsation modes with periods ranging between 6.9 and 100.8 days.&lt;ref name=&quot;Lucy1976&quot;/&gt; A longer period of about 800 days probably also exists.&lt;ref name=&quot;Yuca2019&quot; /&gt;<br /> <br /> ===Possible spectroscopic companion===<br /> Deneb has been reported as a possible single line spectroscopic [[Binary star|binary]] with a period of about 850 days, where the spectral lines from the star suggest cyclical radial velocity changes.&lt;ref name=&quot;Lucy1976&quot;&gt;{{cite journal|bibcode=1976ApJ...206..499L|title=An analysis of the variable radial velocity of alpha Cygni|journal=Astrophysical Journal|volume=206|pages=499|last1=Lucy|first1=L.
B.|year=1976|doi=10.1086/154405|doi-access=free}}&lt;/ref&gt; Later investigations have found no evidence supporting the existence of a companion.&lt;ref name=&quot;Richardson2011&quot;&gt;{{cite journal |bibcode=2011AJ....141...17R |title=A Five-year Spectroscopic and Photometric Campaign on the Prototypical α Cygni Variable and A-type Supergiant Star Deneb |journal=The Astronomical Journal |volume=141 |issue=1 |pages=17 |last1=Richardson |first1=N. D. |last2=Morrison |first2=N. D. |last3=Kryukova |first3=E. E. |last4=Adelman |first4=S. J. |year=2011 |doi=10.1088/0004-6256/141/1/17 |arxiv=1009.5994 |s2cid=118300333}}&lt;/ref&gt;<br /> <br /> ==Etymology and cultural significance==<br /> [[File:Wide-field view of the Summer Triangle.jpg|thumb|upright=1.2|Wide-field view of the [[Summer Triangle]] and the [[Milky Way]]. Deneb is at the left-centre of the picture.&lt;!-- Not sure if you can find Deneb unless you have experience. --&gt;]]Names similar to Deneb have been given to at least seven different stars, most notably [[Beta Ceti|Deneb Kaitos]], the brightest star in the constellation of [[Cetus]]; [[Delta Capricorni|Deneb Algedi]], the brightest star in [[Capricornus]]; and [[Denebola]], the second brightest star in [[Leo (constellation)|Leo]]. All these names refer to the tail of the animal that the respective constellation represents.<br /> <br /> In Chinese, {{lang|zh|天津}} ({{lang|zh-Latn|Tiān Jīn}}), meaning ''[[Girl (Chinese constellation)|Celestial Ford]]'', refers to an asterism consisting of Deneb, [[Gamma Cygni]], [[Delta Cygni]], [[30 Cygni]], [[Nu Cygni]], [[Tau Cygni]], [[Upsilon Cygni]], [[Zeta Cygni]] and [[Epsilon Cygni]].&lt;ref&gt;{{cite book |author=陳久金|title=中國星座神話|url=https://books.google.com/books?id=0Vex0rYzdu8C|year=2005|publisher=五南圖書出版股份有限公司|isbn=978-986-7332-25-7}}&lt;/ref&gt; Consequently, the [[Chinese star names|Chinese name]] for Deneb itself is {{lang|zh|天津四}} ({{lang|zh-Latn|Tiān Jīn sì}}, {{lang-en|the Fourth Star of the Celestial Ford}}).&lt;ref&gt;{{cite web|language=zh |url=http://www.lcsd.gov.hk/CE/Museum/Space/Research/StarName/c_research_chinengstars_c_d.htm |title=香港太空館 - 研究資源 - 亮星中英對照表] |archive-url=https://web.archive.org/web/20081025110153/http://www.lcsd.gov.hk/CE/Museum/Space/Research/StarName/c_research_chinengstars_c_d.htm |archive-date=2008-10-25 | access-date=2019-01-09 | website=Hong Kong Space Museum}}&lt;/ref&gt;<br /> <br /> In the Chinese love story of [[Qi Xi]], Deneb marks the [[magpie]] bridge across the [[Milky Way]], which allows the separated lovers Niu Lang ([[Altair]]) and Zhi Nü ([[Vega]]) to be reunited on one special night of the year in late summer. In other versions of the story, Deneb is a fairy who acts as chaperone when the lovers meet.<br /> <br /> ===Namesakes===<br /> [[USS Arided (AK-73)|USS ''Arided'']] was a [[United States Navy]] [[Crater class cargo ship|''Crater''-class cargo ship]] named after the star. [[SS Deneb|SS ''Deneb'']] was an Italian merchant vessel that bore this name from 1951 until she was scrapped in 1966.<br /> <br /> ===In fiction===<br /> {{main|Deneb in fiction}}<br /> The star Deneb, and hypothetical planets orbiting it, have been used many times in [[literature]], [[film]], [[electronic game]]s, and [[music]].
Examples include several episodes of the ''[[Star Trek]]'' [[TV series]], the ''[[Silver Surfer]]'' comic book, the [[Rush (band)|Rush]] [[album]]s ''[[A Farewell to Kings]]'' and ''[[Hemispheres (Rush album)|Hemispheres]]'', the ''[[Descent: FreeSpace – The Great War]]'' [[computer game]], ''[[Stellaris (video game)|Stellaris]]'', and the [[science fiction]] [[novel]] ''[[Hyperion (Simmons novel)|Hyperion]]''.<br /> <br /> ==See also==<br /> * [[List of bright stars]]<br /> <br /> ==References==<br /> {{Reflist|30em|refs=}}<br /> <br /> {{Sky|20|41|25.9|+|45|16|49|1400}}<br /> {{Stars of Cygnus}}<br /> {{Portal bar|Astronomy|Stars|Outer space}}<br /> &lt;!-- Properties --&gt;<br /> <br /> [[Category:A-type supergiants]]<br /> [[Category:Alpha Cygni variables]]<br /> [[Category:Emission-line stars]]<br /> &lt;!-- Other --&gt;<br /> [[Category:Northern pole stars]]<br /> [[Category:Cygnus (constellation)]]<br /> [[Category:Bayer objects|Cygni, Alpha]]<br /> [[Category:Durchmusterung objects|BD+44 3541]]<br /> [[Category:Flamsteed objects|Cygni, 50]]<br /> [[Category:Henry Draper Catalogue objects|197345]]<br /> [[Category:Hipparcos objects|102098]]<br /> [[Category:Bright Star Catalogue objects|7924]]<br /> [[Category:Arabic words and phrases]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Deneb&diff=1170906801 Deneb 2023-08-17T22:36:49Z <p>205.189.94.9: </p> <hr /> <div>{{short description|Star in the constellation Cygnus}}<br /> {{About|the star}}<br /> {{Starbox begin}}<br /> {{Starbox image<br /> | image=<br /> {{Location mark<br /> | image=Cygnus constellation map.svg<br /> | float=center | width=250 | position=right<br /> | mark=Red circle.svg | mark_width=10 | mark_link=Deneb (star)<br /> | x%=32.5 | y%=36.9<br /> }}<br /> | caption=Location of Deneb (circled)<br /> }}<br /> {{Starbox observe<br /> | epoch=J2000<br /> | constell=[[Cygnus (constellation)|Cygnus]]<br /> | pronounce={{IPAc-en|'|d|ɛ|n|ɛ|b|}}, {{IPAc-en|'|d|ɛ|n|ə|b|}}&lt;ref name=merriam&gt;{{cite book|author=Merriam-Webster, Inc|title=Merriam-Webster's Collegiate Dictionary|url=https://books.google.com/books?id=53-PQgAACAAJ|year=1998|publisher=Merriam-Webster|isbn=978-0-87779-714-2}}&lt;/ref&gt;<br /> | ra={{RA|20|41|25.9}}&lt;ref name=hipparcos/&gt;|dec={{DEC|+45|16|49}}&lt;ref name=hipparcos/&gt;<br /> | appmag_v=1.25&lt;ref name=ducati&gt;{{cite journal|bibcode=2002yCat.2237....0D|title=VizieR On-Line Data Catalog: Catalogue of Stellar Photometry in Johnson's 11-color system|journal=CDS/ADC Collection of Electronic Catalogues|volume=2237|pages=0|last1=Ducati|first1=J. R.|year=2002}}&lt;/ref&gt; {{nowrap|(1.21–1.29&lt;ref name=gcvs&gt;{{cite journal|bibcode=2009yCat....102025S|title=VizieR Online Data Catalog: General Catalogue of Variable Stars (Samus+ 2007–2013)|journal=VizieR On-Line Data Catalog: B/GCVS. Originally Published in: 2009yCat....102025S|volume=1|pages=02025|last1=Samus|first1=N. N.|last2=Durlevich|first2=O. V.|year=2009|display-authors=etal}}&lt;/ref&gt;)}}<br /> }}<br /> {{Starbox character<br /> | class=A2&amp;nbsp;Ia&lt;ref name=baas25_1319&gt;{{Cite journal |last1=Garrison |first1=R. F. 
|title=Anchor Points for the MK System of Spectral Classification |journal=[[Bulletin of the American Astronomical Society]] |volume=25 |page=1319 |year=1993 |bibcode=1993AAS...183.1710G |url=http://www.astro.utoronto.ca/~garrison/mkstds.html |access-date=2012-02-04 |archive-date=2019-06-25 |archive-url=https://web.archive.org/web/20190625094716/http://www.astro.utoronto.ca/~garrison/mkstds.html |url-status=dead }}&lt;/ref&gt;<br /> | b-v=+0.09&lt;ref name=ducati/&gt;<br /> | u-b=&amp;minus;0.23&lt;ref name=ducati/&gt;<br /> | variable=[[Alpha Cygni variable|Alpha Cygni]]&lt;ref name=gcvs/&gt;<br /> }}<br /> {{Starbox astrometry<br /> | radial_v=&amp;minus;4.5&lt;ref name=pulkovo&gt;{{cite journal|bibcode=2006AstL...32..759G|title=Pulkovo Compilation of Radial Velocities for 35 495 Hipparcos stars in a common system|journal=Astronomy Letters|volume=32|issue=11|pages=759–771|last1=Gontcharov|first1=G. A.|year=2006|doi=10.1134/S1063773706110065|arxiv = 1606.08053 |s2cid=119231169}}&lt;/ref&gt;<br /> | prop_mo_ra=1.99&lt;ref name=hipparcos/&gt;<br /> | prop_mo_dec=1.95&lt;ref name=hipparcos/&gt;<br /> | parallax=2.29<br /> | p_error=0.32<br /> | parallax_footnote=&lt;ref name=hipparcos/&gt;<br /> | dist_ly={{val|2,615|215|fmt=commas}}<br /> | dist_pc={{val|802|66}}&lt;ref name=schiller/&gt;<br /> | absmag_v=&amp;minus;8.38&lt;ref name=schiller/&gt;<br /> }}<br /> {{Starbox detail<br /> |source=&lt;ref name=schiller&gt;{{cite journal<br /> | last1=Schiller | first1=F.<br /> | last2=Przybilla | first2=N.<br /> | date=2008<br /> | title=Quantitative spectroscopy of Deneb<br /> | journal=[[Astronomy &amp; Astrophysics]]<br /> | volume=479 | issue=3 | pages=849–858<br /> | arxiv=0712.0040<br /> | bibcode=2008A&amp;A...479..849S<br /> | doi=10.1051/0004-6361:20078590<br /> | s2cid=119225384<br /> }}&lt;/ref&gt;<br /> | mass = {{Val|19|4}}<br /> | radius = {{Val|203|17}}<br /> | luminosity = {{Val|196000|32000|fmt=commas}}<br /> | temperature = {{Val|8525|75|fmt=commas}}<br /> | rotational_velocity = {{Val|20|2}}<br /> | gravity = {{Val|1.10|0.05}}<br /> | metal_fe = −0.25<br /> | age_myr=<br /> }}<br /> {{Starbox catalog<br /> | names={{odlist | name=Arided | name2=Aridif | name3=Gallina | name4=Arrioph | B=α Cygni | F=50 Cygni | BD=+44°3541 | FK5=777 | HD=197345 | HIP=102098 | HR=7924 | SAO=49941 }}<br /> }}<br /> {{Starbox reference<br /> | Simbad=Deneb<br /> }}<br /> {{Starbox end}}<br /> <br /> '''Deneb''' ({{IPAc-en|ˈ|d|ɛ|n|ɛ|b}}) is a [[first-magnitude star]] in the [[constellation]] of [[Cygnus (constellation)|Cygnus]]. Deneb is one of the vertices of the [[Asterism (astronomy)|asterism]] known as the [[Summer Triangle]] and the &quot;head&quot; of the [[Northern Cross (asterism)|Northern Cross]]. It is the brightest [[star]] in Cygnus and the 19th [[List of brightest stars|brightest star]] in the [[night sky]], with an average [[apparent magnitude]] of +1.25. A blue-white [[supergiant]], Deneb rivals [[Rigel]] as the most luminous [[first-magnitude star]]. However, its distance, and hence luminosity, is poorly known; its luminosity is somewhere between 55,000 and 196,000 times [[Solar luminosity|that of the Sun]]. Its [[Bayer designation]] is '''α Cygni''', which is [[Latinisation of names|Latinised]] to '''Alpha Cygni''', abbreviated to '''Alpha Cyg''' or '''α Cyg'''. 
Multiple independent methods of observation confirm Deneb to be a highly luminous supergiant. As the cataloguing and organisation of stellar data improve, quantitative studies of this class of stars as a whole should become more reliable; a central aim of star catalogues, atlases and benchmark objects is to provide a stable, verifiable database with well-characterised biases and errors. Long-term observations indicate that Deneb's spectrum and effective temperature have remained broadly stable, and luminous supergiants of this kind have been proposed as potential standard candles for further astrophysical research.<br /> <br /> ==Nomenclature==<br /> <br /> [[File:CygnusCC.jpg|thumb|left|upright|Deneb is the brightest star in the constellation of Cygnus (top)]]<br /> ''α Cygni'' (Latinised to ''Alpha Cygni'') is the star's [[Bayer designation|designation]] given by [[Johann Bayer]] in 1603. The traditional name ''Deneb'' is derived from the [[Arabic]] word for &quot;tail&quot;, from the phrase ذنب الدجاجة ''Dhanab al-Dajājah'', or &quot;tail of the hen&quot;.&lt;ref name=allen/&gt; The [[IAU Working Group on Star Names]] has recognised the name ''Deneb'' for this star, and it is entered in their Catalog of Star Names.&lt;ref name=&quot;IAU-CSN&quot;&gt;{{cite web |url=http://www.pas.rochester.edu/~emamajek/WGSN/IAU-CSN.txt |title=IAU Catalog of Star Names |website=University of Rochester |access-date=28 July 2016}}&lt;/ref&gt;<br /> <br /> ''Denebadigege'' was used in the ''[[Alfonsine Tables]]'',&lt;ref name=Kunitzsch86&gt;<br /> {{cite journal<br /> |last=Kunitzsch |first=Paul<br /> |date=1986<br /> |title=The Star Catalogue Commonly Appended to the Alfonsine Tables<br /> |journal=[[Journal for the History of Astronomy]]<br /> |volume=17 |issue=49 |pages=89–98<br /> |bibcode=1986JHA....17...89K<br /> |doi=10.1177/002182868601700202<br /> |s2cid=118597258<br /> }}&lt;/ref&gt; other variants include ''Deneb Adige'', ''Denebedigege'' and ''Arided''. This latter name was derived from ''Al Ridhādh'', a name for the constellation. [[Johann Bayer]] called it ''Arrioph'', derived from ''Aridf'' and ''Al Ridf'', 'the hindmost' or ''Gallina''.
German poet and author [[Philipp von Zesen|Philippus Caesius]] termed it ''Os rosae'', or ''Rosemund'' in German, or ''Uropygium'' – the parson's nose.&lt;ref name=allen&gt;<br /> {{cite book<br /> |last=Allen<br /> |first=Richard Hinckley<br /> |date=1963<br /> |title=Star Names: Their Lore and Meaning<br /> |page=[https://archive.org/details/starnamestheirlo00alle/page/195 195]<br /> |edition=Reprint<br /> |publisher=[[Dover Publications]]<br /> |isbn=978-0-486-21079-7<br /> |url-access=registration<br /> |url=https://archive.org/details/starnamestheirlo00alle/page/195<br /> }}&lt;/ref&gt; The names ''Arided'' and ''Aridif'' have fallen out of use.<br /> <br /> An older traditional name is '''Arided''' {{IPAc-en|'|ær|ɪ|d|E|d}}, from the Arabic ''ar-ridf'' 'the one sitting behind the rider' (or just 'the follower'), perhaps referring to the other major stars of Cygnus, which were called ''al-fawāris'' 'the riders'.&lt;ref name=Kunitzsch&gt;{{cite book<br /> |last1=Kunitzsch |first1=Paul<br /> |last2=Smart |first2=Tim<br /> |date = 2006 |edition = 2nd rev.<br /> |title = A Dictionary of Modern star Names: A Short Guide to 254 Star Names and Their Derivations<br /> |publisher = Sky Pub |location = Cambridge, Massachusetts<br /> |isbn = 978-1-931559-44-7<br /> }}&lt;/ref&gt;<br /> <br /> ==Observation==<br /> [[File:Summer triangle.png|left|thumb|upright=1.2|The [[Summer Triangle]]]]<br /> The 19th [[List of brightest stars|brightest star]] in the night sky, Deneb [[culmination|culminates]] each year on October 23 at 6 PM and September 7 at 9 PM,&lt;ref name=&quot;south2015&quot;&gt;{{cite web |title=The Constellations : Part 3 Culmination Times|url=http://www.southastrodel.com/Page20502.htm|website=Southern Astronomical Delights|first=Andrew|last=James|date=2015-06-17 |access-date=2019-04-02}}&lt;/ref&gt; corresponding to [[summer]] evenings in the [[northern hemisphere]].&lt;ref name=&quot;summer&quot; /&gt; It never dips below the horizon at or above 45° north latitude, just grazing the northern horizon at its lowest point at such locations as [[Minneapolis]], [[Montreal|Montréal]] and [[Turin]]. In the [[Southern Hemisphere|southern hemisphere]], Deneb is not visible south of [[45th parallel south|45° parallel south]], so it just barely rises above the horizon in [[South Africa]], southern [[Australia]], and northern [[New Zealand]] during the southern winter.<br /> <br /> Deneb is located at the tip of the [[Northern Cross (asterism)|Northern Cross]] asterism made up of the brightest stars in Cygnus, the others being [[Albireo]] (Beta Cygni), [[Gamma Cygni]], [[Delta Cygni]], and [[Epsilon Cygni]].&lt;ref name=&quot;summer&quot;&gt;{{cite journal|bibcode=1937ASPL....3...23S|title=Stars of the Summer Sky|journal=Astronomical Society of the Pacific Leaflets|volume=3|issue=102|pages=23|last1=Smith|first1=C. E.|year=1937}}&lt;/ref&gt; It also lies at one [[Vertex (geometry)|vertex]] of the prominent and widely spaced [[Asterism (astronomy)|asterism]] called the [[Summer Triangle]], shared with the first-[[apparent magnitude|magnitude]] stars [[Vega]] in the constellation [[Lyra]] and [[Altair]] in [[Aquila (constellation)|Aquila]].&lt;ref name=&quot;pasachoff2000&quot;&gt;<br /> {{Cite book<br /> |last1=Pasachoff |first1=J. M.<br /> |date=2000<br /> |title=A Field Guide to Stars and Planets<br /> |edition=4th<br /> |publisher=[[Houghton Mifflin]]<br /> |isbn=978-0-395-93431-9<br /> }}&lt;/ref&gt;&lt;ref name=upgren1998&gt;<br /> {{Cite book<br /> |last=Upgren |first=A. 
R.<br /> |date=1998<br /> |title=Night Has a Thousand Eyes: A Naked-Eye Guide to the Sky, Its Science, and Lore<br /> |publisher=[[Basic Books]]<br /> |isbn=978-0-306-45790-6<br /> }}&lt;/ref&gt; This outline of stars is the approximate shape of a [[right triangle]],&lt;!--Image on side shows this is self-evident--&gt; with Deneb located at one of the acute angles.<br /> <br /> The [[stellar spectrum|spectrum]] of Alpha Cygni has been observed by astronomers since at least 1888, and by 1910 the variable [[radial velocity]] had become apparent. This led to the early suggestion by [[Edwin Brant Frost|E. B. Frost]] that it was a [[binary star]] system.&lt;ref name=Lee1910&gt;{{cite journal<br /> | title=Four stars having variable radial velocities<br /> | last=Lee | first=O. J.<br /> | journal=Astrophysical Journal<br /> | volume=31 | pages=176–179 | date=March 1910<br /> | doi=10.1086/141741 | bibcode=1910ApJ....31..176L<br /> }}&lt;/ref&gt; By 1935, the work of [[George Frederic Paddock|G. F. Paddock]] and others had established that this star was [[variable star|variable]] in luminosity with a dominant period of 11.7&amp;nbsp;days and possibly with other, lower-amplitude periods.&lt;ref name=Abt1957&gt;{{cite journal<br /> | title=The Variability of Supergiants<br /> | last=Abt | first=Helmut A.<br /> | journal=Astrophysical Journal<br /> | volume=126 | page=138 | date=July 1957<br /> | doi=10.1086/146379 | bibcode=1957ApJ...126..138A<br /> }}&lt;/ref&gt; By 1954, closer examination of the star's [[Calcium K line|calcium H and K lines]] showed a stationary core, which indicated the variable velocity was instead being caused by motion of the [[Stellar atmosphere|star's atmosphere]]. This variation ranged from +6 to −9&amp;nbsp;km/s around the star's mean radial velocity.&lt;ref&gt;{{cite journal<br /> | title=The Stationary Calcium Lines of Alpha Cygni<br /> | last1=Struve | first1=Otto | last2=Huang | first2=S. S.<br /> | journal=Publications of the Astronomical Society of the Pacific<br /> | volume=66 | issue=392 | page=251 | date=October 1954<br /> | doi=10.1086/126710 | bibcode=1954PASP...66..251S<br /> | s2cid=121714858 | doi-access=free }}&lt;/ref&gt; Other, similar supergiants were found to have variable velocities, with this star being a typical member.&lt;ref name=Abt1957/&gt;<br /> <br /> ===Pole star===<br /> Due to the [[Earth|Earth's]] [[axial precession]], Deneb will be an approximate [[pole star]] (7° off the north celestial pole) at around [[10th millennium#Astronomical events|9800 AD]].&lt;ref&gt;{{cite web |title=Deneb |url=http://stars.astro.illinois.edu/sow/deneb.html |website=[[University of Illinois]] |first=James B. |last=Kaler |date=1998-06-19 |access-date=2018-04-25}}&lt;/ref&gt; The north pole of [[Mars]] points to the midpoint of the line connecting Deneb and the star [[Alderamin]].&lt;ref name=&quot;Barlow&quot;&gt;<br /> {{cite book |last=Barlow |first=N. G. |url=https://archive.org/details/marsintroduction00barl_258 |title=Mars: An introduction to its interior, surface and atmosphere |date=2008 |publisher=[[Cambridge University Press]] |isbn=978-0-521-85226-5 |page=[https://archive.org/details/marsintroduction00barl_258/page/n30 21] |url-access=limited}}&lt;/ref&gt;<br /> {| class=&quot;wikitable&quot; style=&quot;margin: 1em auto 1em auto;&quot;<br /> ! width=&quot;120&quot; align=&quot;center&quot;|Preceded by<br /> ! width=&quot;160&quot; align=&quot;center&quot;|[[Pole Star]]<br /> !
width=&quot;120&quot; align=&quot;center&quot;|Succeeded by<br /> |-<br /> |align=&quot;center&quot;|'''[[Alderamin]]'''<br /> |align=&quot;center&quot;|8700 AD to 11000 AD<br /> |align=&quot;center&quot;|'''[[Delta Cygni]]'''<br /> |}<br /> <br /> ==Physical characteristics==<br /> Deneb's adopted distance from the Earth is around {{convert|802|pc|ly}}.&lt;ref name=schiller/&gt; This is derived by a variety of different methods, including spectral luminosity classes, atmospheric modelling, stellar evolution models, assumed membership of the [[Cygnus OB7]] association, and direct measurement of angular diameter. These methods give different distances, and all have significant margins of error. The original derivation of a [[parallax]] using measurements from the astrometric satellite [[Hipparcos]] gave an uncertain result of 1.01 ± 0.57 mas&lt;ref name=aaa323_L49&gt;{{Cite journal |last1=Perryman |first1=M. A. C. |last2=Lindegren |first2=L. |year=1997 |title=The Hipparcos Catalogue |journal=[[Astronomy and Astrophysics]] |volume=323 |pages=L49–L52 |bibcode=1997A&amp;A...323L..49P |last3=Kovalevsky |first3=J. |last4=Hoeg |first4=E. |last5=Bastian |first5=U. |last6=Bernacca |first6=P. L. |last7=Crézé |first7=M. |last8=Donati |first8=F. |last9=Grenon |first9=M. |last10=Grewing |first10=M. |last11=Van Leeuwen |first11=F. |last12=Van Der Marel |first12=H. |last13=Mignard |first13=F. |last14=Murray |first14=C. A. |last15=Le Poole |first15=R. S. |last16=Schrijver |first16=H. |last17=Turon |first17=C. |last18=Arenou |first18=F. |last19=Froeschlé |first19=M. |last20=Petersen |first20=C. S. }}&lt;/ref&gt;&lt;ref name=GSM&gt;{{Cite book |last=Perryman |first=M. |date=2010 |title=The Making of History's Greatest Star Map |publisher=[[Springer-Verlag]] |doi=10.1007/978-3-642-11602-5 |isbn=978-3-642-11601-8 |series=Astronomers' Universe |url=https://cds.cern.ch/record/1338896 |type=Submitted manuscript |bibcode=2010mhgs.book.....P }}&lt;/ref&gt; that was consistent with this distance. However, a more recent reanalysis gives a much larger parallax, corresponding to a distance barely half the currently accepted value.&lt;ref name=hipparcos/&gt; One 2008 calculation using the Hipparcos data puts the most likely distance at {{convert|475|pc|ly}}, with an uncertainty of around 15%.&lt;ref name=maiz&gt;{{cite arXiv |title=Accurate distances to nearby massive stars with the new reduction of the Hipparcos raw data |pages=2553 |last1=Maíz Apellániz |first1=J. |last2=Alfaro |first2=E. J. |last3=Sota |first3=A. |year=2008 |class=astro-ph |eprint=0804.2553}}&lt;/ref&gt; The controversy over whether the direct Hipparcos measurements can be ignored in favour of a wide range of indirect stellar models and interstellar distance scales is similar to the better known situation with the [[Pleiades]].&lt;ref name=hipparcos&gt;{{cite journal |arxiv=0708.1752 |bibcode=2007A&amp;A...474..653V |doi=10.1051/0004-6361:20078357 |title=Validation of the new Hipparcos reduction |journal=Astronomy and Astrophysics |volume=474 |issue=2 |pages=653–664 |year=2007 |last1=Van Leeuwen |first1=F. |s2cid=18759600}}&lt;/ref&gt;<br /> <br /> Deneb's [[absolute magnitude]] is estimated as &amp;minus;8.4, placing it among the visually brightest stars known, with an estimated luminosity nearly {{solar luminosity|200,000|link=y}}.
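As a rough editorial cross-check (not taken from the cited sources), the parallaxes quoted above can be converted to distances with the standard relation d(pc) = 1/parallax(arcsec), and an absolute magnitude follows from the distance modulus; interstellar extinction is neglected and the helper names below are illustrative only. A minimal Python sketch using the numbers already given in this article:

    import math

    def parallax_to_distance_pc(parallax_mas):
        # d [pc] = 1 / parallax [arcsec]; the parallax is supplied here in milliarcseconds
        return 1000.0 / parallax_mas

    print(parallax_to_distance_pc(1.01))  # original Hipparcos parallax: ~990 pc, with a large ±0.57 mas uncertainty
    print(parallax_to_distance_pc(2.29))  # revised Hipparcos reduction: ~437 pc, roughly half the adopted 802 pc

    def absolute_magnitude(apparent_mag, distance_pc):
        # distance modulus: M = m - 5*log10(d / 10 pc), ignoring extinction
        return apparent_mag - 5.0 * math.log10(distance_pc / 10.0)

    print(absolute_magnitude(1.25, 802))  # about -8.3, close to the quoted -8.4 before any extinction correction

At the adopted distance of about 802 pc, an absolute magnitude near −8.4 corresponds to the luminosity of nearly 200,000 solar luminosities quoted above; at the shorter Hipparcos-based distance the implied luminosity would be substantially lower, scaling with the square of the distance.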
This is towards the upper end of values published over the past few decades, which vary between {{solar luminosity|55,000}} and {{solar luminosity|196,000}}.&lt;ref name=chesneau&gt;{{cite journal |bibcode=2010A&amp;A...521A...5C |title=Time, spatial, and spectral resolution of the Hα line-formation region of Deneb and Rigel with the VEGA/CHARA interferometer |journal=Astronomy and Astrophysics |volume=521 |pages=A5 |last1=Chesneau |first1=O. |last2=Dessart |first2=L. |last3=Mourard |first3=D. |last4=Bério |first4=Ph. |last5=Buil |first5=Ch. |last6=Bonneau |first6=D. |last7=Borges Fernandes |first7=M. |last8=Clausse |first8=J. M. |last9=Delaa |first9=O. |last10=Marcotto |first10=A. |last11=Meilland |first11=A. |last12=Millour |first12=F. |last13=Nardetto |first13=N. |last14=Perraut |first14=K. |last15=Roussel |first15=A. |last16=Spang |first16=A. |last17=Stee |first17=P. |last18=Tallon-Bosc |first18=I. |last19=McAlister |first19=H. |last20=Ten Brummelaar |first20=T. |last21=Sturmann |first21=J. |last22=Sturmann |first22=L. |last23=Turner |first23=N. |last24=Farrington |first24=C. |last25=Goldfinger |first25=P. J. |year=2010 |doi=10.1051/0004-6361/201014509 |arxiv=1007.2095 |s2cid=10340205 |url=https://hal.archives-ouvertes.fr/hal-00501515}}&lt;/ref&gt;&lt;ref&gt;<br /> {{Cite journal <br /> |last1=van de Kamp |first1=P.<br /> |date=1953<br /> |title=The Twenty Brightest Stars<br /> |journal=[[Publications of the Astronomical Society of the Pacific]]<br /> |volume=65 |issue=382<br /> |pages=30<br /> |bibcode=1953PASP...65...30V<br /> |doi=10.1086/126523 <br /> |doi-access=free<br /> }}&lt;/ref&gt;&lt;ref&gt;<br /> {{Cite journal<br /> |last1=Lamers |first1=H. J. G. L. M.<br /> |last2=Stalio |first2=R.<br /> |last3=Kondo |first3=Y.<br /> |date=1978<br /> |title=A study of mass loss from the mid-ultraviolet spectrum of α Cygni (A2 Ia), β Orionis (B8 Ia), and η Leonis (A0 Ib)<br /> |journal=[[The Astrophysical Journal]]<br /> |volume=223 |pages=207<br /> |bibcode=1978ApJ...223..207L<br /> |doi=10.1086/156252<br /> }}&lt;/ref&gt;<br /> <br /> Deneb is the most luminous of the first-magnitude stars, that is, those with an apparent magnitude brighter than 1.5. Deneb is also the most distant of the 30 [[List of brightest stars|brightest stars]] by a factor of almost 2.&lt;ref&gt;{{cite web<br /> | title=The 172 Brightest Stars | work=STARS<br /> | first=James B. | last=Kaler | date=2017<br /> | url=http://stars.astro.illinois.edu/sow/bright.html | access-date=2021-09-17<br /> }}&lt;/ref&gt; Based on its temperature and luminosity, and also on direct measurements of its tiny [[angular diameter]] (a mere 0.002 seconds of arc), Deneb appears to have a diameter of about 200 times [[Solar radius|that of the Sun]];&lt;ref name=chesneau/&gt; if placed at the center of the [[Solar System]], Deneb would extend out to the [[Earth's orbit|orbit of the Earth]]. It is one of the [[List of largest stars|largest white 'A' spectral type stars known]].<br /> <br /> Deneb is a bluish-white star of [[stellar classification|spectral type]] A2Ia, with a surface temperature of 8,500 [[kelvin]]. Since 1943, its [[stellar spectrum|spectrum]] has served as one of the stable references by which other stars are classified.&lt;ref name=baas25_1319/&gt; Its mass is estimated at 19 {{Solar mass|link=y}}.
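The tabulated radius and temperature can likewise be checked against the published luminosity through the Stefan–Boltzmann law, L/Lsun = (R/Rsun)^2 (T/Tsun)^4. The short Python sketch below is an editorial illustration only; it assumes the starbox values above (203 solar radii, 8,525 K) together with a nominal solar effective temperature of 5,772 K, a value not stated in this article:

    R_SOLAR = 203.0   # radius in solar radii, from the starbox above
    T_EFF = 8525.0    # effective temperature in kelvin, from the starbox above
    T_SUN = 5772.0    # nominal solar effective temperature (assumed value)

    # Stefan-Boltzmann law in solar units: L/Lsun = (R/Rsun)**2 * (T/Tsun)**4
    luminosity_solar = R_SOLAR**2 * (T_EFF / T_SUN)**4
    print(round(luminosity_solar))  # ~196,000, matching the tabulated luminosity

    # One astronomical unit is roughly 215 solar radii, so a star of ~200 solar radii
    # reaches most of the way out to the orbit of the Earth, as described above.
    print(R_SOLAR / 215.0)  # ~0.94 au

The close agreement is expected, since the published luminosity is itself derived from essentially these parameters, but it shows how the quantities listed in the starbox fit together.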
[[Stellar wind]]s cause matter to be lost at an average rate of {{Solar mass|8±3{{e|-7}}}} per year, 100,000 times the Sun's rate of mass loss or equivalent to about one [[Earth mass]] per 500 years.&lt;ref&gt;{{cite journal |bibcode=2002ApJ...570..344A |title=The Spectral Energy Distribution and Mass-Loss Rate of the A-Type Supergiant Deneb |journal=The Astrophysical Journal |volume=570 |issue=1 |pages=344 |last1=Aufdenberg |first1=J. P. |last2=Hauschildt |first2=P. H. |last3=Baron |first3=E. |last4=Nordgren |first4=T. E. |last5=Burnley |first5=A. W. |last6=Howarth |first6=I. D. |last7=Gordon |first7=K. D. |last8=Stansberry |first8=J. A. |year=2002 |doi=10.1086/339740 |arxiv=astro-ph/0201218 |s2cid=13260314}}&lt;/ref&gt;<br /> <br /> ===Evolutionary state===<br /> Deneb spent much of its early life as an [[O-type main-sequence star]] of about {{solar mass|23}}, but it has now exhausted the [[hydrogen]] in its core and expanded to become a supergiant.&lt;ref name=schiller/&gt;&lt;ref name=georgy/&gt; Stars in the mass range of Deneb eventually expand to become the most luminous [[red supergiants]], and within a few million years their cores will collapse, producing a [[supernova]] explosion. It is now known that red supergiants up to a certain mass explode as the commonly seen [[type II supernova|type II-P supernova]]e, but more massive ones lose their outer layers to become hotter again. Depending on their initial masses and the rate of mass loss, they may explode as [[yellow hypergiant]]s or [[luminous blue variable]]s, or they may become [[Wolf-Rayet star]]s before exploding in a [[Type Ib and Ic supernovae|type Ib or Ic supernova]]. Identifying whether Deneb is currently evolving towards a red supergiant or is already evolving bluewards again would place valuable constraints on the classes of stars that explode as red supergiants and those that explode as hotter stars.&lt;ref name=georgy&gt;{{cite journal |bibcode=2014MNRAS.439L...6G |title=The puzzle of the CNO abundances of α Cygni variables resolved by the Ledoux criterion |journal=Monthly Notices of the Royal Astronomical Society: Letters |volume=439 |issue=1 |pages=L6–L10 |last1=Georgy |first1=Cyril |last2=Saio |first2=Hideyuki |last3=Meynet |first3=Georges |year=2014 |doi=10.1093/mnrasl/slt165 |arxiv=1311.4744 |s2cid=118557550}}&lt;/ref&gt;<br /> <br /> Stars evolving redwards for the first time are most likely fusing hydrogen in a shell around a [[helium]] core that has not yet grown hot enough to start fusion to [[carbon]] and [[oxygen]]. Convection has begun [[Stellar evolution#Mature stars#Mid-sized stars#Red-giant-branch phase|dredging]] up fusion products, but these do not reach the surface. Post-red supergiant stars are expected to show those fusion products at the surface due to stronger convection during the red supergiant phase and due to loss of the obscuring outer layers of the star.
Deneb is thought to be increasing its temperature after a period as a red supergiant, although current models do not exactly reproduce the surface elements seen in its spectrum.&lt;ref name=georgy/&gt;<br /> <br /> ===Variable star===<br /> [[File:AlphaCygLightCurve.png|thumb|left|A [[Photometric_system#Photometric_letters|visual band]] [[light curve]] for Deneb, adapted from Yüce and Adelman (2019)&lt;ref name=&quot;Yuca2019&quot;/&gt;]]<br /> Deneb is the prototype of the [[Alpha Cygni variable|Alpha Cygni]] (α Cygni) [[variable star]]s,&lt;ref name=&quot;Richardson2011&quot; /&gt;&lt;ref name=&quot;Yuca2019&quot;&gt;{{cite journal<br /> |last1=Yüce|first1=K.<br /> |last2=Adelman |first2=S. J.<br /> |title=On the variability of the A0 supergiants 9 Per, HR 1035, 13 Mon, Deneb, and HR 8020 as seen in FCAPT Strömgren photometry<br /> |date=2019<br /> |journal=New Astronomy<br /> |volume=66<br /> |pages=88–99<br /> |doi=10.1016/j.newast.2018.07.002<br /> |bibcode = 2019NewA...66...88Y|s2cid=126285732<br /> }}&lt;/ref&gt; whose small, irregular amplitudes and rapid pulsations cause its magnitude to vary between 1.21 and 1.29.&lt;ref name=&quot;gscvquery&quot;&gt;{{Cite web |url=http://www.sai.msu.su/gcvs/cgi-bin/search.cgi?search=alf+Cyg|title=GCVS Query forms|website=Sternberg Astronomical Institute|access-date=2019-01-07<br /> }}&lt;/ref&gt; Its variable velocity was discovered by Lee in 1910,&lt;ref name=Lee1910/&gt; but it was not formally assigned to a distinct class of variable stars until the 1985 4th edition of the General Catalogue of Variable Stars.&lt;ref name=&quot;GCVS4&quot;&gt;{{cite journal |bibcode=1996yCat.2139....0K |title=VizieR Online Data Catalog: General Catalog of Variable Stars, 4th Ed. (GCVS4) (/gcvs4Kholopov+ 1988) |journal=VizieR On-Line Data Catalog: II/139B. Originally Published in: Moscow: Nauka Publishing House (1985-1988) |volume=2139 |pages=0 |last1=Kholopov |first1=P. N. |last2=Samus' |first2=N. N. |last3=Frolov |first3=M. S. |last4=Goranskij |first4=V. P. |last5=Gorynya |first5=N. A. |last6=Kireeva |first6=N. N. |last7=Kukarkina |first7=N. P. |last8=Kurochkin |first8=N. E. |last9=Medvedeva |first9=G. I. |last10=Perova |first10=N. B.|date=1996}}&lt;/ref&gt; The cause of the pulsations of Alpha Cygni variable stars is not fully understood, but their [[irregular variable|irregular nature]] seems to be due to [[Beat (acoustics)|beat]]ing of multiple pulsation periods. Analysis of radial velocities determined 16 different harmonic pulsation modes with periods ranging between 6.9 and 100.8 days.&lt;ref name=&quot;Lucy1976&quot;/&gt; A longer period of about 800 days probably also exists.&lt;ref name=&quot;Yuca2019&quot; /&gt;<br /> <br /> ===Possible spectroscopic companion===<br /> Deneb has been reported as a possible single-lined spectroscopic [[Binary star|binary]] with a period of about 850 days, where the spectral lines from the star suggest cyclical radial velocity changes.&lt;ref name=&quot;Lucy1976&quot;&gt;{{cite journal|bibcode=1976ApJ...206..499L|title=An analysis of the variable radial velocity of alpha Cygni|journal=Astrophysical Journal|volume=206|pages=499|last1=Lucy|first1=L.
B.|year=1976|doi=10.1086/154405|doi-access=free}}&lt;/ref&gt; Later investigations have found no evidence supporting the existence of a companion.&lt;ref name=&quot;Richardson2011&quot;&gt;{{cite journal |bibcode=2011AJ....141...17R |title=A Five-year Spectroscopic and Photometric Campaign on the Prototypical α Cygni Variable and A-type Supergiant Star Deneb |journal=The Astronomical Journal |volume=141 |issue=1 |pages=17 |last1=Richardson |first1=N. D. |last2=Morrison |first2=N. D. |last3=Kryukova |first3=E. E. |last4=Adelman |first4=S. J. |year=2011 |doi=10.1088/0004-6256/141/1/17 |arxiv=1009.5994 |s2cid=118300333}}&lt;/ref&gt;<br /> <br /> ==Etymology and cultural significance==<br /> [[File:Wide-field view of the Summer Triangle.jpg|thumb|upright=1.2|Wide-field view of the [[Summer Triangle]] and the [[Milky Way]]. Deneb is at the left-centre of the picture.&lt;!-- Not sure if you can find Deneb unless you have experience. --&gt;]]Names similar to Deneb have been given to at least seven different stars, most notably [[Beta Ceti|Deneb Kaitos]], the brightest star in the constellation of [[Cetus]]; [[Delta Capricorni|Deneb Algedi]], the brightest star in [[Capricornus]]; and [[Denebola]], the second brightest star in [[Leo (constellation)|Leo]]. All these names refer to the tail of the animal that the respective constellation represents.<br /> <br /> In Chinese, {{lang|zh|天津}} ({{lang|zh-Latn|Tiān Jīn}}), meaning ''[[Girl (Chinese constellation)|Celestial Ford]]'', refers to an asterism consisting of Deneb, [[Gamma Cygni]], [[Delta Cygni]], [[30 Cygni]], [[Nu Cygni]], [[Tau Cygni]], [[Upsilon Cygni]], [[Zeta Cygni]] and [[Epsilon Cygni]].&lt;ref&gt;{{cite book |author=陳久金|title=中國星座神話|url=https://books.google.com/books?id=0Vex0rYzdu8C|year=2005|publisher=五南圖書出版股份有限公司|isbn=978-986-7332-25-7}}&lt;/ref&gt; Consequently, the [[Chinese star names|Chinese name]] for Deneb itself is {{lang|zh|天津四}} ({{lang|zh-Latn|Tiān Jīn sì}}, {{lang-en|the Fourth Star of the Celestial Ford}}).&lt;ref&gt;{{cite web|language=zh |url=http://www.lcsd.gov.hk/CE/Museum/Space/Research/StarName/c_research_chinengstars_c_d.htm |title=香港太空館 - 研究資源 - 亮星中英對照表] |archive-url=https://web.archive.org/web/20081025110153/http://www.lcsd.gov.hk/CE/Museum/Space/Research/StarName/c_research_chinengstars_c_d.htm |archive-date=2008-10-25 | access-date=2019-01-09 | website=Hong Kong Space Museum}}&lt;/ref&gt;<br /> <br /> In the Chinese love story of [[Qi Xi]], Deneb marks the [[magpie]] bridge across the [[Milky Way]], which allows the separated lovers Niu Lang ([[Altair]]) and Zhi Nü ([[Vega]]) to be reunited on one special night of the year in late summer. In other versions of the story, Deneb is a fairy who acts as chaperone when the lovers meet.<br /> <br /> ===Namesakes===<br /> [[USS Arided (AK-73)|USS ''Arided'']] was a [[United States Navy]] [[Crater class cargo ship|''Crater''-class cargo ship]] named after the star. [[SS Deneb|SS ''Deneb'']] was an Italian merchant vessel that bore this name from 1951 until she was scrapped in 1966.<br /> <br /> ===In fiction===<br /> {{main|Deneb in fiction}}<br /> The star Deneb, and hypothetical planets orbiting it, have been used many times in [[literature]], [[film]], [[electronic game]]s, and [[music]].
Examples include several episodes of the ''[[Star Trek]]'' [[TV series]], the ''[[Silver Surfer]]'' comic book, the [[Rush (band)|Rush]] [[album]]s ''[[A Farewell to Kings]]'' and ''[[Hemispheres (Rush album)|Hemispheres]]'', the ''[[Descent: FreeSpace – The Great War]]'' [[computer game]], ''[[Stellaris (video game)|Stellaris]]'', and the [[science fiction]] [[novel]] ''[[Hyperion (Simmons novel)|Hyperion]]''.<br /> <br /> ==See also==<br /> * [[List of bright stars]]<br /> <br /> ==References==<br /> {{Reflist|30em|refs=}}<br /> <br /> {{Sky|20|41|25.9|+|45|16|49|1400}}<br /> {{Stars of Cygnus}}<br /> {{Portal bar|Astronomy|Stars|Outer space}}<br /> &lt;!-- Properties --&gt;<br /> <br /> [[Category:A-type supergiants]]<br /> [[Category:Alpha Cygni variables]]<br /> [[Category:Emission-line stars]]<br /> &lt;!-- Other --&gt;<br /> [[Category:Northern pole stars]]<br /> [[Category:Cygnus (constellation)]]<br /> [[Category:Bayer objects|Cygni, Alpha]]<br /> [[Category:Durchmusterung objects|BD+44 3541]]<br /> [[Category:Flamsteed objects|Cygni, 50]]<br /> [[Category:Henry Draper Catalogue objects|197345]]<br /> [[Category:Hipparcos objects|102098]]<br /> [[Category:Bright Star Catalogue objects|7924]]<br /> [[Category:Arabic words and phrases]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Xmas&diff=1170367445 Xmas 2023-08-14T16:57:12Z <p>205.189.94.9: /* See also */</p> <hr /> <div>{{Short description|Common abbreviation of the word &quot;Christmas&quot;}}<br /> {{About|the abbreviation|the holiday itself|Christmas|other uses|Xmas (disambiguation)}}<br /> [[Image:Xmas-LHJ-Dec-1922-Coles Phillips.jpg|thumb|right|alt=Illustration of a woman in a gingham dress standing in front of a large Christmas wreath|A 1922 advertisement in ''[[Ladies' Home Journal]]'': &quot;Give her a {{lang|fr|L'Aiglon|italic=no}} for Xmas&quot;]]<br /> '''Xmas''' (also '''X-mas''') is a common [[abbreviation]] of the word ''[[Christmas]]''. It is sometimes pronounced {{IPAc-en|ˈ|ɛ|k|s|m|ə|s}}, but ''Xmas'', and variants such as ''Xtemass'', originated as handwriting abbreviations for the typical pronunciation {{IPAc-en|ˈ|k|r|ɪ|s|m|ə|s}}. The 'X' comes from the [[Greek alphabet|Greek]] letter {{transliteration|grc|[[Chi (letter)|Chi]]}}, which is the first letter of the Greek word {{transliteration|grc|Christós}} ({{Lang-grc-gre|Χριστός|Khristós|anointed, covered in oil}}), which became ''[[Christ (title)|Christ]]'' in English.&lt;ref name=oed-x&gt;{{cite encyclopedia |title=X n. 10. |encyclopedia=[[Oxford English Dictionary]] |publisher=[[Oxford University Press]] |url=http://www.oed.com/view/Entry/230945#eid14045485 |year=2011 |access-date=17 June 2011}}&lt;/ref&gt;<br /> The suffix ''-mas'' is from the Latin-derived [[Old English]] word for [[Mass (liturgy)|Mass]].&lt;ref&gt;[http://www.newadvent.org/cathen/09790b.htm Catholic Encyclopedia: Liturgy of the Mass]. Retrieved 20 December 2007.&lt;/ref&gt;<br /> <br /> There is a [[Common English usage misconceptions|common misconception]] that the word ''Xmas'' stems from a [[Secular religion|secularizing]] tendency to de-emphasize the religious tradition from Christmas,&lt;ref&gt;{{cite book |title=Origins of the Specious: Myths and Misconceptions of the English Language |last1=O'Conner |first1=Patricia T. |last2=Kellerman |first2=Stewart |year=2009 |publisher=Random House |location=New York |isbn=978-1-4000-6660-5 |page=77 |quote=The usual suggestion is that 'Xmas' is{{nbsp}}[...] 
an attempt by the ungodly to x-out Jesus and banish religion from the holiday.}}&lt;/ref&gt; by &quot;taking the Christ out of Christmas&quot;; nevertheless, the term's usage dates back to the 16th century, and corresponds to [[Roman Catholic]], [[Eastern Orthodox]], [[Church of England]], and [[Episcopalian]]{{cn|date=March 2022}} [[liturgy|liturgical]] use of various forms of [[chi-rho]] [[monogram]]. In English, &quot;X&quot; was first used as a [[scribal abbreviation]] for &quot;Christ&quot; in 1100; &quot;X'temmas&quot; is attested in 1551, and &quot;Xmas&quot; in 1721.&lt;ref&gt;''[[Oxford English Dictionary]]'', ''s.v.'' &quot;[https://www.oed.com/view/Entry/230945 X]&quot; (1921 edition) and &quot;[https://www.oed.com/view/Entry/231032 Xmas]&quot; (Third Edition, 2020)&lt;/ref&gt;<br /> <br /> ==Style guides and etiquette==<br /> The term ''Xmas'' is deprecated by some modern [[style guide]]s, including those at the ''[[New York Times]]'',&lt;ref&gt;Siegel, Allan M. and William G. Connolly, [https://books.google.com/books?id=RT5w0s7_op8C&amp;q=Xmas+%22New+York+Times+Manual+of+Style%22 ''The New York Times Manual of Style and Usage''], Three Rivers Press, 1999, {{ISBN|978-0-8129-6389-2}}, pp 66, 365, retrieved via [[Google Books]], December 27, 2008&lt;/ref&gt; ''The New York Times Manual of Style and Usage'', ''[[The Times]]'', ''[[The Guardian]]'', and the [[BBC]].&lt;ref name=bbc04&gt;Griffiths, Emma, [http://news.bbc.co.uk/2/hi/uk_news/magazine/4097755.stm &quot;Why get cross about Xmas?&quot;], BBC website, December 22, 2004. Retrieved December 28, 2008.&lt;/ref&gt; [[Millicent Fenwick]], in the 1948 ''Vogue's Book of Etiquette'', states that &quot;'Xmas' should never be used&quot; in greeting cards.&lt;ref&gt;Fenwick, Millicent, [https://archive.org/details/voguesbookofetiq00fenw &lt;!-- quote=Xmas usage etiquette. 
--&gt; ''Vogue's Book of Etiquette: A Complete Guide to Traditional Forms and Modern Usage''], Simon and Schuster, 1948, p 611, retrieved via Google Books, December 27, 2008; full quote seen on Google Books search page&lt;/ref&gt;&lt;ref&gt;{{Cite book|last1=Siegal|first1=Allan M.|url=https://books.google.com/books?id=RT5w0s7_op8C&amp;q=Xmas+%22New+York+Times+Manual+of+Style%22|title=The New York Times Manual of Style and Usage|last2=Connolly|first2=William G.|date=1999|publisher=Three Rivers Press|isbn=978-0-8129-6389-2|language=en}}&lt;/ref&gt; ''The Cambridge Guide to Australian English Usage'' states that the spelling should be considered informal and restricted to contexts where concision is valued, such as headlines and greeting cards.&lt;ref name=ppcgaeu&gt;Peters, Pam, [https://books.google.com/books?id=nV8h0gnU1UEC&amp;dq=Xmas+usage&amp;pg=RA1-PA872 &quot;Xmas&quot; article], ''The Cambridge Guide to Australian English Usage'', Cambridge University Press, 2007, {{ISBN|978-0-521-87821-0}}, p 872, retrieved via Google Books, December 27, 2008&lt;/ref&gt; ''The Christian Writer's Manual of Style'', while acknowledging the ancient and respectful use of ''Xmas'' in the past, states that the spelling should never be used in [[formal writing]].&lt;ref&gt;Hudson, Robert, [https://books.google.com/books?id=SJyp_PS1rSkC&amp;q=Xmas&amp;pg=PA411 &quot;Xmas&quot; article], ''The Christian Writer's Manual of Style: Updated and Expanded Edition'', Zondervan, 2004, {{ISBN|978-0-310-48771-5}} p 412, retrieved via Google Books, December 27, 2008&lt;/ref&gt;<br /> <br /> ==History==<br /> <br /> ===Use in English===<br /> [[File:PostcardIBringYouAMerryXmas1910.jpg|thumb|right|upright|&quot;Xmas&quot; used on a Christmas postcard, 1910]]<br /> Early use of ''Xmas'' includes Bernard Ward's ''History of St. Edmund's college, Old Hall'' (originally published {{circa|1755}}).&lt;ref name=oed-xmas&gt;{{cite encyclopedia |title=Xmas, n. |encyclopedia=[[Oxford English Dictionary]] |publisher=[[Oxford University Press]] |url=http://www.oed.com/viewdictionaryentry/Entry/231032 |year=2011 |access-date=17 June 2011}}&lt;/ref&gt; An earlier version, ''X'temmas'', dates to 1551.&lt;ref name=oed-xmas/&gt; Around 1100 the term was written as {{lang|ang|Xp̄es mæsse}} in the ''[[Anglo-Saxon Chronicle]]''.&lt;ref name=oed-x/&gt; ''Xmas'' is found in a letter from [[George Woodward (diplomat)|George Woodward]] in 1753.&lt;ref&gt;Mullan, John and Christopher Reid, [https://books.google.com/books?id=qRKOG_JeSQIC&amp;dq=Xmas+%22eighteenth+century%22&amp;pg=PA216 ''Eighteenth-century Popular Culture: A Selection''], Oxford University Press, 2000, {{ISBN|978-0-19-871134-6}}, p 216, retrieved via Google Books, December 27, 2008&lt;/ref&gt; [[Lord Byron]] used the term in 1811,&lt;ref name=mwdeu/&gt; as did [[Samuel Coleridge]] (1801)&lt;ref name=bbc04/&gt; and [[Lewis Carroll]] (1864).&lt;ref name=mwdeu/&gt; In the United States, the fifth American edition of William Perry's ''Royal Standard English Dictionary'', published in Boston in 1800, included in its list of &quot;Explanations of Common Abbreviations, or Contraction of Words&quot; the entry: &quot;Xmas. Christmas.&quot;&lt;ref&gt;{{cite book |last=Perry |first=William |title=The Royal Standard English Dictionary |year=1800 |publisher=Isaiah Thomas &amp; Ebenezer T. 
Andrews|location=Boston |page=56 |url=https://books.google.com/books?id=2KURAAAAIAAJ&amp;pg=PA56}}&lt;/ref&gt; [[Oliver Wendell Holmes, Jr.]] used the term in a letter dated 1923.&lt;ref name=mwdeu/&gt;<br /> <br /> Since at least the late 19th century, ''Xmas'' has been in use in various other English-language nations. Quotations with the word can be found in texts first written in Canada,&lt;ref&gt;Kelcey, Barbara Eileen, ''Alone in Silence: European Women in the Canadian North Before 1940'', McGill-Queen's Press, 2001, {{ISBN|978-0-7735-2292-3}} (&quot;We had singing practice with the white men for the Xmas carols&quot;, written by Sadie Stringer in Peel River, Northwest Territories, Canada), p 50, retrieved via Google Books, December 27, 2008&lt;/ref&gt; and the word has been used in Australia,&lt;ref name=ppcgaeu/&gt; and in the Caribbean.&lt;ref&gt;Alssopp, Richard, [https://books.google.com/books?id=PmvSk13sIc0C&amp;dq=Xmas+usage&amp;pg=PA388 &quot;most1&quot; article]''Dictionary of Caribbean English Usage'', University of the West Indies Press, 2003, {{ISBN|978-976-640-145-0}} (&quot;The most day I enjoy was Xmas day&quot;&amp;nbsp;— Bdos, 1985), p 388, retrieved via Google Books, December 27, 2008&lt;/ref&gt; ''Merriam-Webster's Dictionary of English Usage'' stated that modern use of the term is largely limited to advertisements, headlines and banners, where its conciseness is valued. The association with commerce &quot;has done nothing for its reputation&quot;, according to the dictionary.&lt;ref name=mwdeu/&gt;<br /> <br /> In the United Kingdom, the former [[Church of England]] Bishop of [[Diocese of Blackburn|Blackburn]], [[Alan Chesters (Bishop)|Alan Chesters]], recommended to his clergy that they avoid the spelling.&lt;ref name=bbc04/&gt; In the United States, in 1977 [[New Hampshire]] Governor [[Meldrim Thomson]] sent out a press release saying that he wanted journalists to keep the &quot;Christ&quot; in Christmas, and not call it Xmas—which he called a &quot;[[pagan]]&quot; spelling of 'Christmas'.&lt;ref&gt;{{Cite web|url=https://news.google.com/newspapers?id=JYMuAAAAIBAJ&amp;sjid=lKEFAAAAIBAJ&amp;pg=912,1874288&amp;dq=xmas+christmas+x&amp;hl=en|title=The Montreal Gazette - Google News Archive Search|website=news.google.com}}&lt;/ref&gt;<br /> <br /> ===Use of ''X'' for 'Christ'===<br /> {{For|the article about the χρ symbol|Chi Rho}}<br /> [[Image:Chirho.svg|thumb|upright|The [[Chi-Rho]] is a [[Christianity|Christian]] symbol representing [[Christ]].]]<br /> The abbreviation of Christmas as ''Xmas'' is a source of disagreement among Christians who observe the holiday.<br /> <br /> The December 1957 ''News and Views'' published by the [[Church League of America]], a conservative organization co-founded in 1937 by George Washington Robnett,&lt;ref&gt;{{cite web |url=http://libweb.uoregon.edu/speccoll/guides/conservative.html |title=Subject Guide to Conservative and Libertarian Materials, in Manuscript Collections |publisher=University of Oregon}}&lt;/ref&gt; attacked the use of Xmas in an article titled &quot;X=The Unknown Quantity&quot;. The claims were picked up later by [[Gerald L. K. Smith]], who in December 1966 claimed that Xmas was a &quot;blasphemous omission of the name of Christ&quot; and that &quot;'X' is referred to as being symbolical of the unknown quantity&quot;. 
Smith further argued that the Jewish people had introduced Santa Claus to suppress New Testament accounts of Jesus, and that the United Nations, at the behest of &quot;world Jewry&quot;, had &quot;outlawed the name of Christ&quot;.&lt;ref&gt;{{cite book |author-link=Morris Kominsky |last=Kominsky |first=Morris |year=1970 |title=The Hoaxers: Plain Liars, Fancy Liars and Damned Liars |chapter=The Xmas Hoax |pages=137–138 |isbn=0-8283-1288-5 |publisher=Branden Press |location=Boston}}&lt;/ref&gt; There is, however, a well documented history of use of ''Χ'' (actually the [[Greek language|Greek]] letter {{transliteration|grc|[[Chi (letter)|chi]]}}) as an abbreviation for &quot;Christ&quot; {{lang|grc|(Χριστός)}} and possibly also a symbol of the cross.&lt;ref&gt;{{cite web |url=http://www.ancient-symbols.com/christian_symbols.html |title=Christian Symbols and Their Descriptions |publisher=Ancient-symbols.com |access-date=8 December 2008}}&lt;/ref&gt;{{unreliable source?|date=December 2015}}&lt;ref&gt;{{cite web |url=http://tlc.howstuffworks.com/family/xmas.htm |publisher=tlc.howstuffworks.com |title= Why Is There a Controversy Surrounding the Word 'Xmas'? |access-date= 25 December 2012 |date= 2007-11-21}}&lt;/ref&gt;{{unreliable source?|date=December 2015}} The abbreviation appears on many Orthodox Christian religious icons.<br /> <br /> Dennis Bratcher, writing for Christian website ''The Voice'', states &quot;there are always those who loudly decry the use of the abbreviation 'Xmas' as some kind of blasphemy against Christ and Christianity&quot;.&lt;ref&gt;{{cite web |url=http://www.crivoice.org/symbols/xmasorigin.html |title=The Origin of &quot;Xmas&quot; |publisher=CRI/Voice |date=2007-12-03 |access-date=2009-08-16}}&lt;/ref&gt; Among them are evangelist [[Franklin Graham]] and former [[CNN]] contributor [[Roland S. Martin]]. Graham stated in an interview:<br /> <br /> {{quote|[F]or us as Christians, this is one of the most holy of the holidays, the birth of our savior Jesus Christ. And for people to take Christ out of Christmas. They're happy to say merry Xmas.&lt;ref&gt;{{Cite web |url=https://www.vereeke.com/merry-christmas-messages-sms-whatsapp-facebook-status/ |title=Merry Christmas Messages, SMS, Whatsapp &amp; Facebook Status |last=Amaefule |first=Chigozie |date=2019-12-16 |website=Vereeke |access-date=2020-03-03}}&lt;/ref&gt; Let's just take Jesus out. And really, I think, a war against the name of Jesus Christ.&lt;ref&gt;[http://edition.cnn.com/TRANSCRIPTS/0512/16/ltm.02.html American Morning: A Conversation With Reverend Franklin Graham], CNN (December 16, 2005). Retrieved on December 29, 2009.&lt;/ref&gt;}}<br /> <br /> Roland Martin likewise relates the use of ''Xmas'' to his growing concerns of increasing commercialization and secularization of one of Christianity's highest holy days.&lt;ref&gt;Martin, Roland (December 20, 2007). [http://www.cnn.com/2007/US/12/20/roland.martin/index.html Commentary: You can't take Christ out of Christmas], CNN. Retrieved on December 29, 2009.&lt;/ref&gt; Bratcher posits that those who dislike abbreviating the word are unfamiliar with a long history of Christians using X in place of &quot;Christ&quot; for various purposes.<br /> <br /> The word ''[[Christ]]'' and its compounds, including ''Christmas'', have been abbreviated in English for at least the past 1,000 years, long before the modern ''Xmas'' was commonly used. ''Christ'' was often written as 'Xρ' or 'Xt'; there are references in the ''Anglo-Saxon Chronicle'' as far back as 1021. 
This 'X' and 'P' arose as the [[uppercase]] forms of the [[Greek alphabet|Greek letters]] {{lang|grc|[[Chi (letter)|χ]]}} (Ch) and {{lang|grc|[[rho|ρ]]}} (R) used in ancient abbreviations for {{lang|grc|Χριστος}} (Greek for &quot;Christ&quot;).&lt;ref name=oed-x/&gt; The [[Chi-Rho]], an amalgamation of the two Greek letters rendered as '☧' ([[Unicode]] character {{unichar|2627|chi rho}}) is a symbol often used to represent Christ in [[Catholic Church|Catholic]], [[Protestant]], and [[Eastern Orthodox Church|Orthodox]] Christian Churches.&lt;ref&gt;[http://christiansymbols.net/monograms_2.php Christian Symbols: Chi-Rho] ''Christian Symbols'', Doug Gray, Retrieved 2009-12-07&lt;/ref&gt;<br /> <br /> The ''[[Oxford English Dictionary]]'' (''OED'') and the ''OED Supplement'' have cited usages of ''X-'' or ''Xp-'' for 'Christ-' as early as 1485. The terms ''Xtian'' and less commonly ''Xpian'' have also been used for 'Christian'. The ''OED'' further cites usage of ''Xtianity'' for 'Christianity' from 1634.&lt;ref name=oed-x/&gt; According to ''Merriam-Webster's Dictionary of English Usage'', most of the evidence for these words comes from &quot;educated Englishmen who knew their Greek&quot;.&lt;ref name=mwdeu&gt;[https://books.google.com/books?id=2yJusP0vrdgC&amp;dq=Xmas+usage&amp;pg=PA968 &quot;Xmas&quot; article], ''Merriam-Webster's Dictionary of English Usage'', Merriam-Webster, 1994, p 968, {{ISBN|978-0-87779-132-4}}, retrieved via Google Books, December 27, 2008&lt;/ref&gt;<br /> <br /> In ancient Christian art, {{lang|grc|χ}} and {{lang|grc|χρ}} are abbreviations for Christ's name.&lt;ref&gt;{{cite web |url=http://www.newadvent.org/cathen/10488a.htm |title=Monogram of Christ |publisher=New Advent |date=1911-10-01 |access-date=2009-08-16}}&lt;/ref&gt; In many manuscripts of the ''[[New Testament]]'' and [[icon]]s, 'Χ' is an abbreviation for {{lang|grc|Χριστος}},&lt;ref&gt;{{cite web |title=The 'X' Factor |author=Rev. Steve Fritz |date=December 22, 2012 |access-date=December 25, 2012 |publisher=Lancaster Online |url=http://lancasteronline.com/article/local/794883_The--X--factor.html#ixzz2FyDSVafN}}&lt;/ref&gt; as is XC (the first and last letters in Greek, using the lunate [[sigma (letter)|sigma]]);&lt;ref&gt;''Church Symbolism: An Explanation of the more Important Symbols of the Old and New Testament, the Primitive, the Mediaeval and the Modern Church'' by Frederick Roth Webber (2nd. edition, 1938). {{OCLC|236708}}&lt;/ref&gt; compare IC for [[Jesus]] in Greek.<br /> <br /> ====Other uses of ''X(t)'' for 'Chris(t)-'====<br /> Other proper names containing the name 'Christ' besides those mentioned above are sometimes abbreviated similarly, either as ''X'' or ''Xt'', both of which have been used historically,&lt;ref&gt;http://www.all-acronyms.com/XT./Christ/1136835 &quot;Abbreviation: Xt.&quot; Date retrieved: 19 Dec. 2010.&lt;/ref&gt; e.g., ''Xtopher'' or ''Xopher'' for 'Christopher', or ''Xtina'' or ''Xina'' for the name 'Christina'.{{Citation needed|date=September 2020}}<br /> <br /> In the 17th and 18th centuries, ''Xene'' and ''Exene'' were common spellings for the given name 'Christine'.{{Citation needed|date=September 2020}} The American singer [[Christina Aguilera]] has sometimes gone by the name &quot;Xtina&quot;. 
Similarly, [[Exene Cervenka]] has been a noted American singer-songwriter since 1977.<br /> <br /> This usage of 'X' to spell the syllable ''kris'' (rather than the sounds ''ks'') has extended to ''xtal'' for '[[crystal]]', and on [[florist]]s' signs to ''xant'' for '[[chrysanthemum]]',&lt;ref&gt;{{cite web|url=http://everything2.com/index.pl?node_id=13693&amp;lastnode_id=0 |title=X |publisher=Everything 2 |access-date=2009-08-16}}&lt;/ref&gt;{{User-generated source|date=September 2020}} even though these words are not etymologically related to ''Christ'': ''crystal'' comes from a Greek word meaning 'ice' (and not even using the letter {{lang|grc|χ}}), and ''chrysanthemum'' comes from Greek words meaning 'golden flower', while ''Christ'' comes from a Greek word meaning 'anointed'.<br /> <br /> ==Popular culture==<br /> * In the animated TV series [[Futurama]], Christmas is referred to just as &quot;Xmas&quot;, in speech and writing.<br /> <br /> ==See also==<br /> * [[Christogram]]<br /> * [[Christmas controversies]]<br /> * [[Labarum]]<br /> * [[Names and titles of Jesus]]<br /> * [[Star of Bethlehem]]<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==External links==<br /> {{Commons category}}<br /> {{Wiktionary|Xmas}}<br /> * [http://www.goarch.org/en/multimedia/audio/images/christ.jpg An icon of Christ featuring the abbreviations IC and XC in the upper corners]<br /> * [http://news.bbc.co.uk/2/hi/uk_news/magazine/4097755.stm &quot;Why get cross about Xmas?&quot;] (BBC, December 22, 2004)<br /> <br /> {{Christmas}}<br /> <br /> [[Category:Christmas| ]]<br /> [[Category:16th-century neologisms]]<br /> [[Category:Abbreviations]]<br /> [[Category:Linguistic controversies]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=List_of_occult_terms&diff=1170367328 List of occult terms 2023-08-14T16:56:17Z <p>205.189.94.9: /* X */</p> <hr /> <div>{{Short description|none}}<br /> The [[occult]] is a category of [[supernatural]] beliefs and practices, encompassing such phenomena as those involving [[mysticism]], [[spirituality]], and [[Magic (supernatural)|magic]] in terms of any otherworldly agency. It can also refer to other non-religious supernatural ideas like [[extra-sensory perception]] and [[parapsychology]].<br /> <br /> The occult (from the [[Latin]] word ''occultus'' &quot;clandestine, hidden, secret&quot;) is &quot;knowledge of the hidden&quot;.&lt;ref&gt;[[George Crabb (writer)|Crabb, G.]] (1927). ''English synonyms explained, in alphabetical order, copious illustrations and examples drawn from the best writers''. New York: Thomas Y. Crowell Co.&lt;/ref&gt; In common usage, ''occult'' refers to &quot;knowledge of the [[paranormal]]&quot;, as opposed to &quot;knowledge of the [[measurable]]&quot;,&lt;ref&gt;[[Evelyn Underhill|Underhill, E.]] (1911). ''Mysticism'', Meridian, New York.&lt;/ref&gt; usually referred to as [[science]]. The term is sometimes taken to mean knowledge that &quot;is meant only for certain people&quot; or that &quot;must be kept hidden&quot;, but for most practicing occultists it is simply the study of a deeper spiritual reality that extends beyond [[Speculative reason|pure reason]] and the physical sciences.&lt;ref&gt;[[H.P. Blavatsky|Blavatsky, H. P.]] (1888). ''The Secret Doctrine''. Whitefish, MT: [[Kessinger Publishing]].&lt;/ref&gt; The terms ''[[esotericism|esoteric]]'' and ''arcane'' can also be used to describe the occult,&lt;ref&gt;Houghton Mifflin Company. (2004). ''The American Heritage College Thesaurus''. 
Boston: Houghton Mifflin. Page 530.&lt;/ref&gt;&lt;ref&gt;Wright, C. F. (1895). An outline of the principles of modern theosophy. Boston: New England Theosophical Corp.&lt;/ref&gt; in addition to their meanings unrelated to the supernatural. The term ''occult sciences'' was used in the 16th century to refer to [[astrology]], [[alchemy]], and [[natural magic]], which today are considered [[pseudosciences]].<br /> <br /> The term ''occultism'' emerged in 19th-century France, where it came to be associated with various French [[Western esotericism|esoteric]] groups connected to [[Éliphas Lévi]] and [[Papus]], and in 1875 was introduced into the [[English language]] by the esotericist [[Helena Blavatsky]]. Throughout the 20th century, the term was used [[idiosyncrasy|idiosyncratically]] by a range of different authors, but by the 21st century was commonly employed – including by academic scholars of esotericism – to refer to a range of esoteric currents that developed in the mid-19th century and their descendants. Occultism is thus often used to categorise such esoteric traditions as [[Spiritualism]], [[Theosophy]], [[Anthroposophy]], the [[Hermetic Order of the Golden Dawn]], and [[New Age]].<br /> <br /> It also describes a number of [[magical organization]]s or orders, the teachings and practices taught by them, and to a large body of current and historical literature and spiritual philosophy related to this subject.<br /> <br /> {{Expand list|date=August 2008}}<br /> {{compact ToC|side=yes|top=yes|num=yes}}<br /> <br /> == A ==<br /> *[[Abatur]] (Mandaean)<br /> *[[Abbey of Thelema]]<br /> *[[Abramelin oil]]<br /> *[[Acupuncture]]<br /> *[[Adept]]<br /> *[[Advent]], (Christ.)<br /> *[[Aeon (Gnosticism)]]<br /> *[[Aether theories|Aether]]<br /> *[[Akashic Records]]<br /> *[[Akhetaten]]<br /> *[[Alchemy]]<br /> *[[All Souls' Day]]<br /> *[[Alomancy]]<br /> *[[Alphabet of Desire]]<br /> *[[Altar cruet]]<br /> *[[Amulet]]<br /> * Amulet, (Christian) see [[New Testament amulet]]<br /> *[[Anachitis]], a form of divination stone<br /> *[[Anasyrma]] (Greek)<br /> *[[Anchimayen]] (Mapuche- S. 
America)<br /> *[[Animism]]<br /> *[[Ankh]]<br /> *[[Anointing]], see also Holy anointing oil<br /> *[[Anthesteria]], (Greek)<br /> *[[Anthroposophy]]<br /> *[[Apotheosis]]<br /> *Apparitions - See [[Ghost]]<br /> *[[Argenteum Astrum]]<br /> *[[Ariosophy]]<br /> *[[Ascended master]]<br /> *[[Aspergillum]]<br /> *[[Astral projection]], see also Soul flight<br /> *[[Astrological age]]<br /> *[[Astrological aspect]]<br /> *[[Astrology]]<br /> *[[Astrology and alchemy]]<br /> *[[Astrology and the classical elements]]<br /> *[[Astrology and numerology]]<br /> *[[Astrotheology]]<br /> *[[Athame]]<br /> *[[Aura (paranormal)|Aura]]<br /> *[[Augury]] (interpreting omens)<br /> *[[Automatic writing]]<br /> <br /> == B ==<br /> *[[Banishing]]<br /> *[[Banshee]]<br /> *[[Baphomet]]<br /> *[[Beltane]] <br /> *[[Benedicaria]] (Italian)<br /> *[[Bibliomancy]]<br /> *[[Biosophy]]<br /> *[[Biorhythm (pseudoscience)]]<br /> *[[Black magic]]<br /> *[[Black Mass]]<br /> *[[Black Sun (occult symbol)|Black Sun]]<br /> *[[Blarney Stone]]<br /> *Blue (color paint), see [[Haint blue]]<br /> *[[Body of light]]<br /> *[[Boline]]<br /> *[[Book of shadows]]<br /> <br /> == C ==<br /> *[[Candle]]<br /> *[[Cartomancy]] (divination using playing cards)<br /> *[[Cauldron]]<br /> *[[Censer]], see also Thurible<br /> *[[Centiloquium]]<br /> *[[Ceremonial magic]]<br /> *[[Chalice (cup)|Chalice]], see also Ciborium<br /> *[[Chaos magic]]<br /> *[[Charmstone]]<br /> *[[Chinese astrology]]<br /> *[[Chromotherapy]]<br /> *[[Church grim]] (Christ.-Eng./Nord.)<br /> *[[Ciborium (container)|Ciborium]], see also Chalice<br /> *[[Cilice]]<br /> *[[Circumambulation]]<br /> *[[Clairaudience]] (ability to hear voices &amp; sounds super-normally- spirited voices alleging to be those of dead people giving advice or warnings)<br /> *[[Clairsentience]] (supernormal sense perception)<br /> *[[Clairvoyance]] (ability to see objects or events spontaneously or supernormally above their normal range of vision- second sight)<br /> *[[Cleromancy]]<br /> *[[Coco (folklore)]]<br /> *[[Color symbolism]]<br /> *Color therapy see [[Chromotherapy]]<br /> *[[Cone of power]]<br /> *[[Conjuration (summoning)|Conjuration]] (summoning up a spirit by incantation)<br /> * Cosmology, see [[Religious cosmology]]<br /> *[[Coven]] (a community of witches)<br /> *[[Crossroads (folklore)]]<br /> *[[Crystal gazing]], see also Scrying<br /> *[[Cult]]<br /> *[[Cunning folk traditions and the Latter Day Saint movement]]<br /> *[[Curse]]<br /> <br /> == D ==<br /> *[[Da'at]]<br /> *[[Deal with the Devil]]<br /> *[[Déjà vu]]<br /> *[[Demonology]]<br /> *[[Demiurge]]<br /> *[[Discernment of Spirits]] (Christ.)<br /> *[[Djembe]] (West Africa)<br /> *[[Devil]], see also [[Satan]]<br /> *[[Divination]]<br /> *[[Dowsing]]<br /> *[[Dragon]]<br /> *[[Drak (mythology)]] (Germ.)<br /> *[[Dream interpretation]]<br /> *[[Dybbuk]]<br /> <br /> == E ==<br /> *[[Earth mysteries]]<br /> *[[Esbat]]<br /> *[[Ectenia]] ( E. 
Orthodox)<br /> *[[Ectoplasm (paranormal)|Ectoplasm]] (unknown substance from body of a medium)<br /> *[[Eight-circuit model of consciousness]]<br /> *[[Ein Sof]]<br /> *[[Elemental]]<br /> *[[Incantation|Enchanting]]<br /> *[[Energy (esotericism)]]<br /> *[[English Qaballa]]<br /> *[[Enochian]]<br /> *[[Ephemeris]]<br /> *[[Extrasensory perception|E.S.P.]] (extra sensory perception)<br /> *[[Esoteric Christianity]], see also Gnosticism<br /> *[[Esoteric cosmology]]<br /> *[[Esotericism]], see also Exoteric<br /> *[[Entheogen]]<br /> *[[Evil eye]]<br /> *[[Evocation]]<br /> *[[Exorcism]]<br /> *[[Exoteric]], see also [[Esotericism]] (for Esoteric)<br /> *[[Eucharist]]<br /> <br /> == F ==<br /> *[[Fama Fraternitatis]]<br /> *[[Familiar spirit]]<br /> *[[Fasting in religion]]<br /> *[[Feng shui]]<br /> *[[Feri Tradition]]<br /> *[[Fern flower]] (Baltic region &amp; Slavic beliefs)<br /> *[[Figs in the Bible]] (Biblical)<br /> *[[Filakto]] (E. Euro.)<br /> *[[Firewalking]]<br /> *[[Florida Water]]<br /> *[[Flying ointment]]<br /> *[[Folk belief]]<br /> *[[Folk religion]]<br /> *Food- see [[Sacred food as offering]], (see also [[Libation]])<br /> *[[Four-leaf clover]]<br /> *[[Fortune-telling]]<br /> *[[Fraternal order]], see also [[List of general fraternities]]<br /> *[[Freemason]]<br /> <br /> == G ==<br /> *[[Galdr]] (Old Norse)<br /> *[[Gargoyle]]<br /> *[[Gematria]], see also Numerology<br /> *[[Gemstones in the Bible]], see also [[List of plants in the Bible]]<br /> *[[Geocentric model]], see also Heliocentrism<br /> *[[Geomancy]]<br /> *[[Geomantic figures]]<br /> *[[Ghouls]]<br /> *[[Ghost hunting]]<br /> * Glossolalia, see Speaking in tongues<br /> *[[Gnome]]<br /> *[[Gnosis]]<br /> *[[Gnosis (chaos magic)]]<br /> *[[Gnosticism]], see also Esoteric Christianity<br /> *[[Gnostic mass]]<br /> *[[Goblin]]<br /> *[[Goetia]]<br /> *[[Golem]]<br /> *[[Gradobranitelj]]<br /> *[[Graphology]]<br /> *[[Gray magic]]<br /> *[[Great Work (Hermeticism)]]<br /> *[[Great Work (Thelema)]]<br /> *[[Greater and lesser magic]]<br /> *[[Grimoire]]<br /> *[[Guardian angel]]<br /> <br /> == H ==<br /> *[[Hadit]]<br /> *[[Haint blue]] (Hoodoo)<br /> *[[Hamingja]] (Norse)<br /> *[[Hamsa]]<br /> *[[Hand of Glory]]<br /> *[[Haruspex]]<br /> *[[Haunted House|Haunted]]<br /> *[[Hedgewitch]]<br /> *[[Heliocentrism]] see also [[Geocentric model]]<br /> *[[Hellfire club]]<br /> *[[Hermeticism]]<br /> *[[Hestia]] (Greek)<br /> *[[Hexagram]]<br /> *[[Curse|Hex]]<br /> *Holy Guardian Angel, see [[Guardian angel]]<br /> *[[Holy anointing oil]], see also Anointing<br /> *[[Holy water]]<br /> *[[Homeopathy]]<br /> *[[Homunculus]]<br /> *[[Hoodoo (folk magic)|Hoodoo]]<br /> *[[Huaychivo]], (Mayan)<br /> *[[Huna (New Age)|Huna]]<br /> <br /> == I ==<br /> *[[I Ching]]<br /> *[[Ifa]]<br /> *[[Imbolc]]<br /> *[[Imp]]<br /> *[[Incantation]]<br /> *[[Incense]]<br /> *[[Iconoclasm]]<br /> *[[Incorruptibility]]<br /> *[[Incubus]], see also Succubus<br /> *[[Initiation]]<br /> *[[Invocation]]<br /> <br /> == J ==<br /> *[[Jinx]]<br /> *[[Jumbee]] (Colombian, Venezuela, Caribbean)<br /> *[[Juju]]<br /> <br /> == K ==<br /> *[[Kabbalah]]<br /> *[[Karzełek]] (Slavik)<br /> *[[Kia (magic)]]<br /> *[[Kirlian Photography]]<br /> *[[Kumina]] (Afro-Jamaican)<br /> *[[Kundalini energy]]<br /> <br /> == L ==<br /> *[[Lamen (magic)]]<br /> *[[Lammas]]<br /> *[[Lampadomancy]]<br /> *[[Law of contagion]]<br /> *[[Left-hand path and right-hand path]]<br /> *[[Ley line]]<br /> *[[Lesser banishing ritual of the pentagram]]<br /> *[[Libation]], 
see also [[Sacred food as offering]]<br /> * Light, see [[Ceremonial use of lights]]<br /> *[[Liminality]]<br /> *[[List of alchemists]]<br /> *[[List of astrologers]]<br /> *[[List of channelers (mediumship)]]<br /> *[[Lists of deities]]<br /> *[[List of general fraternities]]<br /> *[[List of lucky symbols|List of good-luck charms]]<br /> *[[Lists of legendary creatures]]<br /> *[[List of lunar deities]]<br /> *[[List of Mesopotamian deities]]<br /> *[[List of mythological objects]]<br /> *[[List of Old Testament pseudepigrapha]]<br /> *[[List of plants in the Bible]], see also [[Gemstones in the Bible]]<br /> *[[List of solar deities]]<br /> *[[List of spirituality-related topics]]<br /> *[[List of Thelemites]]<br /> *[[List of theological demons]]<br /> *[[List of occultists]]<br /> *[[List of occult symbols]]<br /> *[[List of psychic abilities]]<br /> *[[List of vampiric creatures in folklore]]<br /> *List of religion see [[Outline of religion]]<br /> *[[Liturgy]]<br /> *[[Literomancy]]<br /> *[[Lithomancy]]<br /> *[[Lemures]] (Roman)<br /> *[[Lucifer]]<br /> *[[Luciferianism]]<br /> *[[Lughnasadh]] (Gaelic)<br /> <br /> == M ==<br /> *[[Magic (paranormal)]]<br /> *[[Magic circle]]<br /> *[[Magic square]]<br /> *[[Magic word]]<br /> *[[Magical formula]]<br /> *[[Magical thinking]]<br /> *[[Magick]]<br /> *[[Maleficium (sorcery)]]<br /> *[[Mami Wata]]<br /> *[[Martinist]]<br /> *[[Mass (liturgy)]]<br /> *[[Mathers table]]<br /> *[[Maypole]]<br /> *[[Mediumship]]<br /> *[[Melchizedek]]<br /> *[[Melchizedek priesthood (Latter Day Saints)]]<br /> *[[Mephistopheles]]<br /> *[[Merkabah mysticism]]<br /> *[[Mesmerism]]<br /> *[[Metaphysics]]<br /> *[[Methods of divination]]<br /> *[[Metoposcopy]] see also Phrenology, Physiognomy<br /> *[[Mojo (African-American culture)]], see also [[Sachet]]<br /> *[[Molybdomancy]]<br /> *[[Mood ring]]<br /> *[[Western esotericism|Mystery religion]]<br /> *[[Mysticism]]<br /> <br /> == N ==<br /> *[[Nagual]]<br /> *[[Navigium Isidis]], (Roman/Egyptian)<br /> *[[Necromancy]]<br /> *[[Necronomicon]]<br /> *[[Neodruidism]]<br /> *[[Neopaganism]]<br /> *[[Neotantra]], see also [[Tantra]] &amp; [[Plastic shaman]]<br /> *[[New Age]]<br /> *[[Neoshamanism]]<br /> *[[New Thought]]<br /> *[[Night terror]], see also Sleep paralysis<br /> *[[Nisse (folklore)]] (Scandinavian)<br /> *[[Noa-name]] (Polynesian)<br /> *[[Nominalism]]<br /> *[[Nuit]]<br /> *[[Numen]] (Latin)<br /> *[[Numerology]], see also Gematria<br /> <br /> == O ==<br /> *[[Obeah and Wanga]]<br /> *[[Obsession (Spiritism)]], see also Spirit possession<br /> *[[Occultism]]<br /> *[[Odic force]]<br /> *[[Omen]]<br /> *[[Oneiromancy]]<br /> *[[Oneironautics]]<br /> *[[Onychomancy]]<br /> *[[Orans]]<br /> *[[Ordination]]<br /> *[[Osculum infame]]<br /> *[[Otherworld]]<br /> *[[Ouija]]<br /> *[[Ouroboros]]<br /> <br /> == P ==<br /> *[[Paganism]]<br /> *[[Palmistry]]<br /> *[[Paranormal]]<br /> *[[paraphernalia]]<br /> *[[Parapsychology]]<br /> *[[Paten]] (Cath.)<br /> *[[Peijaiset]] (Finnish)<br /> *[[Pentacle]]<br /> *[[Penuel]]<br /> *[[Philosopher's stone]]<br /> *[[Phrenology]], see also Metoposcopy<br /> *[[Physiognomy]], see also Metoposcopy<br /> *[[Planetary hours]]<br /> *[[Planchette]]<br /> *[[Plastic shaman]]<br /> *[[Pneumatic (Gnosticism)]]<br /> *[[Podea]] <br /> *[[Poltergeist]]<br /> *[[Poppet]]<br /> *[[Potion]]<br /> *[[Power Animal]]<br /> *[[Pow-wow (folk magic)]]<br /> *[[Precognition]], see also Retrocognition<br /> *[[Pseudepigrapha]]<br /> *[[Psionics]]<br /> *[[Psychic]]<br /> 
*[[Psychic surgery]]<br /> *[[Psychic vampire]]<br /> *[[Psychopomp]]<br /> *[[Psychonautics]]<br /> *[[Psychometry (paranormal)|Psychometry]]<br /> *[[Pyramid power]]<br /> *[[Pyx]] (Cath.)<br /> <br /> == Q ==<br /> *[[Qabalah]]<br /> *[[Qareen]]<br /> *[[Quantum mysticism]]<br /> *[[Ceremonial magic#History|Quareia]]<br /> <br /> == R ==<br /> *[[Radiesthesia]]<br /> *[[Radionics]]<br /> *[[Regalia]]<br /> *[[Reiki]]<br /> *[[Renaissance magic]]<br /> *[[Retrocognition]], see also Precognition<br /> *[[Rhabdomancy]]<br /> *[[Rosicrucianism]]<br /> *[[Rougarou]], see also Werewolf<br /> *[[Rūḥ]] , Islamic<br /> *[[Rumpology]]<br /> *[[Runecasting]]<br /> *[[Runic magic]]<br /> *[[Rusalka]] (Slavic)<br /> <br /> == S ==<br /> *[[Sabbath]], see also [[Witches' Sabbath]] <br /> *[[Sachet]], see also [[Mojo (African-American culture)]]<br /> *[[Sacrament]]<br /> *Salamander, see [[Cultural depictions of salamanders]]<br /> * Salt, see [[Blessed salt]]<br /> *[[Samhain]]<br /> *[[Satan]], see also [[Devil]]<br /> *[[Satanic panic]]<br /> *[[Satanism]]<br /> *[[Scrying]], see also Crystal gazing<br /> *[[Séance]]<br /> *[[Secret Chiefs]]<br /> *[[Seer stone (Latter Day Saints)]]<br /> *[[Seeress (Germanic)]]<br /> *[[Seidr]] (Old Norse)<br /> *[[Selkie]] (Celtic &amp; Norse)<br /> *[[Siren (mythology)|Siren]]<br /> *[[Sefirot]]<br /> *[[Servitor (chaos magic)]]<br /> *[[Seven Rays]]<br /> *[[Seventh son of a seventh son]]<br /> *[[Sex magic]]<br /> *[[Shachihoko]] (Jap.)<br /> *[[Shadow person]]<br /> *[[Shamanism]]<br /> *[[Sheela na gig]]<br /> *[[Shekhinah]] (Hebrew)<br /> *[[Shem HaMephorash]] (Hebrew)<br /> *[[Shrovetide]], (pre-Lent)<br /> *[[Sidereal and tropical astrology]]<br /> *[[Sefirot]] (Hebrew)<br /> *[[Sigil (magic)|Sigil]]<br /> *[[Sigil of Baphomet]]<br /> *[[Sigillum Dei]]<br /> *[[Simiyya]], (Islamic)<br /> *[[Simurgh]] (Persian)<br /> *[[Skin-walker]]<br /> *[[Sleep paralysis]], see also Night terror<br /> *[[Smudging]]<br /> *[[Maleficium (sorcery)|Sorcery]]<br /> *[[Soul flight]], see also Astral projection<br /> *[[Speaking in tongues]], see also [[Xenoglossy]]<br /> *[[Spell (paranormal)|Spell]]<br /> *[[Spirit possession]], see also Obsession (Spiritism)<br /> *[[Spiritual warfare]] (Christian)<br /> *[[Square and Compasses]]<br /> *[[Stigmata]]<br /> *[[Stole (vestment)]] (Cath.)<br /> *[[Stregheria]]<br /> *[[Subtle body]]<br /> *[[Succubus]], see also Incubus<br /> *[[Sunwise]] (deosil), see also Widdershins<br /> *[[Supernatural]]<br /> *[[Sylph]]<br /> *[[Sympathetic magic]]<br /> *[[Synchromysticism]]<br /> <br /> == T ==<br /> *[[Table of correspondences]]<br /> *[[Tabernacle]] (Hebrew Bibl.)<br /> *[[Talisman]]<br /> *[[Tantra]], see also [[Neotantra]]<br /> *[[Tau cross]]<br /> *[[Tau (mythology)]]<br /> *[[Tau robe]]<br /> *[[Tarot divination]]<br /> *[[Tattva vision]]<br /> *[[Theophany]]<br /> *[[Theurgy]]<br /> *[[Thurible]], see also Censer<br /> *[[Tithe]]<br /> *Tools see [[Magical tools in Wicca]]<br /> *[[Totem]]<br /> *[[Transubstantiation]] (Cath.)<br /> *[[Trance]]<br /> *[[Tree of life]]<br /> *[[Tree of life (biblical)]]<br /> *[[True Will]]<br /> <br /> == U ==<br /> *[[Unclean spirit]]<br /> *[[Underworld]]<br /> *[[Undine]]<br /> *[[Unicursal hexagram]]<br /> *[[Ukehi]] (Jap.)<br /> *[[Urim and Thummim]]<br /> *[[Ursa Major]]<br /> <br /> == V ==<br /> *[[Valkyrie]]<br /> *[[Vampire]]<br /> *[[Vashtu]], East Indian version of feng shui (geomancy)<br /> *[[Vision (spirituality)]], see also [[Theophany]]<br /> *[[Vision quest]]<br /> 
*[[Vitalism]]<br /> *[[Vlach]] (Balkan)<br /> *[[Votive offering]]<br /> *[[Voodoo death]]<br /> *[[Voodoo doll]]<br /> * Vodun- see [[West African Vodun]]<br /> *[[Vril]]<br /> <br /> == W ==<br /> *[[Walpurgis Night]]<br /> *[[Wake (ceremony)]]<br /> *[[Wand]] <br /> *[[White magic]]<br /> *[[Wicca]]<br /> *[[Widdershins]], see also Sunwise (deosil)<br /> *[[Witchcraft]]<br /> *[[Witches' Sabbath]]<br /> *[[Witch ball]]<br /> * Witch hunt- see [[Modern witch-hunts]]<br /> *[[Witchcraft and divination in the Hebrew Bible]]<br /> *[[Will-o'-the-wisp]]<br /> *[[Werewolf]], see also Rougarou<br /> *[[Wendigo]]<br /> *[[Worship]]<br /> <br /> == X ==<br /> *[[Xenoglossy]], see also [[Speaking in tongues]]<br /> *[[Xipe Totec]] (Aztec)<br /> *[[Xmas]] (esoteric/occult views)<br /> *[[Xmucane and Xpiacoc]] (Mayan)<br /> *[[Xōchipilli]] (Aztec)<br /> *[[Xōchiquetzal]] (Aztec, classical Nahuatl)<br /> <br /> == Y ==<br /> *[[Ya sang]]<br /> *[[Yahweh]]<br /> *[[Yesod]]<br /> *[[Yggdrasil]] (Old Norse)<br /> *[[Yowie]] (Aust. abor.)<br /> *[[Yule]]<br /> <br /> == Z ==<br /> *[[Zalmoxis]]<br /> *Zamzam water (Islamic) see [[Zamzam Well]]<br /> *[[Zduhać]] (Serb.)<br /> *[[Zener cards]]<br /> *[[Zephyrus]] (Greek)<br /> *[[Zeus]] (Greek)<br /> *[[Zodiac]]<br /> *[[Zohar]]<br /> *[[Zombie]]<br /> *[[Zoroastrianism]]<br /> *[[Zorya]] (Slavic)<br /> *[[Zos Kia Cultus]]<br /> <br /> == References ==<br /> {{Reflist}}<br /> <br /> [[Category:Magical terminology| ]]<br /> [[Category:Occult|*]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Talk:List_of_occult_terms&diff=1170367047 Talk:List of occult terms 2023-08-14T16:54:16Z <p>205.189.94.9: /* Reg Xmas removal */ new section</p> <hr /> <div>{{WikiProjectBannerShell|1=<br /> {{WikiProject Index}}<br /> {{WikiProject Thelema|class=list|importance=mid}}<br /> {{WikiProject Occult|class=list|importance=mid}}<br /> {{WikiProject Lists|class=list|importance=mid}}<br /> {{WikiProject Skepticism|class=list|importance=low}}<br /> }}<br /> <br /> <br /> <br /> ==Untitled==<br /> Once upon a time I added [[Alphabet of Desire]] to this article, and now its being directed to the [[Austin Osman Spare]] page. I was hoping someone could start the page from there. Why is it being directed to Spares article? I'm desperately searching for sources to create the article myself, still! [[User:SynergeticMaggot|Zos]] 03:45, 11 June 2006 (UTC)<br /> :Good place to point it meanwhile. Just edit the redirect page when you are ready to add. [http://en.wikipedia.org/w/index.php?title=Alphabet_of_Desire&amp;action=edit] -[[User:999|999]] ([[User_talk:999|Talk]]) 04:15, 11 June 2006 (UTC)<br /> <br /> == Acupuncture ==<br /> <br /> Really? I don't think this belongs here. 
[[User:Spiel|Spiel]] ([[User talk:Spiel|talk]]) 21:33, 15 November 2020 (UTC)<br /> <br /> == Christian &amp; Islamic terms ==<br /> <br /> Included in the list <br /> Because there are Christian occultist, neo gnostics, mormons, Hoodoo practitioners, South American occult practitioners, Rosicrucians, Freemasons, fringe Christians (eg The Warrens, Ed &amp; Lorraine)into the occult.<br /> etc<br /> ~~ED~~ [[Special:Contributions/2607:FEA8:4A2:4100:2176:A766:F475:203|2607:FEA8:4A2:4100:2176:A766:F475:203]] ([[User talk:2607:FEA8:4A2:4100:2176:A766:F475:203|talk]]) 02:29, 25 April 2023 (UTC)<br /> <br /> == Reg Xmas removal ==<br /> <br /> Theres esoteric/occult views &amp; articles about Xmas on the net.<br /> Retoring it.<br /> <br /> Richard [[Special:Contributions/205.189.94.9|205.189.94.9]] ([[User talk:205.189.94.9|talk]]) 16:54, 14 August 2023 (UTC)</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=York_Centre_(provincial_electoral_district)&diff=1170008946 York Centre (provincial electoral district) 2023-08-12T17:50:49Z <p>205.189.94.9: /* Members of Provincial Parliament */ 1999, MARKHAM, Markham Unionville riding not created into 2000s.</p> <hr /> <div>{{for|the federal and municipal electoral divisions|York Centre|Ward 6 York Centre}}<br /> {{Infobox Canada electoral district<br /> | province = Ontario<br /> | image = Ontario 2018 York Centre.svg<br /> | caption = York Centre in relation to the other Toronto ridings (2015 boundaries)<br /> | prov-status = active<br /> | prov-created = <br /> | prov-abolished = <br /> | prov-election-first = 1999<br /> | prov-election-last = 2022<br /> | prov-rep = Michael Kerzner<br /> | prov-rep-link = <br /> | prov-rep-party = PC<br /> | prov-rep-party-link = <br /> | demo-census-date = 2016<br /> | demo-pop = 104320<br /> | demo-electors = 70520<br /> | demo-electors-date = 2018<br /> | demo-area = 35<br /> | demo-cd = [[Toronto]]<br /> | demo-csd = [[Toronto]]<br /> }}<br /> [[Image:York Centre Elections Canada map 35118 (2015 boundaries).gif|250px|thumb|right|Map of York Centre]]<br /> [[Image:York Centre, Toronto.png|250px|thumb|right|York Centre from 2003 to 2018]]<br /> [[Image:York Centre (2003 federal riding map).png|thumb|250px|right|Map of York Centre under 2003 boundaries]]<br /> <br /> '''York Centre''' is a provincial [[electoral district (Canada)|electoral district]] in [[Ontario]], [[Canada]], that has been the name of ridings in the [[Legislative Assembly of Ontario]] three different times. It was created initially in 1955 from the southern part of York North. It was dissolved in 1963 when it was split into three ridings called Yorkview, Downsview and Armourdale. In 1967, it was reconstituted north of Steeles in the township of Markham. This lasted until 1999 when it was dissolved into [[Markham—Unionville (provincial electoral district)|Markham—Unionville]]. The name was given to a new riding formed in its original location south of Steeles. 
This riding still exists today.<br /> <br /> ==Boundaries==<br /> <br /> ===1955 to 1963===<br /> The original boundaries consisted of Steeles Avenue West to the north, Yonge Street to the east, Lawrence Avenue West to the south, and the Humber River to the west.<br /> <br /> ===1963 to 1999===<br /> <br /> ===1999 to present===<br /> York Centre consists of the part of the City of Toronto within the [[North York]] district bounded on the north by the northern city limit, and on the east, south and west by a line drawn from the city limit south along Yonge Street, west along the hydroelectric transmission line north of Finch Avenue West, south along Bathurst Street, southeast along the Don River West Branch, southwest and west along Highway 401, north along Jane Street, east along Sheppard Avenue West, northwest along Black Creek, east along Grandravine Drive, and north along Keele Street to the city limit.<br /> <br /> ==History==<br /> The provincial electoral district was created in 1999 when provincial ridings were defined to have the same borders as federal ridings.<br /> <br /> Before 1999, the name York Centre was assigned to a completely different riding located in [[York Regional Municipality, Ontario|York Region]] north of [[Toronto]] with none of the same territory as the current York Centre. In 1999, much of the old York Centre was absorbed by the new riding of [[Vaughan—King—Aurora]]. The predecessor of the current riding was [[Wilson Heights (electoral district)|Wilson Heights]].<br /> <br /> ==Members of Provincial Parliament==<br /> <br /> {{OntMPP|York Centre}}<br /> {{OntMPP NoData|''Riding created from'' [[York North (Ontario provincial electoral district)|York North]]}}<br /> {{OntMPP Row<br /> | FromYr=1955<br /> | ToYr=1959<br /> | Assembly#=25<br /> | OntParty=Progressive Conservative<br /> | RepName=Thomas Graham<br /> | RepLink=Thomas Graham (Canadian politician)<br /> | PartyTerms#=1<br /> | RepTerms#=1<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1959<br /> | ToYr=1963<br /> | Assembly#=26<br /> | OntParty=Liberal<br /> | RepName=Vernon Singer<br /> | PartyTerms#=1<br /> | RepTerms#=1<br /> }}<br /> {{OntMPP NoData|''Riding dissolved into'' [[Yorkview]], [[Downsview (electoral district)|Downsview]] ''and'' [[Armourdale (provincial electoral district)|Armourdale]]}}<br /> {{OntMPP NoData|''Riding re-created''}}<br /> {{OntMPP Row<br /> | FromYr=1967<br /> | ToYr=1971<br /> | Assembly#=28<br /> | OntParty=Liberal<br /> | RepName=Donald Deacon<br /> | PartyTerms#=4<br /> | RepTerms#=2<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1971<br /> | ToYr=1975<br /> | Assembly#=29<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1975<br /> | ToYr=1977<br /> | Assembly#=30<br /> | RepName=Alfred Stong<br /> | RepTerms#=2<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1977<br /> | ToYr=1981<br /> | Assembly#=31<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1981<br /> | ToYr=1985<br /> | Assembly#=32<br /> | OntParty=PC<br /> | RepName=Don Cousens<br /> | PartyTerms#=2<br /> | RepTerms#=2<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1985<br /> | ToYr=1987<br /> | Assembly#=33<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1987<br /> | ToYr=1990<br /> | Assembly#=34<br /> | OntParty=Liberal<br /> | RepName=Greg Sorbara<br /> | PartyTerms#=2<br /> | RepTerms#=2<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1990<br /> | ToYr=1995<br /> | Assembly#=35<br /> }}<br /> {{OntMPP Row<br /> | FromYr=1995<br /> | ToYr=1999<br /> | Assembly#=36<br /> | OntParty=PC<br /> | RepName=Al Palladini<br /> | PartyTerms#=1<br /> | 
RepTerms#=1<br /> }}<br /> {{OntMPP NoData|''Riding dissolved into'' [[Vaughan—King—Aurora]] ''and'' [[Markham (provincial electoral district)|Markham]]}}<br /> {{OntMPP NoData|''Riding re-created from'' [[Downsview (electoral district)|Downsview]] ''and'' [[Wilson Heights (electoral district)|Wilson Heights]]}}<br /> {{OntMPP Row<br /> | FromYr=1999<br /> | ToYr=2003<br /> | Assembly#=37<br /> | OntParty=Liberal<br /> | RepName=Monte Kwinter<br /> | PartyTerms#=5<br /> | RepTerms#=5<br /> }}<br /> {{OntMPP Row<br /> | FromYr=2003<br /> | ToYr=2007<br /> | Assembly#=38<br /> }}<br /> {{OntMPP Row<br /> | FromYr=2007<br /> | ToYr=2011<br /> | Assembly#=39<br /> }}<br /> {{OntMPP Row<br /> | FromYr=2011<br /> | ToYr=2014<br /> | Assembly#=40<br /> }}<br /> {{OntMPP Row<br /> | FromYr=2014<br /> | ToYr=2018<br /> | Assembly#=41<br /> }}<br /> {{OntMPP Row<br /> | FromYr=2018<br /> | ToYr=2021<br /> | Assembly#=42<br /> | #ByElections = 1<br /> | OntParty=PC<br /> | RepName=[[Roman Baber]]<br /> | RepTerms#=2<br /> }}<br /> {{OntMPP Row<br /> | FromYr = 2021<br /> | ToYr = 2022<br /> | OntParty = Independent<br /> | PartyTerms# = 1<br /> }}<br /> {{OntMPP Row<br /> | FromYr=2022<br /> | ToYr=<br /> | Assembly#=43<br /> | OntParty=PC<br /> | RepName=Michael Kerzner<br /> }}<br /> {{OntMPP NoData|<br /> &lt;small&gt;Sourced from the Ontario Legislative Assembly&lt;/small&gt;&lt;ref&gt;For a listing of each MPP's Queen's Park ''curriculum vitae'' see below:<br /> * For Thomas Graham's Legislative Assembly information see {{cite web<br /> |title=Michael Bryant, MPP <br /> |url=http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=1222 <br /> |work=Parliamentary History <br /> |publisher=Legislative Assembly of Ontario <br /> |access-date=2014-09-08 <br /> |location=Toronto <br /> |year=2014 <br /> }}<br /> * For Vernon Singer's Legislative Assembly information see {{cite web<br /> |title=Vernon Singer, MPP <br /> |url=http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=616 <br /> |work=Parliamentary History <br /> |publisher=Legislative Assembly of Ontario <br /> |access-date=2014-09-08 <br /> |location=Toronto <br /> |year=2014 <br /> }}<br /> * For Donald Deacon's Legislative Assembly information see {{cite web<br /> |title=Donald Deacon, MPP <br /> |url=http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=1062 <br /> |work=Parliamentary History <br /> |publisher=Legislative Assembly of Ontario <br /> |access-date=2014-09-08 <br /> |location=Toronto <br /> |year=2014 <br /> }}<br /> * For Alfred Stong's Legislative Assembly information see {{cite web<br /> |title=Alfred Stong, MPP <br /> |url=http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=625 <br /> |work=Parliamentary History <br /> |publisher=Legislative Assembly of Ontario <br /> |access-date=2014-09-08 <br /> |location=Toronto <br /> |year=2014 <br /> }}<br /> * For Don Cousens's Legislative Assembly information see {{cite web<br /> |title=W. 
Donald Cousens, MPP <br /> |url=http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=378 <br /> |work=Parliamentary History <br /> |publisher=Legislative Assembly of Ontario <br /> |access-date=2014-09-08 <br /> |location=Toronto <br /> |year=2014 <br /> }}<br /> * For Greg Sorbara's Legislative Assembly information see {{cite web<br /> |title=Greg Sorbara, MPP <br /> |url=http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=90 <br /> |work=Parliamentary History <br /> |publisher=Legislative Assembly of Ontario <br /> |access-date=2014-09-08 <br /> |location=Toronto <br /> |year=2014 <br /> |url-status=dead <br /> |archive-url=https://web.archive.org/web/20140908202525/http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=90 <br /> |archive-date=September 8, 2014 <br /> }}<br /> * For Al Palladini's Legislative Assembly information see {{cite web<br /> |title=Al Palladini, MPP <br /> |url=http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=346 <br /> |work=Parliamentary History <br /> |publisher=Legislative Assembly of Ontario <br /> |access-date=2014-09-08 <br /> |location=Toronto <br /> |year=2014 <br /> }}<br /> * For Monte Kwinter's Legislative Assembly information see {{cite web<br /> |title=Monte Kwinter, MPP <br /> |url=http://www.ontla.on.ca/web/members/members_all_detail.do?locale=en&amp;ID=56 <br /> |work=Parliamentary History <br /> |publisher=Legislative Assembly of Ontario <br /> |access-date=2014-09-08 <br /> |location=Toronto <br /> |year=2014 <br /> }}&lt;/ref&gt;}}<br /> {{OntMPP End}}<br /> <br /> ==Election results==<br /> {{CANelec/top|ON|2022|York Centre (provincial electoral district)|York Centre|prelim=yes |percent=yes|change=yes}}<br /> {{CANelec|ON|PC|Michael Kerzner|12,875|45.79|-4.36}}<br /> {{CANelec|ON|Liberal|Shelley Brown|8,984|31.95|+10.56}}<br /> {{CANelec|ON|NDP|Frank Chu|3,935|14.00|-9.44}}<br /> {{CANelec|ON|Green|Alison Lowney|799|2.84|+0.55}}<br /> {{CANelec|ON|Ontario Party|Nick Balaskas|679|2.41| }}<br /> {{CANelec|ON|New Blue|Don Pincevero|411|1.46| }}<br /> {{CANelec|ON|Special Needs|Lionel Wayne Poizner|184|0.65| }}<br /> {{CANelec|ON|NOTA|Mark Dewdney|169|0.60|-0.67}}<br /> {{CANelec|ON|Moderate|Parviz Isganderov|80|0.28|-0.09}}<br /> {{CANelec/total|Total valid votes|28,116|}} <br /> {{CANelec/total|Total rejected, unmarked and declined ballots|||}}<br /> {{CANelec/total|Turnout||38.38|-14.54}}<br /> {{CANelec/total|Eligible voters|73,248}} <br /> {{CANelec/hold|ON|PC|-7.56}}<br /> {{CANelec/source|Source: Elections Ontario&lt;ref&gt;{{cite web |title=Candidates in: York Centre (120) |url=https://voterinformationservice.elections.on.ca/en/election/4-general-election-jun-2-2022/122-york-south-weston?tab=candidates |publisher=Elections Ontario |access-date=May 5, 2022}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{CANelec/top|ON|2018|percent=yes|change=yes|prelim=no}}<br /> {{CANelec|ON|PC|[[Roman Baber]]|18,434|50.15|+19.29}}<br /> {{CANelec|ON|NDP|Andrea Vásquez Jiménez|8,617|23.44|+7.07}}<br /> {{CANelec|ON|Liberal|Ramon Estaris|7,865|21.39|-26.72}}<br /> {{CANelec|ON|Green|Roma Lyon|843|2.29|-0.96}}<br /> {{CANelec|ON|NOTA|Cherie Ann Day|467|1.27|}}<br /> {{CANelec|ON|Libertarian|Benjamin Kamminga|398|1.08|}}<br /> {{CANelec|ON|Moderate|Alexander Leonov|137|0.37|}}<br /> {{CANelec/total|Total valid votes|36,761|98.51}} <br /> {{CANelec/total|Total rejected, unmarked and declined ballots|556|1.49|}}<br /> {{CANelec/total|Turnout|37,317|52.92|}}<br /> 
{{CANelec/total|Eligible voters|70,520}} <br /> {{CANelec/gain|ON|PC|Liberal|+23.01}}<br /> {{CANelec/source|Source: [[Elections Ontario]]&lt;ref&gt;{{cite web|url=https://www.elections.on.ca/content/dam/NGW/sitecontent/2018/results/officialresults-yellowbook/votescastbycandidate/pdf/Valid%20Votes%20Cast%20for%20Each%20Candidate%20-%202018%20Provincial%20General%20Election.pdf|title=Summary of Valid Votes Cast for each Candidate |page=11|publisher=Elections Ontario|access-date=20 January 2019}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! colspan=&quot;4&quot; | [[2014 Ontario general election|2014 general election]] redistributed results&lt;ref&gt;{{cite web | url=https://www.electionprediction.org/2018_on/riding/22.php | title=22 - Don Valley West }}&lt;/ref&gt;<br /> |-<br /> ! bgcolor=&quot;#DDDDFF&quot; width=&quot;130px&quot; colspan=&quot;2&quot; | Party<br /> ! bgcolor=&quot;#DDDDFF&quot; width=&quot;50px&quot; | Vote<br /> ! bgcolor=&quot;#DDDDFF&quot; width=&quot;30px&quot; | %<br /> |-<br /> | {{Canadian party colour|ON|Liberal|background}} | &amp;nbsp;<br /> | [[Ontario Liberal Party|Liberal]] ||align=right| 14,556 ||align=right| 48.12<br /> |-<br /> | {{Canadian party colour|ON|PC|background}} | &amp;nbsp;<br /> | [[Progressive Conservative Party of Ontario|Progressive Conservative]] ||align=right| 9,333 ||align=right| 30.85<br /> |-<br /> | {{Canadian party colour|ON|NDP|background}} | &amp;nbsp;<br /> | [[Ontario New Democratic Party|New Democratic]] ||align=right| 4,953 ||align=right| 16.37<br /> |-<br /> | {{Canadian party colour|ON|Green|background}} | &amp;nbsp;<br /> | [[Green Party of Ontario|Green]] ||align=right| 984 ||align=right| 3.25<br /> |-<br /> | {{Canadian party colour|ON|Independents|background}} | &amp;nbsp;<br /> | Others ||align=right| 425 ||align=right| 1.40<br /> |}<br /> <br /> {{CANelec/top|ON|2014|percent=yes|change=yes|prelim=no}}<br /> {{CANelec|ON|Liberal|[[Monte Kwinter]]|16,935|47.22|+2.68}}<br /> {{CANelec|ON|PC|Avi Yufest|11,125|31.02|-4.50}}<br /> {{CANelec|ON|NDP| John Fagan|5,645|15.74|+1.61}}<br /> {{CANelec|ON|Green|Josh Borenstein|1,156|3.27|+1.62}}<br /> {{CANelec|ON|Freedom|Laurence Cherniak|489|1.38|+1.05}}<br /> {{CANelec/total|Total valid votes|35,350|100.0 &amp;nbsp;||}}<br /> {{CANelec/hold|ON|Liberal|+3.66}}<br /> {{CANelec/source|Source: [[Elections Ontario]]&lt;ref&gt;{{cite web|url=http://wemakevotingeasy.ca/en/general-election-district-results.aspx?d=104 |date=2014 |publisher=[[Elections Ontario]] |title=General Election Results by District, 104 York Centre |access-date=17 June 2014 |url-status=dead |archive-url=https://web.archive.org/web/20140617111104/http://wemakevotingeasy.ca/en/general-election-district-results.aspx?d=104 |archive-date=June 17, 2014 }}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{CANelec/top|ON|2011|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Monte Kwinter]]|14,694|45.36|-3.37}}<br /> {{CANelec|ON|PC|Michael Mostyn|11,506|35.52|+3.24 }}<br /> {{CANelec|ON|NDP|John Fagan|4,579|14.13|+3.26 }}<br /> {{CANelec|ON|Libertarian|David Epstein|846|2.61|&amp;nbsp; }}<br /> {{CANelec|ON|Green|Yuriy Shevyryov|535|1.65|-4.81 }}<br /> {{CANelec|XX|Independent|Jeff Pancer|127|0.39|&amp;nbsp; }}<br /> {{CANelec|ON|Freedom|Ron Tal|108|0.33|&amp;nbsp; }}<br /> {{CANelec/total|Total valid votes|32,395|100.00| | }}<br /> {{CANelec/total|Total rejected, unmarked and declined ballots|325|0.99}}<br /> {{CANelec/total|Turnout|32,720|45.74}}<br /> {{CANelec/total|Eligible 
voters|71,531}}<br /> {{CANelec/hold|ON|Liberal|-3.31}}<br /> {{CANelec/source|Source: Elections Ontario&lt;ref&gt;{{cite web|url=http://www.wemakevotingeasy.ca/media/EO_Site/official_GE/ED104-F0244.pdf |publisher=[[Elections Ontario]] |date=2011 |title=Official return from the records / Rapport des registres officiels - York Centre |access-date=6 June 2014 }}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{Canadian election result/top|ON|2007|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Monte Kwinter]]| 16,646| 48.73|-10.68 }}<br /> {{CANelec|ON|PC|Igor Toutchinski| 11,028| 32.28|+7.45 }}<br /> {{CANelec|ON|NDP|Claudia Rodriguez| 3,713| 10.87|-0.17 }}<br /> {{CANelec|ON|Green|Marija Minic| 2,207| 6.46|+1.73 }} <br /> {{CANelec|ON|Family Coalition|Marilyn Carvalho| 568| 1.66|&amp;nbsp; }}<br /> {{Canadian election result/total|Total valid votes| |100.0 | | }}<br /> {{end}}<br /> <br /> {{Canadian election result/top|ON|2003|percent=yes|change=yes}} <br /> {{CANelec|ON|Liberal|[[Monte Kwinter]]| 18,808| 59.41| -1.68}}<br /> {{CANelec|ON|PC|Dan Cullen| 7,862| 24.83| -1.6}}<br /> {{CANelec|ON|NDP|Matthew Norrish| 3,494| 11.04| +0.34}}<br /> {{CANelec|ON|Green|Constantine Kritsonis| 1,496| 4.73|&amp;nbsp;}}<br /> {{Canadian election result/total|Total valid votes| 31,660|100.0 | | }}<br /> {{end}}<br /> <br /> {{Canadian election result/top|ON|1999|percent=yes}} <br /> {{CANelec|ON|Liberal|[[Monte Kwinter]]| 21,250| 61.09}}<br /> {{CANelec|ON|PC|Robert Hausman| 9,192| 26.43}}<br /> {{CANelec|ON|NDP|Norm Jesin| 3,721| 10.70}}<br /> {{CANelec|ON|Natural Law|Angus Hunt| 621| 1.79}}<br /> {{Canadian election result/total|Total valid votes| 34,784|100.0 | | }}<br /> {{end}}<br /> <br /> ==2007 electoral reform referendum==<br /> <br /> {| border=1 cellpadding=4 cellspacing=0 style=&quot;margin: 1em 1em 1em 0; background: #f9f9f9; border: 1px #aaa solid; border-collapse: collapse; font-size: 95%; clear:both&quot;<br /> |- style=&quot;background-color:#E9E9E9&quot;<br /> ! colspan=4|[[2007 Ontario electoral reform referendum]]<br /> |- style=&quot;background-color:#E9E9E9&quot;<br /> ! colspan=2 style=&quot;width: 130px&quot;|Side<br /> ! style=&quot;width: 50px&quot;|Votes<br /> ! style=&quot;width: 40px&quot;|%<br /> |-<br /> |bgcolor=&quot;blue&quot;|<br /> |'''First Past the Post'''<br /> |'''19,223'''<br /> |'''59.8'''<br /> |-<br /> |bgcolor=&quot;green&quot;|<br /> |Mixed member proportional<br /> |12,907<br /> |40.2<br /> |-<br /> |bgcolor=&quot;white&quot;|<br /> !Total valid votes<br /> | 32,130<br /> |100.0<br /> |}<br /> <br /> ==Historic election results==<br /> <br /> ===1987 boundaries===<br /> <br /> {{Canadian election result/top|ON|1995|percent=yes|change=yes}}<br /> {{CANelec|ON|PC|[[Al Palladini]]|37,897|48.94|+25.24}}<br /> {{CANelec|ON|Liberal|Mario Ferri|29,150|37.65|-8.03}}<br /> {{CANelec|ON|NDP|T. S. 
Joseph Thevarkunnel|6,698|8.65|-21.97}}<br /> {{CANelec|ON|Family Coalition|Giuseppi Gori|1,891|2.44|&amp;nbsp;}}<br /> {{CANelec|ON|Libertarian|Robert Ede|1,792|2.31|&amp;nbsp;}}<br /> {{Canadian election result/total|Total valid votes|77,428|100.0}}<br /> {{Canadian election result/source|Source:Elections Ontario&lt;ref&gt;{{cite web|url=http://results.elections.on.ca/results/1995_results/valid_votes.jsp?e_code=36&amp;rec=0&amp;district=york+centre&amp;flag=E&amp;layout=G|title=Summary of Valid Ballots by Candidate|publisher=Elections Ontario|date=1995-06-08|access-date=2012-09-04}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{Canadian election result/top|ON|1990|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Greg Sorbara]]|28,056|45.57|-16.05}}<br /> {{CANelec|ON|NDP|Laurie Orrent|18850|30.62|+12.33}}<br /> {{CANelec|ON|PC|Don McGuire|14,656|23.81|+3.73}}<br /> {{Canadian election result/total|Total valid votes|61,562|100.0}}<br /> {{Canadian election result/source|Source: ''The Toronto Daily Star''&lt;ref&gt;{{cite news|title=How Metro-Area Voted|newspaper=The Toronto Daily Star|date=1990-09-07|location=Toronto|page=A10}}&lt;/ref&gt;&lt;ref group =&quot;nb&quot;&gt;390 out of 391 polls reporting.&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{Canadian election result/top|ON|1987|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Greg Sorbara]]|26,096|62.44|+28.61}}<br /> {{CANelec|ON|PC|Doug Mason|8,605|19.83|-30.43}}<br /> {{CANelec|ON|NDP|Joe Licastro|7,692|17.73|+6.24}}<br /> {{Canadian election result/total|Total valid votes|43.393|100.0}}<br /> {{Canadian election result/source|Source: ''The Toronto Daily Star''&lt;ref&gt;{{cite news|title=How Metro-Area Voted|newspaper=The Toronto Daily Star|date=1987-09-11|location=Toronto|page=A12}}&lt;/ref&gt;&lt;ref group =&quot;nb&quot;&gt;301 out of 308 polls reporting.&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> ===1974 boundaries===<br /> <br /> {{Canadian election result/top|ON|1985|percent=yes|change=yes}}<br /> {{CANelec|ON|PC|[[Donald Cousens]]|29,652|50.40|+3.09}}<br /> {{CANelec|ON|Liberal|Ron Maheu|19,484|33.12|-9.37}}<br /> {{CANelec|ON|NDP|Diane Meaghan|7,089|12.05|+2.55}}<br /> {{CANelec|XX|Independent|Stewart Cole|2,607|4.43|&amp;nbsp;}}<br /> {{Canadian election result/total|Total valid votes|58,832|100.0}}<br /> {{Canadian election result/source|Source:''Ottawa Citizen''&lt;ref name=&quot;1985 Election Results&quot;&gt;{{cite news|author=Canadian Press|title=The night the Tories tumbled; riding by riding results|newspaper=Ottawa Citizen |url=https://news.google.com/newspapers?id=Gzc0AAAAIBAJ&amp;sjid=hvUIAAAAIBAJ&amp;pg=1464%2C1195068 |date=1985-05-03|location=Toronto|page=43|access-date=2012-05-10}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{Canadian election result/top|ON|1981|percent=yes|change=yes}}<br /> {{CANelec|ON|PC|[[Donald Cousens]]|18,369|47.31|+7.55}}<br /> {{CANelec|ON|Liberal|[[Alfred Stong]]|16,495|42.49|-1.92}}<br /> {{CANelec|ON|NDP|John Campey|3,689|9.50|-6.33}}<br /> {{Canadian election result/total|Total valid votes|38,823|100.0}}<br /> {{Canadian election result/source|Source: ''The Windsor Star''&lt;ref name=&quot;1981 Election Results&quot;&gt;{{cite news|author=Canadian Press|title=Election results for Metro Toronto ridings|newspaper=The Windsor Star |url=https://news.google.com/newspapers?id=0NtYAAAAIBAJ&amp;sjid=QlIMAAAAIBAJ&amp;pg=6285%2C1391429 |date=1981-03-20|location=Windsor, Ontario|page=22|access-date=2012-05-10}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{Canadian election 
result/top|ON|1977|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Alfred Stong]]|17,608|44.41|+3.69}}<br /> {{CANelec|ON|PC|Bill Corcoran|15,768|39.76|+2.95}}<br /> {{CANelec|ON|NDP|Chris Olsen|6,277|15.83|-6.16}}<br /> {{Canadian election result/total|Total valid votes|39,653|100.0}}<br /> {{Canadian election result/source|Source: Canadian Press&lt;ref name=&quot;1977 Election Results&quot;&gt;{{cite news|author=Canadian Press|title=How they voted in Metro area |newspaper=The Toronto Daily Star |date=1977-06-10|location=Toronto|page=A10}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{Canadian election result/top|ON|1975|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Alfred Stong]]|14,347|40.72|-1.55}}<br /> {{CANelec|ON|PC|[[Tony Roman]]|12,968|36.81|-4.86}}<br /> {{CANelec|ON|NDP|Tony Snedker|7,748|21.99|+5.93}}<br /> {{CANelec|XX|Independent|John White|171|0.49|&amp;nbsp;}}<br /> {{Canadian election result/total|Total valid votes|35,234|100.0}}<br /> {{Canadian election result/source|Source: Canadian Press&lt;ref name=&quot;1975 Election Results&quot;&gt;{{cite news|author=Canadian Press|title=Results from the 29 ridings in Metro |newspaper=The Toronto Daily Star |date=1975-09-19|location=Toronto|page=A18}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> ===1966 boundaries===<br /> <br /> {{Canadian election result/top|ON|1971|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Donald Deacon]]|14,885|42.27|+1.66}}<br /> {{CANelec|ON|PC|[[Tony Roman]]|14,674|41.67|+5.06}}<br /> {{CANelec|ON|NDP|Roy Clifton|5,657|16.06|-6.73}}<br /> {{Canadian election result/total|Total valid votes|35,216|100.0}}<br /> {{Canadian election result/source|Source: Canadian Press&lt;ref name=&quot;1971 Election Results&quot;&gt;{{cite news|author=Canadian Press|title=Here's who won on the Metro ridings|newspaper=The Toronto Daily Star|date=1971-10-22|location=Toronto|page=12}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> {{Canadian election result/top|ON|1967|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Donald Deacon]]|9,991|40.61|+4.04}}<br /> {{CANelec|ON|PC|Lorne Wells|9,006|36.61|+4.72}}<br /> {{CANelec|ON|NDP|Jim Norton|5,606|22.79|-8.12}}<br /> {{Canadian election result/total|Total valid votes|24,603|100.0}}<br /> {{Canadian election result/source|Source: Canadian Press&lt;ref name=&quot;1967 Election Results&quot;&gt;{{cite news|author=Canadian Press|title=Tories win, but...|newspaper=The Windsor Star|date=1967-10-18|location=Windsor, Ontario|page=B2|url=https://news.google.com/newspapers?id=TDM_AAAAIBAJ&amp;sjid=VVEMAAAAIBAJ&amp;pg=3673%2C2835192|access-date=2012-04-30}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> ===1950s===<br /> <br /> {{Canadian election result/top|ON|1959|percent=yes|change=yes}}<br /> {{CANelec|ON|Liberal|[[Vernon Singer]]|15,702|36.57|+2.12}}<br /> {{CANelec|ON|PC|[[Thomas Graham (Canadian politician)|Thomas Graham]]|13,695|31.89|-5.82}}<br /> {{CANelec|ON|CCF|Fred Young|13,272|30.91|+4.94}}<br /> {{CANelec|XX|Independent|George Rolland|270|0.63|&amp;nbsp;}}<br /> {{Canadian election result/total|Total valid votes|42,939|100.0}}<br /> {{Canadian election result/source|Source: Canadian Press&lt;ref name=&quot;1959 Election Results&quot;&gt;{{cite news|author=Canadian Press|title=Complete Results of Ontario Voting by Constituencies|newspaper=The Ottawa Citizen|date=1959-06-12|location=Ottawa|page=26|url=https://news.google.com/newspapers?id=Yh0yAAAAIBAJ&amp;sjid=d-QFAAAAIBAJ&amp;pg=6095%2C2812390|access-date=2012-04-22}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> 
{{Canadian election result/top|ON|1955|percent=yes}}<br /> {{CANelec|ON|PC|[[Thomas Graham (Canadian politician)|Thomas Graham]]|12,648|37.71}}<br /> {{CANelec|ON|Liberal|[[Frederick Joseph McMahon]]|11,553|34.45}}<br /> {{CANelec|ON|CCF|[[Fred Young (Ontario politician)|Fred Young]]|8,710|25.97}}<br /> {{CANelec|ON|Labor-Progressive|Stephen Endicott|646|1.93}}<br /> {{Canadian election result/total|Total valid votes|33,537|100.0}}<br /> {{Canadian election result/source|Source: Canadian Press&lt;ref name=&quot;1955 Election Results&quot;&gt;{{cite news|author=Canadian Press|title=Complete Results of Ontario Voting by Constituencies|newspaper=The Ottawa Citizen|date=1955-06-10|location=Ottawa|page=4|url=https://news.google.com/newspapers?id=vCAvAAAAIBAJ&amp;sjid=TN0FAAAAIBAJ&amp;pg=4268%2C2256424|access-date=2012-04-22}}&lt;/ref&gt;}}<br /> {{end}}<br /> <br /> ==References==<br /> <br /> ===Notes===<br /> {{Reflist|group=nb}}<br /> <br /> ===Citations===<br /> {{Reflist}}<br /> <br /> ==External links==<br /> *[https://web.archive.org/web/20071012015710/http://www.elections.on.ca/en-CA/Tools/PastResults.htm Elections Ontario Past Election Results]<br /> *[https://www.elections.on.ca/content/dam/NGW/sitecontent/2017/preo/2017atlasmaps/120YorkCentreAtlasMap.pdf Map of riding for 2018 election]<br /> <br /> {| width=&quot;75%&quot; border=&quot;2&quot; align=&quot;center&quot;<br /> |-----<br /> | width=&quot;25%&quot; align=&quot;center&quot; |<br /> | width=&quot;50%&quot; align=&quot;center&quot; | '''North:''' [[Thornhill (provincial electoral district)|Thornhill]]<br /> | width=&quot;25%&quot; align=&quot;center&quot; |<br /> |-----<br /> | align=&quot;center&quot; | '''West:''' [[Humber River—Black Creek (provincial electoral district)|Humber River—Black Creek]]<br /> | align=&quot;center&quot; | '''York Centre'''<br /> | align=&quot;center&quot; | '''East:''' [[Willowdale (provincial electoral district)|Willowdale]]<br /> |-----<br /> | align=&quot;center&quot; |<br /> | align=&quot;center&quot; | '''South:''' [[Eglinton—Lawrence (provincial electoral district)|Eglinton—Lawrence]], [[York South—Weston (provincial electoral district)|York South—Weston]]<br /> | align=&quot;center&quot; |<br /> |}<br /> <br /> {{ON-ED}}<br /> <br /> {{coord|43.7627|N|79.4456|W|display=title}}<br /> <br /> [[Category:Ontario provincial electoral districts]]<br /> [[Category:North York]]<br /> [[Category:Provincial electoral districts of Toronto]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Monte_Kwinter&diff=1170007969 Monte Kwinter 2023-08-12T17:43:28Z <p>205.189.94.9: tenure in office, riding by riding.</p> <hr /> <div>{{more citations|date=July 2023}}<br /> {{Short description|Canadian politician (1931–2023)}}<br /> {{Use Canadian English|date=September 2021}}<br /> {{Use dmy dates|date=September 2021}}<br /> {{Infobox officeholder<br /> | name = Monte Kwinter<br /> | honorific-suffix = <br /> | image = Monte Kwinter - MPP York Center - 2006.jpg<br /> | caption = MPP Kwinter in 2006<br /> | parliament1 = Ontario Provincial<br /> | term_start1 = May 2, 1985<br /> | term_end1 = June 7, 2018<br /> | predecessor1 = [[David Rotenberg]]<br /> | successor1 = [[Roman Baber]]<br /> | riding1 = [[York Centre (provincial electoral district)|York Centre]]&lt;br /&gt;&lt;small&gt;[[Wilson Heights (electoral district)|Wilson Heights]] (1985–1999)&lt;/small&gt;<br /> | party = [[Ontario Liberal Party|Liberal]]<br /> | birth_date = {{birth date|1931|3|22}}<br /> | birth_place = [[Toronto, Ontario]], 
Canada<br /> | death_date = {{death date and age|2023|7|21|1931|3|22}}<br /> | death_place = <br /> | residence = <br /> | occupation = Real estate agent<br /> | website = {{official website | http://montekwinter.onmpp.ca/}}<br /> }}<br /> <br /> '''Monte Kwinter''' (March 22, 1931 – July 21, 2023) was a Canadian politician in [[Ontario]]. He was a [[Ontario Liberal Party|Liberal]] member of the [[Legislative Assembly of Ontario]] from 1985 until 2018. He represented the ridings of [[Wilson Heights (electoral district)|Wilson Heights]] from 1985 to 1999, and [[York Centre (provincial electoral district)|York Centre]] from 1999 to 2018. Kwinter was a [[Political minister|cabinet minister]] in the government of [[David Peterson]] from 1985 to 1990 and also in [[Dalton McGuinty]]'s government from 2003 to 2007. Kwinter was the oldest person ever to be an MPP in Ontario, although at his death [[Raymond Cho (politician)|Raymond Cho]] was seven months shy of surpassing him.<br /> <br /> On January 26, 2013, Kwinter became the oldest person to ever serve in the Ontario legislature at the age of 81 years 310 days, surpassing previous record holder [[Lex MacKenzie]], who was 81 years and 309 days old when he left provincial politics in 1967.&lt;ref name=TorStar20130124&gt;{{cite news|last=Brennan|first=Richard J.|title=Monte Kwinter becomes oldest serving MPP in Ontario history this weekend|url=https://www.thestar.com/news/canada/2013/01/24/monte_kwinter_becomes_oldest_serving_mpp_in_ontario_history_this_weekend.html|newspaper=Toronto Star|date=January 24, 2013}}&lt;/ref&gt;<br /> <br /> On July 20, 2017, Kwinter announced that he would not be seeking re-election in the upcoming [[42nd Ontario general election|2018 election]] and that &quot;the time has come to let the next generation serve, and I look forward to offering my support to our future York Centre Liberal MPP.&quot;&lt;ref&gt;{{cite press release| url=https://ontarioliberal.ca/news/5970b8c79b36dd03c4ecffda?l=EN| title=Monte Kwinter MPP for York Centre not seeking re-election in 2018| date=July 20, 2017| publisher=[[Ontario Liberal Party]]| access-date=July 20, 2017| archive-url=https://web.archive.org/web/20171007021043/https://ontarioliberal.ca/news/5970b8c79b36dd03c4ecffda?l=EN| archive-date=October 7, 2017| url-status=dead}}&lt;/ref&gt;<br /> <br /> ==Background==<br /> Monte Kwinter was born in Toronto on March 22, 1931.&lt;ref&gt;[https://books.google.com/books?id=hfsLAQAAMAAJ&amp;q=Monte+Kwinter+1931]&lt;/ref&gt; He was educated at the [[Ontario College of Art]], [[Syracuse University]], the [[Massachusetts Institute of Technology]], the [[Institute of Contemporary Art, Boston]], and the [[Université de Montréal]]. He held a degree in [[fine arts]], specializing in industrial design.<br /> <br /> Kwinter worked in real estate before entering political life, eventually owning his own firm in the field.
He was also a founding member of the Toronto Regional Council of [[B'nai Brith Canada]], served on the boards of directors of the Upper Canadian Zoological Society and the [[Canadian National Exhibition]], chaired the [[Toronto Harbour Commission]] and the [[Toronto Humane Society]], was vice-president of the Ontario College of Art, and served as an executive member on the League for Human Rights of B'nai B'rith Canada.&lt;ref&gt;{{Cite web|url=https://news.ontario.ca/newsroom/en|title=Newsroom : Recent News|website=news.ontario.ca}}&lt;/ref&gt;<br /> <br /> Kwinter was also involved in the [[Liberal Party of Canada]] as a fundraiser and organizer and worked on [[John Turner]]'s [[1984 Liberal Party of Canada leadership election|1984 leadership campaign]].<br /> <br /> ==Politics==<br /> ===Peterson government===<br /> Kwinter was elected to the Ontario legislature in the [[1985 Ontario general election|provincial election of 1985]] as a [[Ontario Liberal Party|Liberal]], defeating incumbent [[Progressive Conservative Party of Ontario|Progressive Conservative]] [[David Rotenberg]] and [[Ontario New Democratic Party|New Democrat]] city councillor [[Howard Moscoe]] in the [[North York]] riding of [[Wilson Heights (electoral district)|Wilson Heights]] (which had a large immigrant population and a prominent [[Orthodox Jewish]] community; Kwinter was himself [[Jewish]]).&lt;ref name=&quot;1985 results&quot;&gt;{{cite news |title=Results of vote in Ontario election |newspaper=The Globe and Mail |date=May 3, 1985 |page=13}}&lt;/ref&gt;<br /> <br /> Kwinter had been a strong advocate for the completion of the controversial [[Spadina Expressway]] in [[Toronto]] but abandoned this position soon after winning election.<br /> <br /> On June 26, 1985, he was appointed [[Ontario Minister of Consumer and Commercial Relations|Minister of Consumer and Commercial Relations]] and Minister of Financial Institutions.&lt;ref name=&quot;1985PetCab&quot;&gt;{{cite news |title=Liberals pledge reform as they take over in Ontario |newspaper=The Gazette |location=Montreal, Que |date=June 27, 1985 |page=B1}}&lt;/ref&gt;<br /> <br /> Kwinter was easily re-elected in the [[1987 Ontario general election|provincial election of 1987]], and was named [[Ministry of Industry, Trade and Technology|Minister of Industry, Trade and Technology]] in September of that year.&lt;ref name=&quot;1987 results&quot;&gt;{{cite news |title=Results from individual ridings |newspaper=The Windsor Star |date=September 11, 1987 |page=F2}}&lt;/ref&gt;&lt;ref name=&quot;1987PetCab&quot;&gt;{{cite news |title=Wrye gets new cabinet job |newspaper=The Windsor Star |date=September 29, 1987 |page=A1}}&lt;/ref&gt; In June 1989, Kwinter was implicated in the [[Patti Starr affair|Patti Starr]] corruption scandal. Starr, who was head of the National Council of Jewish Women, misused her position by having the organization make political contributions to the riding associations of prominent Liberal MPPs. Kwinter's riding of Wilson Heights was among those that received these illegal contributions.&lt;ref&gt;{{cite news |first=Matt |last=Maychak |title=Liberal links to fallen Starr scares MPPs |newspaper=Toronto Star |date=June 15, 1989 |page=A30}}&lt;/ref&gt; On August 2, when Peterson shuffled his cabinet in the wake of the scandal, Kwinter was one of only two ministers who retained their positions.
Eight other ministers lost their positions.&lt;ref&gt;{{cite news |first1=Alan |last1=Storey |first2=Derek |last2=Ferguson |title=Tainted ministers axed: Peterson drops 8 in cabinet shuffle |newspaper=Toronto Star |date=August 2, 1989 |pages=A1, A27}}&lt;/ref&gt;<br /> <br /> ====Cabinet====<br /> {{s-start}}<br /> {{Canadian cabinet member navigational box header |ministry=David_Peterson}}<br /> {{ministry box cabinet posts<br /> | post3preceded = [[Hugh O'Neil]]<br /> | post3 = [[Ministry of Industry, Trade and Technology|Minister of Industry, Trade and Technology]]<br /> | post3years = 1987–1990<br /> | post3note =<br /> | post3followed = [[Allan Pilkey]]<br /> <br /> | post2preceded = New position<br /> | post2 = Minister of Financial Institutions<br /> | post2years = 1986–1987<br /> | post2note =<br /> | post2followed = [[Robert Nixon (politician)|Robert Nixon]]<br /> <br /> | post1preceded = [[Bob Runciman]]<br /> | post1 = [[Ministry of Government Services (Ontario)|Minister of Consumer and Commercial Relations]]<br /> | post1years = 1985–1987<br /> | post1note =<br /> | post1followed = [[Bill Wrye]]<br /> }}<br /> {{s-end}}<br /> <br /> ===Opposition===<br /> The Liberals were upset by the [[Ontario New Democratic Party|New Democratic Party]] in the [[1990 Ontario provincial election|1990 provincial election]], although Kwinter himself was again re-elected without difficulty; one contender was better known as the alter ego of [[Ed the Sock]].&lt;ref name=&quot;1990 results&quot;&gt;{{cite news |title=Ontario election: Riding-by-riding voting results |newspaper=The Globe and Mail |date=September 7, 1990 |page=A12}}&lt;/ref&gt;<br /> <br /> Kwinter faced a more serious challenge in the [[1995 Ontario provincial election|1995 election]], which was won by the Progressive Conservatives; Tory candidate Sam Pasternak came within 3,000 votes of upsetting him.&lt;ref name=&quot;1995 results&quot;&gt;{{cite web |url=http://results.elections.on.ca/results/1995_results/valid_votes.jsp?e_code=36&amp;rec=0&amp;district=wilson+heights&amp;flag=E&amp;layout=G |title=Summary of Valid Ballots by Candidate |publisher=Elections Ontario |date=June 8, 1995 |access-date=2014-03-02}}&lt;/ref&gt; Kwinter was not a prominent figure in the Legislative Assembly during his time in the opposition, though he was nevertheless regarded as a strong community representative.<br /> <br /> Despite having a reputation for being on the right wing of the Ontario Liberal Party, Kwinter supported left-wing candidate [[Gerard Kennedy]] in the party's 1996 [[leadership convention]].<br /> <br /> The Progressive Conservative government of [[Mike Harris]] reduced the number of provincial ridings from 130 to 103 in 1996, forcing several incumbent [[Member of Provincial Parliament (Ontario)|Members of Provincial Parliament]] (MPPs) to compete against one another for re-election. In some cases, MPPs from the same party were forced to compete against one another for their riding nominations. Kwinter was challenged for the Liberal nomination in the new riding of [[York Centre (Ontario riding)|York Centre]] by fellow MPP [[Anna-Marie Castrilli]], who had unsuccessfully competed for the party's leadership in 1996.<br /> <br /> Castrilli's challenge to Kwinter was extremely controversial, and was marked by serious divisions in the local riding association. Kwinter was subjected to a number of incidents of [[anti-Semitic]] abuse during this period, and on one occasion received hate mail at his legislative office.
Castrilli was not involved in these incidents, but they were regarded by many as reinforcing the unpleasant character of the nomination battle.<br /> <br /> Liberal leader Dalton McGuinty tried to convince Castrilli to run in a different riding, but was unsuccessful. Rumours began to circulate that Kwinter was planning to defect to the Progressive Conservatives in the event that he was defeated. As it happened, the speculation was never tested: Kwinter defeated Castrilli, who herself defected to the Tories shortly thereafter.<br /> <br /> Kwinter's nomination difficulties proved to be his only real challenge of the [[1999 Ontario general election|1999 campaign]], and he was again returned by a significant margin in the general election.&lt;ref name=&quot;1999 results&quot;&gt;{{cite web |url=http://results.elections.on.ca/results/1999_results/valid_votes.jsp?e_code=37&amp;rec=0&amp;district=york+centre&amp;flag=E&amp;layout=G |title=Summary of Valid Ballots by Candidate |publisher=Elections Ontario |date=June 3, 1999 |access-date=2014-03-02 |url-status=dead |archive-url=https://web.archive.org/web/20140320204717/http://results.elections.on.ca/results/1999_results/valid_votes.jsp?e_code=37&amp;rec=0&amp;district=york+centre&amp;flag=E&amp;layout=G |archive-date=March 20, 2014 |df=mdy-all }}&lt;/ref&gt; The Progressive Conservatives were again victorious across the province, and Kwinter remained on the opposition benches.<br /> <br /> In 2002, Kwinter publicly opposed the Liberal Party's position on tax credits for parents who send their children to private and non-Catholic denominational schools. The party opposed such credits as detrimental to the public system. Kwinter described the distinction between publicly funded Catholic [[Separate School]]s and non-Catholic denominational schools as discriminatory, though he also opposed funding for non-denominational private schools.<br /> <br /> ===McGuinty government===<br /> Kwinter was again re-elected in the [[2003 Ontario general election|2003 election]] without difficulty.&lt;ref name=&quot;2003 results&quot;&gt;{{cite web |url=http://results.elections.on.ca/results/2003_results/valid_votes.jsp?e_code=38&amp;rec=0&amp;district=york+centre&amp;flag=E&amp;layout=G |title=Summary of Valid Ballots by Candidate |publisher=Elections Ontario |date=October 2, 2003 |access-date=2014-03-02 |url-status=dead |archive-url=https://web.archive.org/web/20140320211050/http://results.elections.on.ca/results/2003_results/valid_votes.jsp?e_code=38&amp;rec=0&amp;district=york+centre&amp;flag=E&amp;layout=G |archive-date=March 20, 2014 |df=mdy-all }}&lt;/ref&gt; The election was won by the Liberals, and there was considerable media speculation as to whether Dalton McGuinty would appoint the septuagenarian Kwinter to cabinet again. Ultimately, Kwinter's public disagreements with party policy were not enough to sideline his career: he was appointed [[Ontario Minister of Public Safety and Security]] (essentially a retitled [[Solicitor General of Ontario|Solicitor-General]]'s position) on October 23, 2003.&lt;ref name=&quot;2003McGuintyCab&quot;&gt;{{cite news |title=Premier Dalton McGuinty and his 22-member cabinet were sworn in Thursday |publisher=Canadian Press NewsWire |date=October 23, 2003 |page=1}}&lt;/ref&gt;<br /> <br /> Kwinter put forward a plan to combat [[cannabis (drug)|marijuana]] grow-ops in Ontario that would permit local utilities to cut off electrical power to those in the illegal industry.
Many opposed this plan on the grounds that innocent citizens could have their power cut off without warning in the event of an administrative or legal error.&lt;ref&gt;{{cite news |first=Antonella |last=Artuso |title=Hydro to root out grow ops: suspicious homes to lose power |newspaper=Toronto Sun |date=October 8, 2004|url=http://www.mapinc.org/drugnews/v04/n1427/a07.html}}&lt;/ref&gt;<br /> <br /> Kwinter was re-elected in the [[2007 Ontario general election|2007 provincial election]] despite a stronger challenge from the Progressive Conservative Party, owing to its support for extending funding to Jewish and other religious day schools.&lt;ref name=&quot;2007 results&quot;&gt;{{cite web |url=http://elections.on.ca/NR/rdonlyres/AB409CCD-84F3-46FA-B3BD-39AB659EFC2D/0/SummaryofValidBallotsCastforEachCandidate.pdf |title=Summary of Valid Ballots Cast for Each Candidate |publisher=Elections Ontario |date=October 10, 2007 |page=17 (xxvi) |url-status=dead |archive-url=https://web.archive.org/web/20091007160233/http://www.elections.on.ca/NR/rdonlyres/AB409CCD-84F3-46FA-B3BD-39AB659EFC2D/0/SummaryofValidBallotsCastforEachCandidate.pdf |archive-date=October 7, 2009 |df=mdy-all }}&lt;/ref&gt; Kwinter broke with the Liberal platform and [[cabinet solidarity]] by supporting the Progressive Conservatives' proposal. The Liberal government was re-elected; however, Kwinter was dropped from Cabinet in the post-election [[cabinet shuffle]].&lt;ref name=&quot;McGuinty2007Cab&quot;&gt;{{cite news |title=Premier goes for new blood; Expanded 28-member cabinet has eight ministers from Toronto, three from 905 area |last1=Ferguson |first1=Rob |last2=Benzie |first2=Robert |newspaper=Toronto Star |date=October 31, 2007 |page=A13}}&lt;/ref&gt; While no official reason was given for the demotion, the ''[[The Jewish Tribune (Canada)|Jewish Tribune]]'' claimed that it was a result of the position he had taken on school funding during the election campaign, though it did not name its source for this claim.&lt;ref&gt;{{cite news |first=Atara |last=Beck |title=Kwinter kicked out of cabinet: Stand on inclusive public education the reason, source says |newspaper=Jewish Tribune |date=November 22, 2007 |page=2}}&lt;/ref&gt;&lt;ref&gt;{{cite news |title=10 new faces to spur 'activist' agenda: McGuinty left several key ministers in place and turfed out four others as he remade Ontario's Liberal cabinet |first=Keith |last=Leslie |publisher=Canadian Press |newspaper=Toronto Star |date=October 30, 2007}}&lt;/ref&gt;<br /> <br /> Following the cabinet shuffle, Premier McGuinty appointed Kwinter chair of the Ontario Investment and Trade Advisory Council and Parliamentary Assistant to the Minister of Economic Development and Trade (Investment Attraction and Trade).<br /> <br /> Kwinter retained his seat in the [[2011 Ontario general election|2011 provincial election]] against Progressive Conservative candidate [[Conservative Party of Canada candidates, 2006 Canadian federal election#Michael Mostyn (York Centre)|Michael Mostyn]] by 3,188 votes.&lt;ref&gt;{{cite news |first=Kris |last=Scheuer |url=http://www.mytowncrier.ca/monte-kwinter-wants-back-for-eighth-term.html |title=Monte Kwinter wants back for eighth term |newspaper=Town Crier |date=March 7, 2011 |access-date=March 20, 2011 |archive-url=https://web.archive.org/web/20110706185949/http://www.mytowncrier.ca/monte-kwinter-wants-back-for-eighth-term.html |archive-date=July 6, 2011 |url-status=dead }}&lt;/ref&gt;&lt;ref name=&quot;2011 results&quot;&gt;{{cite web
|url=http://elections.on.ca/NR/rdonlyres/7849B894-4C4F-490E-9E8C-271BCF0C0D4D/5712/SummaryofvalidvotescastforeacndGE2011.pdf |title=Summary of Valid Ballots Cast for Each Candidate |publisher=Elections Ontario |date=October 6, 2011 |page=20 |url-status=dead |archive-url=https://web.archive.org/web/20130330163815/http://elections.on.ca/NR/rdonlyres/7849B894-4C4F-490E-9E8C-271BCF0C0D4D/5712/SummaryofvalidvotescastforeacndGE2011.pdf |archive-date=March 30, 2013 |df=mdy-all }}&lt;/ref&gt;<br /> <br /> ====Cabinet====<br /> {{s-start}}<br /> {{Canadian cabinet member navigational box header |ministry=Dalton_McGuinty}}<br /> {{ministry box cabinet posts<br /> | post1preceded = [[Rob Sampson]]<br /> | post1 = [[Ministry of Community Safety and Correctional Services|Minister of Community Safety and Correctional Services]]<br /> | post1years = 2003–2007<br /> | post1note =<br /> | post1followed = [[Rick Bartolucci]]<br /> }}<br /> {{s-end}}<br /> <br /> ===Wynne government===<br /> Kwinter's riding association nominated him to run as the Liberal candidate in the [[41st Ontario general election|next provincial election]], which was held on June 12, 2014.&lt;ref name=paikin&gt;{{cite news|last=Paikin|first=Steve|title=The Original Sin|url=http://theagenda.tvo.org/blog/agenda-blogs/original-sin|access-date=March 14, 2014|newspaper=TVOntario|date=March 13, 2014}}&lt;/ref&gt; He defeated PC candidate Avi Yufest by 6,066 votes.&lt;ref name=&quot;2014 results&quot;&gt;{{cite web |title=General Election by District: York Centre |publisher=Elections Ontario |date=June 12, 2014 |url=http://wemakevotingeasy.ca/en/general-election-district-results.aspx?d=104 |access-date=September 2, 2014 |archive-url=https://web.archive.org/web/20140617111104/http://wemakevotingeasy.ca/en/general-election-district-results.aspx?d=104 |archive-date=June 17, 2014 |url-status=dead }}&lt;/ref&gt;<br /> <br /> From June 2014 to June 2016, he served as Parliamentary Assistant to the [[Ministry of Citizenship and Immigration|Minister of Citizenship, Immigration and International Trade]]. From June 2016, he served as Parliamentary Assistant to the Minister of International Trade.&lt;ref&gt;{{Cite web|url=http://www.ontla.on.ca/web/members/members_detail.do?locale=en&amp;ID=56&amp;detailPage=members_detail_career|title=Request Rejected|website=www.ontla.on.ca}}&lt;/ref&gt;<br /> <br /> In October 2016, it was reported that Kwinter was living in a nursing home, Kensington Place, while recovering from an illness.&lt;ref&gt;{{Cite news|url=http://www.torontosun.com/2016/10/17/mpp-absence-not-unprecedented|title=MPP absence not unprecedented|newspaper=Toronto Sun|access-date=2016-10-27}}&lt;/ref&gt; In March 2017, Kwinter reappeared in public after a months-long recovery from shingles. Although he now required a wheelchair and the aid of a caregiver, Kwinter intended to return to his role and run in the 2018 election,&lt;ref&gt;{{Cite web|url=https://www.tvo.org/article/ontarios-oldest-ever-mpp-re-emerges|title=Ontario's oldest-ever MPP re-emerges|website=TVO.org}}&lt;/ref&gt; but he later chose to retire from politics ahead of that election, in which the Wynne government was soundly defeated. Ramon Estaris, who ran in his place, lost the riding, placing third behind the winner, [[Roman Baber]], who went on to serve a single term as MPP.
<br /> <br /> ==Death==<br /> Monte Kwinter died on July 21, 2023, at the age of 92.&lt;ref&gt;[https://benjaminsparkmemorialchapel.ca/ServiceDetails?snum=139523&amp;fg=0 Monte Kwinter]&lt;/ref&gt;&lt;ref&gt;[https://torontosun.com/news/provincial/longtime-toronto-liberal-mpp-monte-kwinter-dead-at-92 Longtime Toronto Liberal MPP Monte Kwinter dead at 92]&lt;/ref&gt;<br /> <br /> ==References==<br /> {{Reflist|2}}<br /> <br /> ==External links==<br /> * {{official website|http://montekwinter.onmpp.ca/}}<br /> * {{OntarioMPPbio|id=monte-kwinter}}<br /> {{McGuinty Ministry}}<br /> {{Peterson Ministry}}<br /> <br /> {{DEFAULTSORT:Kwinter, Monte}}<br /> [[Category:1931 births]]<br /> [[Category:2023 deaths]]<br /> [[Category:Canadian real estate agents]]<br /> [[Category:Jewish Canadian politicians]]<br /> [[Category:Members of the Executive Council of Ontario]]<br /> [[Category:Ontario Liberal Party MPPs]]<br /> [[Category:Politicians from Toronto]]<br /> [[Category:OCAD University alumni]]<br /> [[Category:21st-century Canadian politicians]]</div> 205.189.94.9
https://en.wikipedia.org/w/index.php?title=Talk:John_William_Thomson&diff=1169999165 Talk:John William Thomson 2023-08-12T16:41:13Z <p>205.189.94.9: /* wife's obit in August 2023 */ new section</p> <hr /> <div>{{WikiProject Biography|politician-work-group=yes|politician-priority=low<br /> |living=yes<br /> |class=Stub<br /> |listas=Thomson, John William<br /> }}<br /> {{WikiProject Canada|ab=yes|ppap=yes|class=Stub|importance=low}}<br /> <br /> == wife's obit in August 2023 ==<br /> <br /> He just lost his wife Joan Elizabeth McFarlane Thomson in late July 2023. Her obit appeared in Globe and Mail in August. <br /> Married on May 5, 1956, two children born in St. Catharines, and three in Calgary. Here's hoping that one of her children pens a &quot;Lives lived&quot;..... [[Special:Contributions/205.189.94.9|205.189.94.9]] ([[User talk:205.189.94.9|talk]]) 16:41, 12 August 2023 (UTC)</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Dan_Petry&diff=1169405556 Dan Petry 2023-08-08T22:22:53Z <p>205.189.94.9: /* Personal life */ link to St.
Mary's Prep wiki entry.</p> <hr /> <div>{{short description|American baseball player (born 1958)}}<br /> {{Infobox baseball biography<br /> |image=<br /> |name=Dan Petry<br /> |position=[[Pitcher]]<br /> |bats=Right<br /> |throws=Right<br /> |birth_date={{Birth date and age|1958|11|13}}<br /> |birth_place=[[Palo Alto, California]], U.S.<br /> |debutleague = MLB<br /> |debutdate=July 8<br /> |debutyear=1979<br /> |debutteam=Detroit Tigers<br /> |finalleague = MLB<br /> |finaldate=October 5<br /> |finalyear=1991<br /> |finalteam=Boston Red Sox<br /> |statleague = MLB<br /> |stat1label=[[Win–loss record (pitching)|Win–loss record]]<br /> |stat1value=125–104<br /> |stat2label=[[Earned run average]]<br /> |stat2value=3.95<br /> |stat3label=[[Strikeout]]s<br /> |stat3value=1,063<br /> |teams=<br /> *[[Detroit Tigers]] ({{mlby|1979}}–{{mlby|1987}})<br /> *[[Los Angeles Angels of Anaheim|California Angels]] ({{mlby|1988}}–{{mlby|1989}})<br /> *[[Detroit Tigers]] ({{mlby|1990}}–{{mlby|1991}})<br /> *[[Atlanta Braves]] ({{mlby|1991}})<br /> *[[Boston Red Sox]] ({{mlby|1991}})<br /> |highlights=<br /> *[[Major League Baseball All-Star Game|All-Star]] ([[1985 Major League Baseball All-Star Game|1985]])<br /> *[[World Series]] champion ({{wsy|1984}})<br /> }}<br /> '''Daniel Joseph Petry''' ({{IPAc-en|ˈ|p|iː|t|r|iː}} {{respell|PEE|tree}};&lt;ref&gt;[http://www.mlb.com/det/downloads/mediaguides/mediaguide_80.pdf Detroit Tigers 1980 Press-TV-Radio Guide (pronunciations on page 38).] {{Webarchive|url=https://web.archive.org/web/20210414200411/http://www.mlb.com/det/downloads/mediaguides/mediaguide_80.pdf |date=2021-04-14 }} Retrieved June 7, 2020&lt;/ref&gt; born November 13, 1958) is an American former [[Major League Baseball]] [[pitcher]] for the [[Detroit Tigers]] (1979–87 and 1990–91), [[Los Angeles Angels|California Angels]] (1988–89), [[Atlanta Braves]] (1991) and [[Boston Red Sox]] (1991). He currently serves as a studio analyst for the Detroit Tigers on [[Bally Sports Detroit]].<br /> <br /> ==Playing career==<br /> Petry helped the Tigers win the [[1984 World Series]] and the 1987 American League Eastern Division, and helped the Braves win the 1991 National League pennant. He was elected to the American League [[Major League Baseball All-Star Game|All-Star]] team in 1985. He led the American League in games started (38) in 1983. In 1982 and 1984, Petry finished ninth and fifth, respectively, in American League [[Cy Young Award]] voting.<br /> <br /> In 13 years he had a 125-104 record (.546), 370 appearances, 300 games started, 52 complete games, 11 shutouts, one save, {{frac|2,080|1|3}} innings pitched, 1,984 hits allowed, 1,025 runs allowed, 912 earned runs allowed, 218 home runs allowed, 852 walks allowed, 1,063 strikeouts, 47 hit batsmen, 77 wild pitches, seven balks and a 3.95 earned run average.<br /> <br /> Petry only appeared in 13 games as a Red Sox, but in 1991, that is where he ended his career. Used strictly as a relief pitcher, he managed to pick up his one and only MLB save. It came on September 30, 1991 against the Brewers. 
Petry pitched {{frac|1|2|3}} scoreless innings to close out a wild 9–8 Red Sox victory over the Brewers.&lt;ref&gt;{{Cite web|url=https://www.baseball-reference.com/boxes/MIL/MIL199109300.shtml|title=Boston Red Sox at Milwaukee Brewers Box Score, September 30, 1991}}&lt;/ref&gt;<br /> <br /> Defensively, Petry was an above-average fielding pitcher, posting a .980 [[fielding percentage]] and committing only 12 errors in 603 [[total chances]], which was 23 points higher than the league average at his position.<br /> <br /> ==Post-playing career==<br /> In 2012, Petry served as a substitute [[color analyst]] for the [[Detroit Tigers Radio Network]], teaming with play-by-play announcer [[Dan Dickerson]] for several road games while regular analyst [[Jim Price (catcher)|Jim Price]] recuperated from health problems.&lt;ref&gt;{{cite news|url=http://www.detroitnews.com/article/20120507/SPORTS0104/205070392/1361/Tigers-radio-broadcaster-Jim-Price-will-miss-nine-game-road-trip |title=Dan Petry to pinch-hit again for Jim Price on Tigers' nine-game road trip |last=Paul |first=Tony |newspaper=The Detroit News |date=May 7, 2012 }}{{dead link|date=December 2016 |bot=InternetArchiveBot |fix-attempted=yes }}&lt;/ref&gt; In 2022, he resumed filling in for Price while the Tigers were on the road.<br /> <br /> On January 15, 2019, Petry was named a studio analyst for the [[Detroit Tigers]] on [[Bally Sports Detroit]].&lt;ref&gt;{{cite web|url=https://www.mlb.com/tigers/news/kirk-gibson-jack-morris-join-tigers-telecasts/c-302725484 |title=Gibson, Morris join Tigers broadcasting team |first=Jason |last=Beck |work=MLB.com |date=January 15, 2019 |access-date=January 15, 2019}}&lt;/ref&gt;<br /> <br /> ==Personal life==<br /> Petry and his wife, Christine, have two sons: Matt, who is the head coach of the [[St. Mary's Preparatory|Orchard Lake St. Mary’s]] baseball team, which has won three Michigan High School Athletic Association championships under his coaching, and [[Jeff Petry|Jeff]], who is currently a defenseman for the [[Montreal Canadiens]] of the [[National Hockey League]].
Petry attended [[El Dorado High School (Placentia, California)|El Dorado High School]] in [[Placentia, California|Placentia]], California, where he was a [[California Interscholastic Federation|CIF]] championship winning pitcher.<br /> <br /> {{Portal|Baseball}}<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==External links==<br /> {{baseballstats|mlb=120476|espn=983|br=p/petryda01|fangraphs=1010246|brm=petry-001dan|retro=P/Ppetrd001}}<br /> *[https://baseballbiography.com/dan-petry-1958 Dan Petry] at Baseball Library<br /> *[https://thisdayinbaseball.com/dan-petry-page/ Dan Petry] Facts, Biography &amp; Chronology at This Day In Baseball <br /> {{1984 Detroit Tigers}}<br /> <br /> {{authority control}}<br /> <br /> {{DEFAULTSORT:Petry, Dan}}<br /> [[Category:1958 births]]<br /> [[Category:American League All-Stars]]<br /> [[Category:Atlanta Braves players]]<br /> [[Category:Baseball players from Palo Alto, California]]<br /> [[Category:Boston Red Sox players]]<br /> [[Category:Bristol Tigers players]]<br /> [[Category:California Angels players]]<br /> [[Category:Detroit Tigers announcers]]<br /> [[Category:Detroit Tigers players]]<br /> [[Category:Evansville Triplets players]]<br /> [[Category:Lakeland Tigers players]]<br /> [[Category:Living people]]<br /> [[Category:Major League Baseball broadcasters]]<br /> [[Category:Major League Baseball pitchers]]<br /> [[Category:Montgomery Rebels players]]<br /> [[Category:Palm Springs Angels players]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Presbyterian_Church_of_Vanuatu&diff=1169402667 Presbyterian Church of Vanuatu 2023-08-08T22:02:03Z <p>205.189.94.9: linking to Presbyterian churches of these nations,,,</p> <hr /> <div>{{Infobox Christian denomination<br /> |name = Presbyterian Church of Vanuatu&lt;br&gt;{{native name|fr|Église presbytérienne de Vanuatu}}<br /> |image = PCVanuatu logo.png<br /> |main_classification = [[Protestant]]<br /> |orientation = [[Calvinist]]<br /> |theology = [[Reformed tradition|Reformed]] [[Evangelical]]<br /> |polity = [[Presbyterian]]<br /> |founder =<br /> |founded_date = 1948<br /> |area = Vanuatu<br /> |associations = [[World Communion of Reformed Churches]], [[World Council of Churches]]<br /> |members = 78,000 baptised and 65,000 active members&lt;ref&gt;{{cite web|url=http://pcv.vu |title=PCV &amp;#124; Home |publisher=Pcv.vu |date=2014-03-24 |accessdate=2015-05-19}}&lt;/ref&gt;<br /> |congregations = 400 and 450 house fellowships<br /> |ministers = 200<br /> |church_website = <br /> }}<br /> <br /> The '''Presbyterian Church of Vanuatu''' ({{lang-fr|Église presbytérienne de Vanuatu}}), or the '''Presbitirin Jyos Blong Vanuatu''' in [[Bislama]], is the largest [[Christian denomination]] in [[Vanuatu]].&lt;ref name=&quot;Oikoumene.org&quot;&gt;{{cite web|url=http://www.oikoumene.org/en/member-churches/presbyterian-church-of-vanuatu |title=Presbyterian Church of Vanuatu — World Council of Churches |publisher=Oikoumene.org |date= |accessdate=2015-05-19}}&lt;/ref&gt;<br /> <br /> ==History==<br /> It was created by missionaries of the [[London Missionary Society]] in the mid-1800s. In 1838 Rev John William arrived on the Island of [[Futuna Island, Vanuatu|Futuna]]. In [[Eromango]] Rev. William was martyred and eaten. In 1841 Apela and Samuele were placed to Futuna. Both of them were martyred. 
They prepared the way for [[Presbyterians]] from [[Presbyterian Church in Canada|Canada]], [[Church of Scotland|Scotland]], [[Presbyterian Church of Australia|Australia]], and [[Presbyterian Church of Aotearoa New Zealand|New Zealand]]. The Presbyterian Mission Synod contributed to the mission in the [[New Hebrides]]. Two prominent missionaries were [[John Gibson Paton]] from [[Scotland]] and [[John Geddie (missionary)|John Geddie]] from [[Nova Scotia]]. Even today the Scottish Presbyterian tradition is visible in the life of the Vanuatuan church. The church developed rapidly from the south to the north. It employed indigenous pastors and teachers. The church become autonomous in 1948 as the Presbyterian Church in the New Hebrides.&lt;ref name=&quot;reformiert-online1&quot;&gt;{{cite web|author= |url=http://www.reformiert-online.net/adressen/detail.php?id=112183&amp;lg=de |title=Adressdatenbank reformierter Kirchen und Einrichtungen |publisher=Reformiert-online.net |date= |accessdate=2015-05-19}}&lt;/ref&gt; Vanuatu became free from [[British Empire|British]] and [[France|French]] colonization in 1980. Most of the members of the new government were Presbyterians, because the Presbyterian church is the only denomination that established a theological seminary and concentrated on educating the Ni-Vanuatu people.&lt;ref&gt;{{cite web |url=http://www.presbyterianmission.org/ministries/global/vanuatu/ |title=Vanuatu — Presbyterians at work around the world — Mission and Ministry — Presbyterian Mission Agency |publisher=Presbyterianmission.org |date= |accessdate=2015-05-19 |archive-url=https://web.archive.org/web/20160313202237/http://www.presbyterianmission.org/ministries/global/vanuatu/ |archive-date=2016-03-13 |url-status=dead }}&lt;/ref&gt;<br /> <br /> ==Statistics==<br /> The denomination has approximately 78,000 members and 400 congregations, as well as 450 house fellowships in 6 presbyteries as of January 1, 2006.&lt;ref name=&quot;Oikoumene.org&quot;/&gt; It is the largest denomination in the country, representing more than 30% of the population of [[Vanuatu]].&lt;ref&gt;{{cite web |url=http://www.unitingworld.org.au/about/our-overseas-partners/the-pacific/the-presbyterian-church-of-vanuatu/ |title=The Presbyterian Church of Vanuatu « UnitingWorld |publisher=Unitingworld.org.au |date= |accessdate=2015-05-19 |archive-url=https://web.archive.org/web/20170621145514/http://www.unitingworld.org.au/about/our-overseas-partners/the-pacific/the-presbyterian-church-of-vanuatu/ |archive-date=2017-06-21 |url-status=dead }}&lt;/ref&gt; <br /> [[File:PortVilaPresbyterianChurch.jpg|thumb|left|250px|Paton Memorial Church in [[Port Vila]].]]<br /> <br /> The PCV (Presbyterian Church of Vanuatu) is headed by a moderator with offices in Port Vila. The PCV is particularly strong in the provinces of Tafea, Shefa, and Malampa. The Province of Sanma is mainly Presbyterian with a strong Roman Catholic minority in the Francophone areas of the province. There are some Presbyterian people, but no organised Presbyterian churches in Penama and Torba, both of which are traditionally Anglican. Vanuatu is the only country in the South Pacific with a significant Presbyterian heritage and membership.<br /> <br /> The church runs schools. 
PCV ministers are trained in the Presbyterian's official theological institute, the [[Talua Ministry Training Centre]] on [[Espiritu Santo|South Santo]].&lt;ref&gt;{{cite web|url=http://www.unitingworld.org.au/partners/our-overseas-partners/the-pacific/the-presbyterian-church-of-vanuatu/ |accessdate=March 9, 2013 |url-status=dead |archiveurl=https://web.archive.org/web/20130418202356/http://www.unitingworld.org.au/partners/our-overseas-partners/the-pacific/the-presbyterian-church-of-vanuatu/ |archivedate=April 18, 2013 |title=UnitingWorld &amp;#124; Home }}&lt;/ref&gt; It offers diploma of theology, diploma of mission and Bachelor of Ministries Programs. Graduates from the college become church leaders in various denominations and evangelists to isolated islands.&lt;ref&gt;{{cite web|url=http://www.talua.org/talua.htm |title=Find out about Talua Ministry Training Centre, Vanuatu |publisher=Talua.org |date= |accessdate=2015-05-19}}&lt;/ref&gt;<br /> [[File:Flags of Vanuatu and the PCV.JPG|thumb|230px|Flagpole at the [[Talua Ministry Training Centre]] with the flags of [[Flag of Vanuatu|Vanuatu]] (top) and the Presbyterian Church in Vanuatu.]]<br /> <br /> ==Doctrine==<br /> The Presbyterian Church in Vanuatu affirms the [[Apostles Creed]] and [[Westminster Confession of Faith]].&lt;ref name=&quot;reformiert-online1&quot;/&gt;<br /> <br /> ==Interchurch relations==<br /> The Presbyterian Church of Vanuatu is a member of the [[World Communion of Reformed Churches]].&lt;ref&gt;{{cite web |url=http://wcrc.ch/wcrc-member-churches/ |title=Member Churches :: World Communion of Reformed Churches (WCRC) |publisher=Wcrc.ch |accessdate=2015-05-19 |url-status=dead |archiveurl=https://web.archive.org/web/20131221035027/http://wcrc.ch/wcrc-member-churches/ |archivedate=2013-12-21 }}&lt;/ref&gt;<br /> <br /> The Presbyterian Church in Vanuatu has partner relations with the [[Presbyterian Church of Australia]].&lt;ref&gt;{{cite web|url=http://www.apwm.org.au/partner-churches/vanuatu/ |title=Australian Presbyterian World Mission |publisher=APWM.org.au |date= |accessdate=2015-05-19}}&lt;/ref&gt; The Australian church supports the Talua Ministry Training Centre, which provides the ministry training of the Presbyterian Church in Vanuatu.&lt;ref&gt;{{cite web|url=http://www.apwm.org.au/wp-content/uploads/2011/11/Vanuatu.pdf |accessdate=June 14, 2013 |url-status=dead |archiveurl=https://web.archive.org/web/20130410232158/http://www.apwm.org.au/wp-content/uploads/2011/11/Vanuatu.pdf |archivedate=April 10, 2013 }}&lt;/ref&gt;<br /> <br /> ==References==<br /> {{Reflist}}<br /> <br /> ==External links==<br /> * [http://pcv.vu/ Official website]<br /> <br /> {{Authority control}}<br /> <br /> [[Category:Presbyterian denominations in Oceania]]<br /> [[Category:Churches in Vanuatu]]<br /> [[Category:Presbyterianism in Vanuatu]]<br /> [[Category:Members of the World Communion of Reformed Churches]]<br /> [[Category:Christian organizations established in 1948]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=RoboCop_(comics)&diff=1169393310 RoboCop (comics) 2023-08-08T21:00:17Z <p>205.189.94.9: </p> <hr /> <div>{{more citations needed|date=April 2010}}<br /> {{Use mdy dates|date=April 2022}}<br /> '''''RoboCop''''' refers to a [[comic book]] series spun off from the feature film [[RoboCop|of the same name]].<br /> <br /> == Storyline ==<br /> The main character is a police officer from future Detroit who gets murdered in the line of duty. 
He is revived and transformed into a cybernetic cop by the [[megacorporation]] Omni Consumer Products (OCP) and now goes by RoboCop. Since the debut of the character in 1987, the franchise has been extended through various media, including multiple comic book [[Limited series (comics)|mini-series]] and [[ongoing series]].<br /> <br /> ==Marvel Comics==<br /> {{Infobox comic book title<br /> |title = RoboCop<br /> |image = Marvel Robocop 01 cover.jpg<br /> |caption = Cover of the 1st issue<br /> |schedule = Monthly<br /> |genre = [[Action fiction|Action]]&lt;br/&gt;[[Crime fiction|Crime]]&lt;br/&gt;[[Cyberpunk]]&lt;ref&gt;Krevel, Mojca. &quot;Cyberpunk literature and Slovenes: too mainstream, too marginal, or simply too soon?.&quot; Acta Neophilologica 33.1-2 (2000): 69-77.&lt;/ref&gt;&lt;br/&gt;[[Thriller (genre)|Thriller]]<br /> |publisher = [[Marvel Comics]]<br /> |startmo = March<br /> |startyr = 1990<br /> |endmo = January<br /> |endyr = 1992<br /> |issues = 23<br /> |main_char_team = [[RoboCop (character)|Alex J. Murphy/RoboCop]]<br /> |writers = [[Alan Grant (writer)|Alan Grant]]&lt;br/&gt;[[Simon Furman]]<br /> |artists = [[Lee Sullivan (comics)|Lee Sullivan]]<br /> |pencillers = <br /> |inkers = <br /> |letterers = <br /> |colorists = <br /> |editors = <br /> |creative_team_month = <br /> |creative_team_year = <br /> |creators = [[Alan Grant (writer)|Alan Grant]] (writer)&lt;br /&gt;[[Lee Sullivan (comics)|Lee Sullivan]] (illustrator)<br /> |TPB = <br /> |ISBN = <br /> |TPB# = <br /> |ISBN# = <br /> |subcat = Marvel Comics<br /> |sort = RoboCop<br /> }}<br /> In March 1990, [[Marvel Comics]] released the first issue of an ongoing ''RoboCop'' superhero comic book series based on the film. The series ran for 23 issues, ending in January 1992. In addition, a one-shot was released in August 1990, reprinting in color the 1987 black and white magazine adaptation of the film. That same month also saw a black and white magazine adaptation of the film sequel ''[[RoboCop 2]]'', as well as a three-issue mini-series printing in color the same contents as the ''RoboCop 2'' magazine. (Both the ''RoboCop 2'' adaptation and the monthly comic are notable for depicting the same locations, set design, and OCP logo as the first film, rather than the substitute designs and sets seen in the actual film ''RoboCop 2''. This would continue in the Dark Horse comics.)<br /> <br /> The stories told within these issues take place between the second and third ''RoboCop'' films. Set in a Marvel universe, though not Marvel's [[Earth-616|main superhero universe]], ''RoboCop''{{'}}s futuristic setting is expanded with elements like gangs riding hover bikes, urban droids carrying out public services like waste disposal, and the fact that almost anyone with the know-how or money can create a giant killer robot. About midway through the comic's run, pressure from fan letters convinced Marvel to eliminate some of the more fantastical elements, such as flying characters, on the grounds that ''RoboCop'' was set only in the near future. This led to a few conundrums and contradictions, such as biker gangs riding flying cycles in one issue and then switching to standard motorcycles in the next. The comic also had to deal awkwardly with inconsistent characterization between the films.
For example, in the first movie, OCP's Chairman &quot;The Old Man&quot; is portrayed as a good-natured oldster who grew OCP from a small business and has little patience for the greedy corporate types he employs. In the second film, his character was changed into a corrupt villain. This proved very unpopular with fans of the first film who had liked the character. It also created a major paradox for the writers of the Marvel monthly comic series, as the &quot;Old Man&quot; appears as a good guy in the ''RoboCop'' film adaptation, as a villain in the ''RoboCop 2'' adaptation, and strictly as a good guy in the early monthly series of original stories. What followed was a portrayal that teetered uncomfortably between strictly well-intentioned, morally ambiguous, and corrupt with a good excuse. Another slight yet noticeable change was the renaming of OCP executive Donald Johnson to Daniel Johnson. This was most likely to avoid criticism from fans of the TV series ''[[Miami Vice]]'', though the original naming of the character was likely an in-joke referring to ''Miami Vice'' star [[Don Johnson]].<br /> <br /> The consistent theme throughout the 23 issues is RoboCop's continuing struggle to balance his humanity with the machine he was made into after his brutal death. In the meantime, he fights street gangs, gangsters, drug pushers, addicts, politicians, terrorists, killer robots, mad scientists, cyborg animals, corrupt OCP employees, OCP's rival companies, foreign nations, mercenaries, OCP's attempts to mass-produce RoboCops, and competitive attempts to do the same, as well as criticisms from an otherwise well-meaning public.<br /> <br /> ===''RoboCop'' (one-shot film adaptation) (Oct 1987)===<br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Writer / penciller !! Publication date<br /> |-<br /> ||[[Bob Harras]]/[[Alan Kupperberg]] and Javier Saltares||align=&quot;center&quot;| July 28, 1987 (cover dated October 1987)<br /> |-<br /> |colspan=&quot;6&quot;| This issue follows the plot of the [[RoboCop|movie of the same name]].<br /> Note: This issue was originally published in a black and white magazine format before being reprinted in a color trade paperback format in 1990.<br /> |}<br /> <br /> ===''RoboCop Vol. 1'' (23-issue ongoing series) (Mar 1990–Jan 1992)===<br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Title !! Writer / penciller !! Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| ''Kombat Zone'' ||align=&quot;center&quot;| [[Alan Grant (writer)|Alan Grant]]/[[Lee Sullivan (comics)|Lee Sullivan]]||align=&quot;center&quot;| January 16, 1990 (cover dated March 1990)<br /> |-<br /> |colspan=&quot;6&quot;| OCP enters the next phase of building its CEO's prophesied Delta City. A competing company called Nixco tries to muscle in on the Delta City contract and steals specs from RoboCop's design to release its own version of law enforcement. Nixco manufactures a small army of robotic Nixcops whose first mission is to destroy RoboCop, along with an apprehended criminal who can tie a Nixco executive to a murder.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| ''Murphy's Law'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| February 20, 1990 (cover dated April 1990)<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop escapes the Nixcops but is severely damaged.
His witness is taken into Nixco custody, where he is subjected to experimental surgery. Later, RoboCop's partner, Anne Lewis, gets kidnapped, forcing Murphy to jump back into action. Arriving at the scene, he finds Nixcops and ED-209 droids fighting it out for Anne's rescue, each one's creator seeing her kidnapping as a chance for positive publicity. RoboCop defeats both sides and rescues Anne. (At this early point in the development of Marvel's &quot;Robo-universe&quot;, OCP's chairman, The Old Man, and his executive, Johnson, are rarely seen and are depicted as benevolent and on the side of the law. All this will change as Marvel's writers attempt to keep up with the movie series' changes to these characters.) <br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| ''Dreamerama'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| March 20, 1990 (cover dated May 1990)<br /> |-<br /> |colspan=&quot;6&quot;| A small group of thieves steal the recorded dreams of some of the most important businessmen and plan to use them for blackmail. The main suspect is Cybex, the man who initially came up with the ideas for Delta City, the ED-209s, and even RoboCop. Helped by an unfair contract, OCP stole his ideas and left him penniless. A previous fight with OCP also left him crippled.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||4||align=&quot;center&quot;| ''Dead Man's Dreams'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| April 17, 1990 (cover dated June 1990)<br /> |-<br /> |colspan=&quot;6&quot;| Cybex captures RoboCop to study him and learn what he can to build his own loyal army of cyborgs. With Nixco's president being one of the men whose dreams were stolen, Nixco sends an assassin (the kidnapped criminal witness from issue two, now under mind control) to take out Cybex. RoboCop follows the dueling criminals to a construction site, where he manages to arrest them all. <br /> (This is the first time that Delta City, post-''RoboCop 2'', is shown to be under construction as Old Detroit is demolished. This will tie in later with ''RoboCop 3'', the movie.)<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||5||align=&quot;center&quot;| ''War: Part 1 (War Monger)'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| May 15, 1990 (cover dated July 1990)<br /> |-<br /> |colspan=&quot;6&quot;| OCP is presented with the chance to finally test RoboCop in a war scenario. The Spanish military is at war with North Africa and pays for the services of RoboCop to assassinate its enemy's leader, General Abu Dara, a.k.a. the Desert Hawk. Dropped into Algeria with heavy artillery, RoboCop makes his way across the desert, fighting robotic tentacles and motorcycle troops without his prime directives holding him back.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||6||align=&quot;center&quot;| ''War: Part 2 (War Crimes)'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| June 19, 1990 (cover dated August 1990)<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop realizes Desert Hawk is just a leader trying to save his people from famine.
Convinced of the general's noble intentions, RoboCop contacts the Old Man and tells him how North Africa has found a viable means of underground irrigation. Impressed with the promise of this technology, the Old Man buys an end to the war by supporting Desert Hawk's people in return for half the shares in his novel system, hydroponics.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||7||align=&quot;center&quot;| ''Robosaur'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| July 17, 1990 (cover dated September 1990)<br /> |-<br /> |colspan=&quot;6&quot;| Detroit opens up its newest park, the Detroit Dino Park, where families can enjoy the sight of genetically recreated dinosaurs. The problem is that the cages are constantly being sabotaged, leaving RoboCop to face down these towering monsters threatening the public. The saboteur turns out to be a Dino Park employee secretly working for Nixco. The dinosaur rampages are meant to disgrace OCP, since Dino Park is the first major leaseholder in its &quot;crime free&quot; Delta City.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||8||align=&quot;center&quot;| ''Gangbuster'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| August 21, 1990 (cover dated October 1990)<br /> |-<br /> |colspan=&quot;6&quot;| In order to obtain more land for the expansion of Delta City, OCP instigates a war between two rival gangs, the Urban Kurs and the Psykoids, to decrease property values. Thanks to Anne having an inside man in the Urban Kurs, she is always updated on where the gang wars are taking place, leading to a [[mass arrest]] of both gangs by her and RoboCop.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||9||align=&quot;center&quot;| ''Vigilante: Part 1 (Power Play)'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| September 18, 1990 (cover dated November 1990)<br /> |-<br /> |colspan=&quot;6&quot;| OCP broadcasts a new reality TV show called the Detroit Vigilante, inspiring people to take to the streets and start acts of vigilantism. After arresting several of them, RoboCop confronts the Detroit Vigilante about his responsibility as a TV persona, but to no avail. (The costumed people in this storyline are parodies of Marvel and DC heroes. This is also where The Old Man starts to become a morally conflicted character. He adopts a consequentialist philosophy toward law enforcement and disobeys his executives.)<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||10||align=&quot;center&quot;| ''Vigilante: Part 2 (Rough Justice)'' ||align=&quot;center&quot;| Alan Grant/Lee Sullivan||align=&quot;center&quot;| October 16, 1990 (cover dated December 1990)<br /> |-<br /> |colspan=&quot;6&quot;| One of the new vigilantes seeks out the gang that murdered his son. Meanwhile, two crooks dressed up as vigilantes come face to face with the Detroit Vigilante. This escalates into an all-out war involving every vigilante in the city. It all comes to a head when the father's personal vendetta collides with the vigilante war, causing an explosion that kills everyone but RoboCop.
(This was the last issue to feature futuristic flying vehicles and characters.)<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||11||align=&quot;center&quot;| ''Unfinished Business'' ||align=&quot;center&quot;| [[Evan Skolnick]]/[[Herb Trimpe]]||align=&quot;center&quot;| November 20, 1990 (cover dated January 1991)<br /> |-<br /> |colspan=&quot;6&quot;| An ex-OCP cop named Daniel O'Hara is brought back to life in an experimental robot. He was killed by RoboCop and he wants his revenge. Their battle is intense, but he ends up losing. His system gets overloaded with his own nightmares. (This issue's artist, Herb Trimpe, is known for his depiction of robots in the Marvel universe. Also, though all set after the events of ''RoboCop 2'', these first 11 issues seem at odds with a lot of the events in that film. From this point on, Marvel tries to reconcile these contradictions.)<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||12||align=&quot;center&quot;| ''Purgatory'' ||align=&quot;center&quot;| [[Simon Furman]]/Lee Sullivan||align=&quot;center&quot;| December 18, 1990 (cover dated February 1991)<br /> |-<br /> |colspan=&quot;6&quot;| Men are being abducted off the streets of Old Detroit and the criminals behind this are in Purgatory; a section of the city that's off limits to cops by strict order of OCP. Defying this order, RoboCop enters the crime-infested streets of Purgatory in search of the perpetrators. (This is the first issue written after the completion of RoboCop 2 and the events of that film are referenced here. Though there is no indication that OCP now owns Detroit as was established in that film. This is also when OCP executives, including &quot;The Old Man and Johnson&quot; are shown to change from simply morally lost to villainy. It is the first issue to feature the revised OCP logo from &quot;RoboCop 2&quot;.)<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||13||align=&quot;center&quot;| ''Past Sins'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| January 15, 1991 (cover dated March 1991)<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop's journey through Purgatory leads him to a factory secretly creating an army of new RoboCops under the direction of OCP. To his horror, he sees the men who have been abducted off the streets are lobotomized before undergoing extensive surgery to becoming cyborgs for OCP. Enraged and horrified, RoboCop attacks the factory. Along the way he meets a guilt stricken scientist named Thyle who was involved with his own creation back with Bob Morton.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||14||align=&quot;center&quot;| ''Dreams'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| February 19, 1991 (cover dated April 1991)<br /> |-<br /> |colspan=&quot;6&quot;| The OCP executive behind the RoboCop factory turns out to be Johnson, the Old Man's newest right hand man. Hoping to speed up the Old Man's plans for Delta City, he wanted to create an armada of RoboCops in the hopes to eliminate crime faster. To keep the factory a secret, Johnson plays his final card with a remote to RoboCop's self-destruct program. 
(This is the first time that Johnson is depicted as an enemy of RoboCop. This change in behavior, the change of name from &quot;Donald&quot; in the film to &quot;Daniel&quot; in the comics, and an increasing lack of resemblance to the character in the film have led to speculation that this may be a different OCP executive, perhaps a relative of the original movie character.)<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||15||align=&quot;center&quot;| ''Ashes'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| March 19, 1991 (cover dated May 1991)<br /> |-<br /> |colspan=&quot;6&quot;| Thyle disarms Murphy's self-destruct explosive and they both return to the factory. Johnson activates the RoboCops out of desperation, but the cyborgs lack the human balance needed to keep their programming in check. They die, burning the factory down. The Old Man shows up and Murphy threatens him with evidence from the factory, using it as leverage to free himself from OCP control.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||16||align=&quot;center&quot;| ''TV Crimes'' ||align=&quot;center&quot;| Simon Furman/[[Andrew Wildman]]||align=&quot;center&quot;| April 16, 1991 (cover dated June 1991)<br /> |-<br /> |colspan=&quot;6&quot;| The owners of a brand-new device, Implant TV, are mind-controlling people through their televisions and making them unknowingly commit crimes. Having recovered his free will, RoboCop takes the case and apprehends the criminals. During this mission, he encounters his former self and experiences a unification of his two identities. (Murphy's gravestone on the cover is marked 1981–2015, implying the original film takes place in the year 2015. This is one of the few places where dates are alluded to in RoboCop media, another being the original arcade game, which states the film's setting as 1990.)<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||17||align=&quot;center&quot;| ''Private Lives'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| May 21, 1991 (cover dated July 1991)<br /> |-<br /> |colspan=&quot;6&quot;| Anne's husband is taken hostage by the Wraith, a new, mysterious criminal. He wants to blackmail Anne into giving him classified information about future police operations. She seems to cave in and meets with the Wraith to surrender the documents, but it all turns out to be a bluff to rescue the hostage. Murphy saves Anne at the last minute, but the Wraith escapes with her husband.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||18||align=&quot;center&quot;| ''Mind Bomb: Part 1'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| June 18, 1991 (cover dated August 1991)<br /> |-<br /> |colspan=&quot;6&quot;| A criminal known as Lot's Wife makes a move against RoboCop with an attack on Metro West itself. Using a catatonic man with an unnatural gift, she sets him up to be arrested and taken into the station, where he emits a wave of influential psychic vibrations.
These vibrations cause everyone around him to have violent episodes and an aggressive disregard for the safety of others.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||19||align=&quot;center&quot;| ''Mind Bomb: Part 2'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| July 16, 1991 (cover dated September 1991)<br /> |-<br /> |colspan=&quot;6&quot;| In a surprising prologue, Ellen Murphy arrives home at night to find Jimmy in the clutches of a whip-swinging costumed villain. Meanwhile, facing the psychic backlash of the stranger in the precinct, RoboCop battles an internal conflict, with his humanity at odds with his machine half. With the humanity half proving more dominant, RoboCop discovers the stranger behind the insanity at Metro West and kills him to save the precinct.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||20||align=&quot;center&quot;| ''The Cutting Edge'' ||align=&quot;center&quot;| Simon Furman/Andrew Wildman||align=&quot;center&quot;| August 20, 1991 (cover dated October 1991)<br /> |-<br /> |colspan=&quot;6&quot;| Now in total control of himself, without directives dictating his actions, RoboCop strives to be a regular cop again without the use of his robotic aids, fearing his programming could take him over again if he relies on it. In the meantime, he finds out that his wife and son have been kidnapped. During the case, Murphy decides the only way to help his family is to cooperate with his programming once more, despite the personal consequences.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||21||align=&quot;center&quot;| ''Beyond the Law: Part 1'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| September 17, 1991 (cover dated November 1991)<br /> |-<br /> |colspan=&quot;6&quot;| Using his resources for intel, the Old Man informs RoboCop that his family has been taken hostage by a rebel leader named Aza in San Arica. In 22 hours, Aza will contact RoboCop to assassinate San Arica's president, or his wife and son will be killed. An OCP jet is offered to fly RoboCop to San Arica to save his family, which he reluctantly accepts. He knows OCP has some personal stake in his going to San Arica but also knows he has no other choice.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||22||align=&quot;center&quot;| ''Beyond the Law: Part 2'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| October 15, 1991 (cover dated December 1991)<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop arrives in San Arica and is discovered by a band of rebels under Aza's command. Quickly taking them out to ensure his family's safety, RoboCop notices their special weapons resembling ones from OCP. Meeting up with an anti-rebel guide, RoboCop learns there are rumors of OCP funding Aza and his rebellion.
As they draw closer to Aza's base, RoboCop is ambushed by Aza himself and captured while his wife watches from nearby.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||23||align=&quot;center&quot;| ''Beyond the Law: Part 3'' ||align=&quot;center&quot;| Simon Furman/Lee Sullivan||align=&quot;center&quot;| November 19, 1991 (cover dated January 1992)<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop breaks free and kills Aza and his men, but finds himself under an attack by Colonel Flak, a special agent of OCP. He is ordered to kill any and all witnesses of RoboCop's rescue mission. Outfitted with more weapons, Flak proves to be a difficult opponent, but RoboCop manages to kill him. In an attempt to keep her safe, RoboCop decides to mislead his wife into believing he truly is a machine and not Alex Murphy. He decides never to see her again.<br /> |}<br /> <br /> ===''RoboCop 2'' (3-issue film adaptation mini-series) (Aug–Sep 1990)===<br /> This series was originally released as a black-and-white magazine format (just like the first film adaptation) on June 5, 1990, and then was printed in a color trade paperback format on June 12, 1990 before finally being split up and released as a three-issue mini-series (which was also in color) starting on June 26, 1990, and running through to July 24.<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Title !! Writer / penciller !! Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| ''Kids Stuff'' ||align=&quot;center&quot;| [[Alan Grant (writer)|Alan Grant]]/[[Mark Bagley]]||align=&quot;center&quot;| June 26, 1990 (cover dated Late August 1990)<br /> |-<br /> |colspan=&quot;6&quot;| See ''[[RoboCop 2]]'' for plot summary.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| ''Nuke Out'' ||align=&quot;center&quot;| Alan Grant/Mark Bagley||align=&quot;center&quot;| July 10, 1990 (cover dated Early September 1990)<br /> |-<br /> |colspan=&quot;6&quot;| See ''RoboCop 2'' for plot summary.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| ''The Mark of Cain'' ||align=&quot;center&quot;| Alan Grant/Mark Bagley||align=&quot;center&quot;| July 24, 1990 (cover dated Late September 1990)<br /> |-<br /> |colspan=&quot;6&quot;| See ''RoboCop 2'' for plot summary.<br /> |}<br /> <br /> ==Dark Horse Comics==<br /> The comic book license for ''RoboCop'' was then acquired by [[Dark Horse Comics]]. Between May&lt;ref&gt;{{Cite web|url=http://www.darkhorse.com/Comics/92-071/RoboCop-vs-Terminator-1-of-4|title = RoboCop vs. Terminator #1 (Of 4) :: Profile :: Dark Horse Comics}}&lt;/ref&gt; and August 1992,&lt;ref&gt;{{Cite web|url=http://www.darkhorse.com/Comics/92-135/RoboCop-vs-Terminator-4-of-4|title = RoboCop vs. Terminator #4 (Of 4) :: Profile :: Dark Horse Comics}}&lt;/ref&gt; Dark Horse released a four issue mini-series ''[[RoboCop Versus The Terminator (comics)|RoboCop Versus The Terminator]]'', written by [[Frank Miller]], with artwork by [[Walt Simonson]]. 
This led to several new RoboCop mini-series by Dark Horse as follows:<br /> <br /> ===''RoboCop Versus The Terminator'' (4-issue mini-series) (Sep–Dec 1992)===<br /> See [[RoboCop Versus The Terminator (comics)|''RoboCop Versus The Terminator'']] for main article and issue summaries.<br /> <br /> ===''RoboCop: Prime Suspect'' (4-issue mini-series) (Oct 1992–Jan 1993)===<br /> This mini-series follows RoboCop being framed for murder and his attempts to clear his name. It takes place shortly after ''RoboCop 3''.<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Writer / penciller !! Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| [[John Arcudi]]/[[John Paul Leon]]||align=&quot;center&quot;| October 20, 1992<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| John Arcudi/John Paul Leon||align=&quot;center&quot;| November 17, 1992<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| John Arcudi/John Paul Leon||align=&quot;center&quot;| December 15, 1992<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||4||align=&quot;center&quot;| John Arcudi/John Paul Leon||align=&quot;center&quot;| January 19, 1993<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |}<br /> <br /> ===''RoboCop 3'' (3-issue film adaptation mini-series) (Jul–Nov 1993)===<br /> This mini-series adapts the [[RoboCop 3|film of the same name]].<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Writer / penciller !! Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| [[Steven Grant]]/Hoang Nguyen||align=&quot;center&quot;| July 20, 1993<br /> |-<br /> |colspan=&quot;6&quot;| See ''RoboCop 3'' for plot summary.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| Steven Grant/Hoang Nguyen||align=&quot;center&quot;| August 31, 1993<br /> |-<br /> |colspan=&quot;6&quot;| See ''RoboCop 3'' for plot summary.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| Steven Grant/Hoang Nguyen||align=&quot;center&quot;| November 2, 1993<br /> |-<br /> |colspan=&quot;6&quot;| See ''RoboCop 3'' for plot summary.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |}<br /> <br /> ===''RoboCop: Mortal Coils'' (4-issue mini-series) (Sep-Dec 1993)===<br /> This mini-series has RoboCop chasing down some criminals related to a coffin from a recent OCP break-in as he follows them to a snowy [[Denver, Colorado]].<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Writer / penciller !! 
Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| [[Steven Grant]]/Nick Gazzo||align=&quot;center&quot;| September 21, 1993<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| Steven Grant/Nick Gazzo||align=&quot;center&quot;| October 19, 1993<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| Steven Grant/Nick Gazzo||align=&quot;center&quot;| November 16, 1993<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||4||align=&quot;center&quot;| Steven Grant/Nick Gazzo||align=&quot;center&quot;| December 21, 1993<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |}<br /> <br /> ===''RoboCop: Roulette'' (4-issue mini-series) (Dec 1993–Mar 1994)===<br /> Bodies keep popping up in Old Detroit which slowly keep leading back to OCP. There is also an ED-209 unit on a rampage which RoboCop must put a stop to. It takes place right after ''RoboCop: Prime Suspect''.<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Writer / penciller !! Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| [[John Arcudi]]/[[Mitch Byrd]]||align=&quot;center&quot;| January 11, 1994 (cover dated December 1993)<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| John Arcudi/Mitch Byrd||align=&quot;center&quot;| February 1994 (cover dated January 1994)<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| John Arcudi/Mitch Byrd||align=&quot;center&quot;| March 1994 (cover dated February 1994)<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||4||align=&quot;center&quot;| John Arcudi/Mitch Byrd||align=&quot;center&quot;| April 1994 (cover dated March 1994)<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |}<br /> <br /> Two mini-stories were also published in the series &quot;Dark Horse Comics&quot;.<br /> *Dark Horse Comics #1–3 provided the events that led up to the story presented in ''Prime Suspect''.<br /> *Dark Horse Comics #6–9 provided the events that led up to the story presented in ''Mortal Coils''.<br /> <br /> A '''RoboCop versus Predator''' comic was proposed for Dark Horse. Some of the proposal pages by Joshua Boulet can be seen at the RoboCop Archive website.<br /> <br /> ==Avatar Press==<br /> {{Infobox comic book title &lt;!--Wikipedia:WikiProject Comics--&gt;<br /> |title = Frank Miller's RoboCop<br /> |image = Frank Miller RoboCop 1.jpg<br /> |caption = Cover of first issue. 
Cover art by [[Frank Miller]]<br /> |schedule = Irregular<br /> |limited = y<br /> |genre = [[Crime fiction|Crime]]&lt;br/&gt;[[Science fiction]] ([[Cyberpunk]])&lt;br/&gt;[[Thriller (genre)|Thriller]]<br /> |publisher = [[Avatar Press]]<br /> |startmo = July<br /> |startyr = 2003<br /> |endmo = January<br /> |endyr = 2006<br /> |issues = 9<br /> |main_char_team = [[RoboCop (character)|Alex J. Murphy/RoboCop]]<br /> |writers = [[Steven Grant]]<br /> |artists = [[Juan Jose Ryp]]<br /> |pencillers = <br /> |inkers = <br /> |letterers = <br /> |colorists = <br /> |editors = <br /> |creative_team_month = <br /> |creative_team_year = <br /> |creators = [[Frank Miller]] (original screenplay, concept supervisor)&lt;br /&gt;[[Steven Grant]] (writer)&lt;br /&gt;[[Juan Jose Ryp]] (illustrator)<br /> |TPB = <br /> |ISBN = <br /> |TPB# = <br /> |ISBN# = <br /> |subcat = Avatar Press<br /> |sort = RoboCop<br /> }}<br /> Almost a decade later, the comic rights to ''RoboCop'' were acquired by Avatar Press. Upon announcing the acquisition, the company's publisher, William Christensen, received several offers from artists and writers hoping to contribute to the project (which eventually led to the [[Avatar Press|Avatar]] one-shot ''RoboCop: Killing Machine'').<br /> <br /> ===Frank Miller's RoboCop (9-issue ongoing series) (Jul 2003–Jan 2006)===<br /> William Christensen was interested in producing a comic adaptation of Miller's &quot;lost&quot; screenplay, of which he possessed a copy. Christensen soon got in contact with Miller, who was enthusiastic about the idea of his story finally being told uncensored.<br /> <br /> The series was personally overseen by Miller, based on his own unused screenplay for the film ''[[RoboCop 2]]'' and notes of unused ideas for ''[[RoboCop 3]]''; however, scheduling prohibited him from personally writing the comic adaptation or illustrating it. It was written by [[Steven Grant]], a long-time acquaintance of Miller's who had written the comic adaptation of ''RoboCop 3'' for [[Dark Horse Comics]]. [[Juan Jose Ryp]], best known for illustrating the Avatar comic ''[[Another Suburban Romance]]'' (written by [[Alan Moore]]), became the title's illustrator while Miller drew covers.<br /> <br /> The series was composed of nine issues that were published from August [[2003 in comics|2003]] through February [[2004 in comics|2004]] under [[Avatar Press|Avatar]]'s Pulsar Press line, which specializes in licensed comic properties from movies. Issues featured covers by Miller and alternative covers by Ryp.<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Writer / penciller !! Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| [[Steven Grant]] (from [[Frank Miller]]'s script)/[[Juan Jose Ryp]]||align=&quot;center&quot;| July 2003<br /> |-<br /> |colspan=&quot;6&quot;| When OCP learns that RoboCop still has emotions they deem him obsolete. 
Now he must battle both the crime of Old Detroit as well as his new and &quot;improved&quot; model.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Juan Jose Ryp||align=&quot;center&quot;| September 2003<br /> |-<br /> |colspan=&quot;6&quot;| As RoboCop refuses to follow the morally misaligned OCP they start bringing in militarized mercenaries to enforce the law while also working on a new set of RoboCops.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Juan Jose Ryp||align=&quot;center&quot;| September 2003<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop must fight off the mercenaries who have now been turned on him.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||4||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Juan Jose Ryp||align=&quot;center&quot;| December 2003<br /> |-<br /> |colspan=&quot;6&quot;| Now that OCP has cut down their police force they have also reprogrammed RoboCop which is causing him to go insane.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||5||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Juan Jose Ryp||align=&quot;center&quot;| February 2004<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop must face the new and superior RoboCop while Officer Lewis is hunted down by a madman and the mercenaries continue to cause havoc.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||6||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Juan Jose Ryp||align=&quot;center&quot;| June 2004<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop and his replacement carry their fight down to the subway.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||7||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Juan Jose Ryp||align=&quot;center&quot;| October 2004<br /> |-<br /> |colspan=&quot;6&quot;| Delta City goes into further chaos as the Rehabs start taking down the cops and RoboCop goes straight to OCP.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||8||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Juan Jose Ryp||align=&quot;center&quot;| April 2005<br /> |-<br /> |colspan=&quot;6&quot;| RoboCop continues the battle as he is worn down and RoboCop 2 is now used by an even deadlier opponent.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||9||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Juan Jose Ryp||align=&quot;center&quot;| January 2006<br /> |-<br /> |colspan=&quot;6&quot;| The police get in their final battle with the Rehabs just as RoboCop and RoboCop 2 reach their climactic final battle as well.<br /> |}<br /> <br /> ====Reception====<br /> Critical reaction to Frank Miller's ''RoboCop'' comic has been mixed. 
Randy Lander of the comic review site The Fourth Rail gave the first issue a score of 7 out of 10, saying that &quot;there's not a lot of personality to the book&quot; but adding that it is &quot;certainly interesting to read and full of potential.&quot;&lt;ref&gt;[http://www.thefourthrail.com/reviews/snapjudgments/081103/frankmillerrobocop1.shtml Review by Randy Lander] {{webarchive|url=https://web.archive.org/web/20080925233011/http://www.thefourthrail.com/reviews/snapjudgments/081103/frankmillerrobocop1.shtml |date=2008-09-25 }}, The Fourth Rail&lt;/ref&gt;<br /> <br /> Ken Tucker of ''[[Entertainment Weekly]]'' gave the comic a &quot;D&quot; score, criticizing the &quot;tired story&quot; and lack of &quot;interesting action.&quot;&lt;ref&gt;[http://www.ew.com/ew/article/0,,479889,00.html Review by Ken Tucker], Entertainment Weekly, September 5, 2003&lt;/ref&gt; A [[recapping|recap]] written for the pop culture humor website I-Mockery said, &quot;Having spent quite a lot of time with these comics over the past several days researching and writing this article, I can honestly say that it makes me want to watch the movie version of ''[[RoboCop 2]]'' again just so I can get the bad taste out of my mouth. Or prove to myself that the movie couldn't be worse than this.&quot;&lt;ref&gt;[http://www.i-mockery.com/comics/longbox24/default.php &quot;Frank Miller's Roboflop&quot;], I-Mockery, March 31, 2008&lt;/ref&gt;<br /> <br /> ====Continuity====<br /> Picking up after the events of the first film, Frank Miller's vision is quite different from the comics that came before and is at odds with established continuity, especially ''RoboCop 3'' and the Dark Horse Comics run.<br /> <br /> ===RoboCop: Killing Machine (one-shot) (Aug 2004)===<br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Writer / penciller !! Publication date<br /> |-<br /> ||[[Steven Grant]]/Anderson Ricardo||align=&quot;center&quot;| August 2004<br /> |-<br /> |colspan=&quot;6&quot;| A kid hacker starts causing power outages and other problems throughout Old Detroit and even ends up unleashing a hidden OCP machine, which malfunctions and becomes a killing machine.<br /> |}<br /> <br /> ===RoboCop: Wild Child (one-shot) (Jan 2005)===<br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Writer / penciller !! Publication date<br /> |-<br /> ||[[Steven Grant]]/Carlos Ferreira||align=&quot;center&quot;| January 2005<br /> |-<br /> |colspan=&quot;6&quot;| The cops are still on strike and there is chaos throughout Old Detroit. Officer Lewis' little sister Heaven and her gang come to town and start causing trouble, creating a difficult moral dilemma for Lewis and RoboCop.<br /> |}<br /> Wild Child contained an advertisement for a three-issue mini-series called &quot;Robocop: War Party&quot;, again from Grant &amp; Ferreira.
However, the mini-series was never published, and it is not clear how much work was done on it apart from Ferreira's full-page advert.<br /> <br /> ==Dynamite Entertainment==<br /> <br /> ===''RoboCop Vol 1: Revolution''===<br /> [[Dynamite Entertainment]] announced they would be producing the next ''RoboCop'' comic&lt;ref&gt;{{cite web |first=Matt |last=Brady |url=http://www.newsarama.com/comics/060910-Robocop.html |title=RoboCop Returns to Comics with Dynamite |publisher=[[Newsarama]] |date=June 10, 2009 |access-date=2009-06-17}}&lt;/ref&gt; with writer [[Rob Williams (comics)|Rob Williams]]&lt;ref&gt;{{cite web |first=Matt |last=Brady |url=http://www.newsarama.com/comics/060916-Robocop-Williams.html |title=Man and Machine - Rob Williams on Dynamite's RoboCop |publisher=[[Newsarama]] |date=June 16, 2009 |access-date=2009-06-17}}&lt;/ref&gt; and artist Fabiano Neves.&lt;ref&gt;{{cite web |first=Matt |last=Brady |url=http://www.newsarama.com/comics/060911-Nick-Robocop.html |title=Nick Barrucci on RoboCop &amp; Writer Rob Williams |publisher=[[Newsarama]] |date=June 11, 2009 |access-date=2009-06-17}}&lt;/ref&gt; The first Dynamite solo adventure was &quot;Revolution&quot;, which was later collected as a trade paperback.<br /> <br /> ===''Terminator/RoboCop: Kill Human Vol 1''===<br /> ''[[List of Terminator comics#Terminator/RoboCop: Kill Human (2011)|Terminator/RoboCop: Kill Human Vol 1]]'' was the first Dynamite RoboCop and Terminator crossover, and the second such crossover overall. It was later collected as a trade paperback.<br /> <br /> ===''RoboCop Vol 2: Road Trip''===<br /> The second Dynamite solo adventure. A trade paperback was scheduled for release in August 2014 but did not materialize.<br /> <br /> ==BOOM! Studios==<br /> In 2013, [[Boom! Studios]] obtained the rights to produce a new ''RoboCop'' series as well as to republish ''Frank Miller's RoboCop''.<br /> <br /> ===RoboCop Vol 1===<br /> A republishing of the Avatar series ''Frank Miller's RoboCop''. BOOM! Studios released their own trade paperback of the series under the name ''RoboCop Volume One'', which was larger in size and featured some black-and-white sketches as additional material.<br /> <br /> ===RoboCop Vol 2: Last Stand Part One===<br /> ''Last Stand'' is an eight-issue mini-series written by [[Steven Grant]], adapting Frank Miller's original screenplay for ''RoboCop 3''. The first four issues have been collected in trade paperback.<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Writer / penciller !! Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| [[Steven Grant]] (from [[Frank Miller]]'s script)/Korkut Öztekin||align=&quot;center&quot;| August 2013<br /> |-<br /> |colspan=&quot;6&quot;| Long after the events of the first film, the police force was shut down by OCP. RoboCop is now a renegade, protecting the people of Detroit from OCP, which is trying to take over their homes in order to finally build the long-dreamed-of Delta City.<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Korkut Öztekin||align=&quot;center&quot;| September 2013<br /> |-<br /> |colspan=&quot;6&quot;| Helped by his new ally, Marie, RoboCop continues to help the citizens of Old Detroit by completing a rescue raid on a clinic. A Japanese corporation sends a mysterious representative to OCP's offices.
<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Korkut Öztekin||align=&quot;center&quot;| October 2013<br /> |-<br /> |colspan=&quot;6&quot;| <br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||4||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Korkut Öztekin||align=&quot;center&quot;| November 2013<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> |}<br /> <br /> ===RoboCop Vol 3: Last Stand Part Two===<br /> The last four issues were published in trade paperback in December 2014.<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Writer / penciller !! Publication date<br /> |-<br /> ||5||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Korkut Öztekin||align=&quot;center&quot;| December 2013<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||6||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Korkut Öztekin||align=&quot;center&quot;| January 2014<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||7||align=&quot;center&quot;| Steven Grant (from Frank Miller's script)/Korkut Öztekin||align=&quot;center&quot;| February 2014<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||8||align=&quot;center&quot;| Ed Brisson (lone writing credit)/Korkut Öztekin||align=&quot;center&quot;| March 2014<br /> |-<br /> |colspan=&quot;6&quot;|<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> |}<br /> <br /> ===RoboCop: The Human Element===<br /> BOOM! Studios published four one-shot comics set in the [[Robocop (2014 film)|2014 film reboot universe]]. These were collected in a trade paperback under the banner title &quot;RoboCop: The Human Element&quot;, in which the stories are presented in reverse order of publication. A fifth comic, entitled &quot;The Gauntlet&quot;, was made available only as a digital download with [[Target Corporation|Target]]'s exclusive edition of the Blu-ray film release and was not included in the collected trade paperback.<br /> <br /> &quot;Beta&quot; is the first RoboCop story not to feature the character of RoboCop himself, as it follows a soldier who becomes a RoboSoldier.<br /> <br /> {| class=&quot;wikitable&quot;<br /> |-<br /> ! Issue # !! Writer / penciller !!
Publication date<br /> |-<br /> ||1||align=&quot;center&quot;| Michael Moreci / Art by Jason Copland ||align=&quot;center&quot;| 10 Feb 2014<br /> |-<br /> |colspan=&quot;6&quot;| '''Hominem Ex Machina'''<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||2||align=&quot;center&quot;| Joe Harris / Art by Piotr Kowalski||align=&quot;center&quot;| 17 Feb 2014<br /> |-<br /> |colspan=&quot;6&quot;| '''To Live And Die In Detroit''' <br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||3||align=&quot;center&quot;| Frank J. Barbiere / Art by João &quot;Azeitona&quot; Vieira||align=&quot;center&quot;| 24 Feb 2014<br /> |-<br /> |colspan=&quot;6&quot;| '''Memento Mori'''<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||4||align=&quot;center&quot;| Ed Brisson / Art by Emilio Laiso||align=&quot;center&quot;| 3 Mar 2014<br /> |-<br /> |colspan=&quot;6&quot;| '''Beta'''<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> ||5||align=&quot;center&quot;| Michael Moreci / Art by Janusz Ordon &amp; Korkut Öztekin ||align=&quot;center&quot;| 3 Jun 2014 (digital download with Target Blu-ray)<br /> |-<br /> |colspan=&quot;6&quot;| '''Gauntlet'''<br /> |-<br /> |colspan=&quot;6&quot; bgcolor=&quot;#112266&quot;| &lt;!-- Putting in a space between issues --&gt;<br /> |-<br /> |}<br /> <br /> ===RoboCop: Dead Or Alive===<br /> In 2015, following the movie reboot tie-ins, BOOM! announced a new ongoing series set after the events of the original 1987 film. Individual titles were simply published as &quot;RoboCop&quot;, although collected trade paperbacks are titled &quot;RoboCop: Dead Or Alive&quot;. The 12-issue series was written by Joshua Williamson, with art by Carlos Magno for the first eight issues, while Dennis Culver handled penciling duties for the final four issues.<br /> <br /> ===RoboCop: Citizens Arrest===<br /> In January 2018, BOOM! Studios announced a new ongoing series set 30 years after the events of the first film, in which &quot;justice is crowdsourced&quot;. The first issue was published in April 2018. <br /> It was written by [[Brian Wood (comics)|Brian Wood]] with art by Jorge Coelho.<br /> <br /> In the decades since the RoboCop program first began, corporations have taken over public services and the government—and law enforcement is the biggest private contract of all. Traditional police forces no longer exist as all citizens are encouraged—and rewarded—to spy on their neighbors. There is only one authority on the streets: ROBOCOP.<br /> <br /> ==Collected editions==<br /> '''MARVEL'''<br /> * RoboCop (movie adaptation)<br /> * RoboCop 2 (movie adaptation)<br /> <br /> '''DARK HORSE'''<br /> * RoboCop: Prime Suspect<br /> * RoboCop Versus The Terminator (original Dark Horse TPB)<br /> <br /> '''AVATAR'''<br /> * Frank Miller's RoboCop<br /> <br /> '''DYNAMITE'''<br /> * RoboCop Vol 1: Revolution <br /> * Terminator/RoboCop: Kill Human<br /> <br /> '''BOOM!
STUDIOS'''<br /> * RoboCop Vol 1 (reprint of Avatar series &quot;Frank Miller's RoboCop&quot;)<br /> * RoboCop Vol 2: Last Stand Part One (collects issues #1-4)<br /> * RoboCop: The Human Element (collects the one-shots &quot;Beta&quot;, &quot;Memento Mori&quot;, &quot;To Live And Die In Detroit&quot; &amp; &quot;Hominem Ex Machina&quot;)<br /> * RoboCop Versus The Terminator (remastered edition of Dark Horse series, hardcover, July 2014)<br /> * RoboCop Versus The Terminator: Gallery Edition (Oversized black &amp; white edition of Dark Horse series, hardcover, July 2014)<br /> * RoboCop Vol 3: Last Stand Part Two (collects issues #5-8, Dec 2014)<br /> * RoboCop: Dead Or Alive Volume One (collects issues #1-4, Aug 2015)<br /> * RoboCop: Dead Or Alive Volume Two (collects issues #5-8, Feb 2016)<br /> * The Complete Frank Miller RoboCop Omnibus (collects RoboCop Vol 1-3, Dec 2016)<br /> * RoboCop: Dead Or Alive Volume Three (collects issues #9-12, Mar 2017)<br /> * RoboCop: Citizens Arrest (collects all issues #1-5, Dec 2018)<br /> All Boom! Studios collected editions from ''The Human Element'' onwards are available on [[ComiXology]] with animated zooming panels, except the Gallery Edition of ''RoboCop Versus The Terminator''.<br /> <br /> The following were previously listed on Amazon for release but have not been released to date:<br /> <br /> * RoboCop Omnibus (Dark Horse)<br /> * RoboCop Vol 2: Road Trip (Dynamite)<br /> <br /> ==References==<br /> {{Reflist}}<br /> <br /> ==External links==<br /> * [http://www.avatarpress.com/robocop The official website for the mini-series by Avatar Press]<br /> * [http://comicsfondle.net/emphases/robocop/ Reviews of the various RoboCop comic book series]<br /> <br /> {{RoboCop}}<br /> {{Frank Miller}}<br /> <br /> [[Category:1990 comics debuts]]<br /> [[Category:1992 comics debuts]]<br /> [[Category:2009 comics debuts]]<br /> [[Category:Comics by Frank Miller (comics)]]<br /> [[Category:Dark Horse Comics limited series]]<br /> [[Category:Dynamite Entertainment titles]]<br /> [[Category:Marvel Comics titles]]<br /> [[Category:RoboCop comics]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Terminator_(franchise)&diff=1169392983 Terminator (franchise) 2023-08-08T20:58:14Z <p>205.189.94.9: </p> <hr /> <div>{{Short description|Science fiction action media franchise}}<br /> {{DISPLAYTITLE:''Terminator'' (franchise)}}<br /> {{Use American English|date=November 2019}}<br /> {{Use mdy dates|date=March 2017}}<br /> {{Infobox media franchise<br /> | title = ''Terminator''<br /> | image = Terminator (franchise logo).svg<br /> | image_size = 220px<br /> | caption = Official franchise logo from the latest film<br /> | creator = [[James Cameron]]&lt;br&gt;[[Gale Anne Hurd]]<br /> | origin = ''[[The Terminator]]'' (1984)<br /> | owner = [[StudioCanal]]&lt;ref&gt;{{cite web |url= https://twitter.com/TerminatorRPG/status/1343697988443639817 |title=Terminator RPG}}&lt;/ref&gt;&lt;ref&gt;{{cite web |url= https://www.prime1studio.com/t1-t-800-endoskeleton-hdmmt1-01ex.html |title=T-800 Endoskeleton}}&lt;/ref&gt;&lt;ref&gt;{{cite web |url= https://store.steampowered.com/app/954740/Terminator_Resistance/ |title=Terminator: Resistance}}&lt;/ref&gt; ([[Vivendi]]){{efn|[[Skydance Media]] currently owns the rights just to produce new ''Terminator'' films. 
Other elements, such as some of the films and trademarks relating to them, are owned by other entities.}}<br /> | years = 1984–present<br /> | books =<br /> | novels = ''[[T2 (novel series)|T2]]''<br /> | comics = [[List of Terminator comics|List of comics]]<br /> | magazines =<br /> | strips =<br /> | films = {{Plainlist|<br /> * ''[[The Terminator]]'' (1984)<br /> * ''[[Terminator 2: Judgment Day]]'' (1991)<br /> * ''[[Terminator 3: Rise of the Machines]]'' (2003)<br /> * ''[[Terminator Salvation]]'' (2009)<br /> * ''[[Terminator Genisys]]'' (2015)<br /> * ''[[Terminator: Dark Fate]]'' (2019)<br /> }}<br /> | tv = ''[[Terminator: The Sarah Connor Chronicles]]'' (2008–2009)<br /> | wtv = {{Plainlist|<br /> * ''[[Terminator Salvation: The Machinima Series]]'' (2009)<br /> * ''Terminator Genisys: The YouTube Chronicles'' (2015)<br /> }}<br /> | games = ''[[The Terminator Collectible Card Game]]'' (2000)<br /> | rpgs =<br /> | vgs = [[List of Terminator video games|List of video games]]<br /> | radio =<br /> | soundtracks = {{Plainlist|<br /> * ''[[The Terminator (soundtrack)|The Terminator]]'' (1984)<br /> * ''[[Terminator 2: Judgment Day (score)|Terminator 2: Judgment Day]]'' (1991)<br /> * ''[[Terminator 3: Rise of the Machines#Soundtrack|Terminator 3: Rise of the Machines]]'' (2003)<br /> * ''[[Terminator: The Sarah Connor Chronicles (soundtrack)|Terminator: The Sarah Connor Chronicles]]'' (2008)<br /> * ''[[Terminator Salvation#Music|Terminator Salvation]]'' (2009)<br /> * ''[[Terminator Genisys (soundtrack)|Terminator Genisys]]'' (2015)<br /> * ''[[Terminator: Dark Fate (soundtrack)|Terminator: Dark Fate]]'' (2019)<br /> }}<br /> | toys =<br /> | attractions = {{Plainlist|<br /> * ''[[T2-3D: Battle Across Time]]'' (1996–2020)<br /> * [[Apocalypse: The Ride|Terminator Salvation: The Ride]] (2009–2010)<br /> * [[Terminator X: A Laser Battle for Salvation]] (2009–2015)<br /> }}<br /> | otherlabel1 = <br /> | otherdata1 = <br /> | website = [https://www.paramountmovies.com/movies/terminator-dark-fate Terminator on Paramount Pictures]<br /> }}<br /> '''''Terminator''''' is&lt;!-- see WP:FILMNOW --&gt; an American [[cyberpunk]]&lt;ref&gt;Elias, Herlander. Cyberpunk 2.0: fiction and contemporary. No. 2nd Edition. Covilhã: LabCom Books, 2009, 2009.&lt;/ref&gt;&lt;ref&gt;Nandi, Arindam. &quot;“You Were Made as Well as We Could Make You”: Posthuman Identity Formations in James Cameron’s Terminator Dilogy, Ridley Scott’s Blade Runner, and the Wachowski Brothers’ the Matrix Trilogy.&quot; Quarterly Review of Film and Video (2023): 1-20.&lt;/ref&gt; [[media franchise]] created by [[James Cameron]] and [[Gale Anne Hurd]]. The franchise encompasses a series of [[science fiction action film]]s, comics, novels and additional media, concerning a total war between [[Skynet (Terminator)|Skynet]]'s [[synthetic intelligence]] – a [[self-aware]] military machine network – and [[John Connor]]'s Resistance forces comprising the survivors of the [[human|human race]]. Skynet's most famous products in its genocidal goals are [[Terminator (character concept)|the various terminator models]], such as the [[Terminator (character)|T-800]], who was portrayed by [[Arnold Schwarzenegger]] from the [[The Terminator|original ''Terminator'' film]] in 1984. 
By 2010, the franchise had generated $3 billion in revenue.&lt;ref&gt;{{cite web |url= http://www.businesswire.com/news/home/20100217005514/en/Pacificor-Names-Latham-Watkins-Field-Terminator-Inquiries |title=Pacificor Names Latham &amp; Watkins to Field Terminator Inquiries |website=[[Business Wire]] |publisher=[[Berkshire Hathaway]] |date=February 17, 2010 |access-date=March 5, 2017 |archiveurl= https://web.archive.org/web/20170306034914/http://www.businesswire.com/news/home/20100217005514/en/Pacificor-Names-Latham-Watkins-Field-Terminator-Inquiries |archivedate=March 6, 2017}}&lt;/ref&gt;<br /> <br /> ==Setting==<br /> [[File:Terminator(Future War).png|thumb|left|Concept art illustrating the conflicts between Skynet and the Resistance in a post-apocalyptic, futuristic setting, envisioned by creator James Cameron for the 1984 film ''[[The Terminator]]''.]]<br /> <br /> The central theme of the franchise is the battle for survival between the nearly-extinct human race and the world-spanning [[synthetic intelligence]] that is [[Skynet (Terminator)|Skynet]]. Skynet is positioned in the first film, ''[[The Terminator]]'' (1984), as a U.S. strategic &quot;Global Digital Defense Network&quot; computer system by [[Cyberdyne Systems]] which becomes self-aware. Shortly after activation, Skynet perceives all humans as a threat to its existence and formulates a plan to systematically wipe out humanity itself. The system initiates a nuclear [[Pre-emptive nuclear strike|first strike]] against Russia, thereby ensuring a devastating second strike and a nuclear holocaust which wipes out much of humanity in the resulting nuclear war. In the post-apocalyptic aftermath, Skynet later builds up its own autonomous machine-based military capability which includes the [[Terminator (character concept)|Terminators]] used against individual human targets and therefore proceeds to wage a persistent total war against the surviving elements of humanity, some of whom have militarily organized themselves into a Resistance. At some point in this future, Skynet develops the capability of [[time travel]] and both it and the Resistance seek to use this technology in order to win the war; either by altering or accelerating past events or by preventing the apocalyptic timeline.<br /> <br /> {| class=&quot;wikitable mw-collapsible floatright&quot;<br /> ! scope=&quot;colgroup&quot; colspan=&quot;5&quot; style= | ''Terminator'' story chronology<br /> |-<br /> ! scope=&quot;col&quot; colspan=&quot;6&quot; | Original continuity<br /> |-<br /> | scope=&quot;rowgroup&quot; colspan=&quot;6&quot; | {{Unbulleted list center<br /> | ''[[The Terminator]]''<br /> | ''[[Terminator 2: Judgment Day]]''<br /> | ''[[Terminator 3: Rise of the Machines]]''<br /> | ''[[Terminator Salvation: The Machinima Series]]''<br /> | ''[[Terminator Salvation]]''<br /> }}<br /> |-<br /> ! scope=&quot;col&quot; colspan=&quot;3&quot; | ''Battle Across Time'' continuity<br /> |-<br /> | scope=&quot;rowgroup&quot; colspan=&quot;3&quot; | {{Unbulleted list center<br /> | ''[[The Terminator]]''<br /> | ''[[Terminator 2: Judgment Day]]''<br /> | ''[[T2-3D: Battle Across Time]]''<br /> }}<br /> |-<br /> ! scope=&quot;col&quot; colspan=&quot;3&quot; | ''The Sarah Connor Chronicles'' continuity<br /> |-<br /> | scope=&quot;rowgroup&quot; colspan=&quot;3&quot; | {{Unbulleted list center<br /> | ''[[The Terminator]]''<br /> | ''[[Terminator 2: Judgment Day]]''<br /> | ''[[Terminator: The Sarah Connor Chronicles]]''<br /> }}<br /> |-<br /> ! 
scope=&quot;col&quot; | ''Genisys'' continuity<br /> |-<br /> | scope=&quot;rowgroup&quot; colspan=&quot;2&quot; | {{Unbulleted list center<br /> | ''[[Terminator Genisys]]''<br /> | ''[[Terminator Genisys: Future War]]''<br /> }}<br /> |-<br /> ! scope=&quot;col&quot; colspan=&quot;3&quot; | ''Dark Fate'' continuity<br /> |-<br /> | scope=&quot;rowgroup&quot; colspan=&quot;3&quot; | {{Unbulleted list center<br /> | ''[[The Terminator]]''<br /> | ''[[Terminator 2: Judgment Day]]''<br /> | ''[[Terminator: Dark Fate]]''<br /> }}<br /> |-<br /> |}<br /> <br /> ===Judgment Day===<br /> &lt;!-- Please do not change this section title, it has an incoming link from Judgment Day (disambiguation). --&gt;<br /> <br /> In the franchise, Judgment Day (a reference to the biblical [[Last Judgment|Day of Judgment]]) is the date on which Skynet becomes self-aware and its creators panic and attempt to deactivate the network. As a result, Skynet perceives humanity as a threat and chooses to exterminate them. Skynet launches an all-out nuclear attack on Russia in order to provoke a nuclear counter-strike against the United States, knowing this will eliminate its human enemies. Due to time travel and the consequent ability to change the future, several differing dates are given for Judgment Day. In ''[[Terminator 2: Judgment Day]]'' (1991), [[Sarah Connor (Terminator)|Sarah Connor]] states that Judgment Day will occur on August 29, 1997. However, this date is delayed following the attack on Cyberdyne Systems in the second film.<br /> <br /> [[File:Terminator Film Franchise Continuity.svg|thumb|left|An infographic illustrating the continuity between the various timelines in the ''Terminator'' franchise.]]<br /> Judgment Day has various different dates in different timelines of the subsequent films, as well as the television series, creating a multiverse of temporal phenomena. In ''[[Terminator 3: Rise of the Machines]]'' (2003) and ''[[Terminator Salvation]]'' (2009), Judgment Day was postponed to July 2003.&lt;ref&gt;{{cite book |last=Hagberg |first=David |title=Terminator 3: Rise of the Machines |date=2003 |publisher=Macmillan |isbn=9780765347411 |url=https://books.google.com/books?id=DU-hnoszGTMC |access-date=September 6, 2019}}&lt;/ref&gt;&lt;ref&gt;{{cite video game |title=[[Terminator 3: The Redemption]] |developer=Paradigm Entertainment |publisher=Atari |date=2004}}&lt;/ref&gt;&lt;ref&gt;{{cite book |last=Cox |first=Greg |title=Terminator Salvation: Cold War |date=2010 |publisher=Titan Books |isbn=9781848569348 |url=https://books.google.com/books?id=JGY4CgAAQBAJ&amp;pg=PT229 |access-date=September 6, 2019}}&lt;/ref&gt; In ''[[Terminator: The Sarah Connor Chronicles]]'' (2008–2009), the attack on Cyberdyne Systems in the second film delayed Judgment Day to April 21, 2011. In ''[[Terminator Genisys]]'' (2015), the fifth film in the franchise, Judgment Day was postponed to an unspecified day in October 2017, attributed to altered events in both the future and the past. Sarah and [[Kyle Reese]] travel through time to the year 2017 and seemingly defeat Skynet, but the system core, contained inside a subterranean blast shelter, survives unknown to them, thus further delaying, rather than preventing, Judgment Day. In ''[[Terminator: Dark Fate]]'' (2019), the direct sequel to ''Terminator 2: Judgment Day'', a date is not given for the new Judgment Day though it is named as such by Grace. 
As Grace is a ten–year old in 2020 and shown as a teenager in the post-Judgment Day world in flash-forwards throughout the movie, Judgment Day occurs sometime in the early 2020s in this timeline.<br /> <br /> ==Franchise rights==<br /> Before the first film was created, director [[James Cameron]] sold the rights for $1 to [[Gale Anne Hurd]], who produced the film.&lt;ref name=Rocky&gt;{{cite news |last=Abramowitz |first=Rachel |title=Rage Against the Machines: 'T3's' Rocky Road |url=https://www.latimes.com/archives/la-xpm-2002-mar-11-et-abram11-story.html |access-date=May 25, 2020 |work=Los Angeles Times |date=March 11, 2002}}&lt;/ref&gt; [[Hemdale Film Corporation]] also became a 50-percent owner of the franchise rights, until its share was sold in 1990 to [[Carolco Pictures]], a company founded by [[Andrew G. Vajna]] and [[Mario Kassar]]. ''Terminator 2: Judgment Day'' was released a year later.&lt;ref&gt;{{Cite journal|last=Shapiro|first=Marc|date=August 1991|title=Heart of Steel|url=https://archive.org/details/starlog_magazine-169|journal=[[Starlog]]|issue=169|pages=[https://archive.org/details/starlog_magazine-169/page/n26 27]–32 |access-date=May 25, 2020}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last=Jaafar |first=Ali |title=Deadline Disruptors: King Of Cannes Mario Kassar On The Glory Days Of Carolco, Why Buying Arnie A Plane Made Sense &amp; Talking Vaginas |url=https://deadline.com/2016/05/carolco-pictures-mario-kassar-cannes-interview-foxtrot-six-audition-1201752739/ |work=Deadline |date=May 12, 2016 |access-date=May 25, 2020}}&lt;/ref&gt; Carolco filed for bankruptcy in 1995 and the rights to future ''Terminator'' films were ultimately put up for auction. By that time, Cameron had become interested in making a ''Terminator 3'' film.&lt;ref name=Eller&gt;{{cite web |last=Eller |first=Claudia |title=Big Problemo in Bid for 'Terminator 3' |url=https://www.latimes.com/archives/la-xpm-1997-sep-23-fi-35255-story.html |work=Los Angeles Times |date=September 23, 1997 |access-date=May 25, 2020}}&lt;/ref&gt;&lt;ref name=Petrikin&gt;{{cite web|url=https://variety.com/1997/film/news/fox-cameron-opting-out-of-terminator-3-111660993/|title=Fox, Cameron opting out of 'Terminator 3'|author=Chris Petrikin|work=Variety|date=October 6, 1997|access-date=May 25, 2020}}&lt;/ref&gt; However, the rights were ultimately auctioned to Vajna in 1997, for $8 million.&lt;ref name=Redux&gt;{{cite web|url=https://variety.com/1998/biz/news/kassar-vajna-redux-1117469201/|title=Kassar &amp; Vajna redux|author=Chris Petrikin, Benedict Carver|work=Variety|date=March 26, 1998 |access-date=May 25, 2020}}&lt;/ref&gt; Vajna and Kassar spent another $8 million to purchase Hurd's half of the rights in 1998, becoming the full owners of the franchise.&lt;ref name=Redux/&gt;&lt;ref name=Chron&gt;{{cite web |last=Lawson |first=Terry |title='T3' was almost the big movie that couldn't get made |url=https://www.chron.com/entertainment/movies/article/T3-was-almost-the-big-movie-that-couldn-t-get-2125230.php |work=Knight Ridder Newspapers |date=July 1, 2003 |access-date=May 25, 2020}}&lt;/ref&gt; Hurd was initially opposed to the sale of the rights, while Cameron had lost interest in the franchise and a third film.&lt;ref&gt;{{cite web |last1=Weiner |first1=Rex |last2=Petrikin |first2=Chris |title=Hurd will fight sale of 'Terminator 3' rights |url=https://variety.com/1997/film/news/hurd-will-fight-sale-of-terminator-3-rights-111662485/ |website=Variety |access-date=May 25, 2020 |date=October 7, 1997}}&lt;/ref&gt;<br /> <br /> After the 
2003 release of ''Terminator 3: Rise of the Machines'', the franchise rights were sold in 2007 for about $25 million to [[The Halcyon Company]],&lt;ref&gt;{{cite web |last=Fleming |first=Michael |title=More 'Terminator' on the way |url=https://variety.com/2007/film/news/more-terminator-on-the-way-1117964592/ |website=Variety |access-date=May 25, 2020 |date=May 10, 2007}}&lt;/ref&gt;&lt;ref name=Reuters/&gt; which produced ''Terminator Salvation'' in 2009. Later that year, the company faced legal issues and filed for bankruptcy, putting the franchise rights up for sale. The rights were valued at about $70 million.&lt;ref&gt;{{cite news|last=Fritz |first=Ben |url=http://latimesblogs.latimes.com/entertainmentnewsbuzz/2009/09/terminator-rights-may-change-hands-again.html |title='Terminator' rights may change hands again |work=Los Angeles Times |date=September 28, 2009 |access-date=May 25, 2020}}&lt;/ref&gt;&lt;ref&gt;{{cite news|url=https://variety.com/2009/film/news/halcyon-puts-terminator-on-block-1118010756/ |title=Halcyon puts 'Terminator' on block |work=[[Variety (magazine)|Variety]] |date=November 2, 2009 |access-date=May 25, 2020 |first=Dave |last=McNary |author-link=Dave McNary}}&lt;/ref&gt; In 2010, the rights were sold for $29.5 million to Pacificor, a [[hedge fund]] that was Halcyon's largest [[creditor]].&lt;ref&gt;{{cite web |last=Finke |first=Nikki |title='Terminator' Rights Sell for $29.5 Mil |url=https://www.deadline.com/hollywood/terminator-rights-sell-for-29-5-mil |website=Deadline |access-date=May 25, 2020 |date=February 8, 2010}}&lt;/ref&gt;&lt;ref name=Reuters&gt;{{cite web |title=Pacificor hires agency to sell &quot;Terminator&quot; rights |url=https://www.reuters.com/article/us-terminator/pacificor-hires-agency-to-sell-terminator-rights-idUSTRE64Q58I20100527 |website=Reuters |access-date=May 25, 2020 |date=May 27, 2010}}&lt;/ref&gt; In 2012, the rights were sold to [[Megan Ellison]] for less than $20 million, a lower price than what was previously offered. The low price was due to the possibility of Cameron regaining the rights in 2019, as a result of new North American copyright laws.&lt;ref name=Collura/&gt;&lt;ref&gt;{{cite web |last=Fischer |first=Russ |title=James Cameron Regains Terminator Rights in 2019 |url=https://www.slashfilm.com/james-cameron-regains-terminator-rights-in-2019/ |website=/Film |access-date=May 25, 2020 |date=June 14, 2013}}&lt;/ref&gt; [[David Ellison]] and [[Skydance Productions]] produced ''Terminator Genisys'' in 2015.&lt;ref name=Collura&gt;{{cite web |last=Collura |first=Scott |title=New Terminator Films (Finally) Coming |url=https://www.ign.com/articles/2012/12/04/new-terminator-films-finally-coming |website=IGN |access-date=May 25, 2020 |date=December 4, 2012}}&lt;/ref&gt;<br /> <br /> Cameron worked together with David Ellison to produce the 2019 film ''Terminator: Dark Fate''.&lt;ref&gt;{{cite news |last=Sblendorio |first=Peter |title=James Cameron tells Daily News why he returned for 'Terminator: Dark Fate' and teamed up again with Arnold Schwarzenegger and Linda Hamilton |url=https://www.nydailynews.com/entertainment/movies/ny-james-cameron-terminator-dark-fate-20191028-nagw7pmg45ev7oxqheqv6ap42q-story.html |access-date=May 25, 2020 |work=New York Daily News |date=October 28, 2019}}&lt;/ref&gt; As the film neared its release, Hurd filed to terminate a copyright grant made 35 years earlier. 
Under this move, Hurd would again become a 50-percent owner of the rights with Cameron and Skydance could lose the rights to make any additional ''Terminator'' films beginning in November 2020, unless a new deal is worked out. Skydance responded that it had a deal in place with Cameron and that it &quot;controls the rights to the ''Terminator'' franchise for the foreseeable future&quot;.&lt;ref&gt;{{cite news |last=Gardner |first=Eriq |title=Real-Life Terminator: Major Studios Face Sweeping Loss of Iconic '80s Film Franchise Rights |url= https://www.hollywoodreporter.com/thr-esq/real-life-terminator-major-studios-face-sweeping-loss-iconic-80s-film-franchise-rights-1244737 |access-date=May 25, 2020 |work=[[The Hollywood Reporter]] |date=October 2, 2019 |archive-url= https://web.archive.org/web/20191012015233/https://www.hollywoodreporter.com/thr-esq/real-life-terminator-major-studios-face-sweeping-loss-iconic-80s-film-franchise-rights-1244737 |archive-date=October 12, 2019 |url-status=live}}&lt;/ref&gt;<br /> <br /> ==Films==<br /> {| class=&quot;wikitable plainrowheaders&quot; style=&quot;text-align:center;&quot; <br /> ! Film<br /> ! U.S. release date<br /> ! Director(s)<br /> ! Screenwriter(s)<br /> ! Story by<br /> ! Producer(s)<br /> |-<br /> |-<br /> ! scope=&quot;row&quot; | ''[[The Terminator]]''<br /> | style=&quot;text-align:left&quot; | {{Start date|1984|10|26}}<br /> | rowspan=&quot;2&quot; | [[James Cameron]]<br /> | colspan=&quot;2&quot; | James Cameron &amp; [[Gale Anne Hurd]]<br /> | Gale Anne Hurd<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator 2: Judgment Day]]''<br /> | style=&quot;text-align:left&quot; | {{Start date|1991|7|3}}<br /> | colspan=&quot;2&quot; | James Cameron &amp; [[William Wisher Jr.]]<br /> | James Cameron<br /> |-<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator 3: Rise of the Machines]]''<br /> | style=&quot;text-align:left&quot; | {{Start date|2003|7|2}}<br /> | [[Jonathan Mostow]]<br /> | [[John Brancato and Michael Ferris|John Brancato &amp; Michael Ferris]]<br /> | John Brancato, Michael Ferris, [[Tedi Sarafian]]<br /> | [[Colin Wilson (film producer)|Colin Wilson]], [[Mario Kassar]], [[Hal Lieberman]], [[Andrew G. Vajna]] &amp; [[Joel B. Michaels]]<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator Salvation]]''<br /> | style=&quot;text-align:left&quot; | {{Start date|2009|5|21}}<br /> | [[McG]]<br /> | colspan=&quot;2&quot; | John Brancato &amp; Michael Ferris<br /> | [[Moritz Borman]], Derek Anderson, Victor Kubicek &amp; Jeffrey Silver<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator Genisys]]''<br /> | style=&quot;text-align:left&quot; | {{Start date|2015|7|1}}<br /> | [[Alan Taylor (director)|Alan Taylor]]<br /> | colspan=&quot;2&quot; | [[Patrick Lussier]] &amp; [[Laeta Kalogridis]]<br /> | Dana Goldberg &amp; [[David Ellison]]<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator: Dark Fate]]''<br /> | style=&quot;text-align:left&quot; | {{Start date|2019|11|1}}<br /> | [[Tim Miller (director)|Tim Miller]]<br /> | [[Billy Ray (screenwriter)|Billy Ray]], [[David S. Goyer|David Goyer]] &amp; Justin Rhodes<br /> | David Goyer, [[Josh Friedman]], James Cameron, Justin Rhodes &amp; [[Charles H. 
Eglee|Charles Eglee]]<br /> | David Ellison &amp; James Cameron<br /> |}<br /> <br /> ===''The Terminator'' (1984)===<br /> {{Main|The Terminator}}<br /> ''The Terminator'' is a 1984 [[science fiction action film]] released by [[Orion Pictures]], co-written and directed by [[James Cameron]] and starring [[Arnold Schwarzenegger]], [[Linda Hamilton]] and [[Michael Biehn]]. It is the first work in the ''Terminator'' franchise. In the film, robots directed by the artificial intelligence Skynet take over the world in the near future. With the sole mission of completely annihilating humanity, Skynet develops [[android (robot)|android]] assassins called Terminators that outwardly appear human. A man named [[John Connor]] starts the Tech-Com resistance to fight the machines, defeat Skynet and free humanity. With a human victory imminent, the machines' only choice is to send a Terminator back in time to kill John's mother, [[Sarah Connor (Terminator)|Sarah Connor]], and prevent the boy's birth, thereby stopping the resistance from being founded in the first place. With the fate of humanity at stake, John sends soldier [[Kyle Reese]] back to protect Sarah Connor and thus ensure his own existence. It was released on October 26, 1984, and grossed $78.4 million worldwide.<br /> <br /> ===''Terminator 2: Judgment Day'' (1991)===<br /> &lt;!-- Please use the full name here as the section title because a) it's correct, and b) to distinguish it from the section above on the concept of Judgment Day within the franchise. --&gt;<br /> {{Main|Terminator 2: Judgment Day}}<br /> ''Terminator 2: Judgment Day'' is the 1991 sequel to the original ''Terminator'' film and was released by [[TriStar Pictures]]. It was co-written, directed and produced by James Cameron and stars [[Arnold Schwarzenegger]], [[Linda Hamilton]], [[Edward Furlong]], [[Robert Patrick]] and [[Joe Morton]]. After robots fail to prevent John Connor from being born, they try again in 1995, this time attempting to terminate him as a child by using a more advanced Terminator, the [[T-1000]]. As before, John sends back a protector for his younger self, a reprogrammed Terminator, who is a doppelgänger of [[Terminator (character)|the one from 1984]]. After years of preparing for the future war, Sarah decides to use the same tactics the robots used on her: preventing Skynet from being invented by destroying Cyberdyne Systems before the company can create it. It was released on July 3, 1991, to critical acclaim and grossed $523.7 million worldwide. It also won several [[Academy Awards]], most notably for its then-cutting-edge computer animation. The film was remastered for 3D and re-released in August 2017.<br /> <br /> ===''Terminator 3: Rise of the Machines'' (2003)===<br /> {{Main|Terminator 3: Rise of the Machines}}<br /> ''Terminator 3: Rise of the Machines'', released by [[Warner Bros. Pictures]] in North America and [[Columbia TriStar Film Distributors]] internationally, is the 2003 sequel to ''Terminator 2''. It was written by [[John Brancato and Michael Ferris|John Brancato]] and [[John Brancato and Michael Ferris|Michael Ferris]], directed by [[Jonathan Mostow]], and stars Arnold Schwarzenegger, [[Nick Stahl]], [[Claire Danes]] and [[Kristanna Loken]]. As a result of the destruction of Cyberdyne, the Skynet takeover has been postponed, not averted.
In an attempt to ensure a victory by the robots, a new Terminator, the [[T-X]], is sent back to terminate the lives of as many of John Connor's future lieutenants as is possible, including his future wife Kate Brewster and also John himself. Kate's father, General Robert Brewster ([[David Andrews (actor)|David Andrews]]), who is supervising Skynet's development, is also targeted for termination by the T-X. After Connor's future self is terminated by a doppelgänger of his previous protector, Kate reprograms him and sends him back to save them both from the T-X. It was released on July 2, 2003 to generally favorable reviews and grossed $433.4 million worldwide.<br /> <br /> ===''Terminator Salvation'' (2009)===<br /> {{Main|Terminator Salvation}}<br /> ''Terminator Salvation'' is the fourth installment of the ''Terminator'' film series and was made by [[The Halcyon Company]] and distributed by Warner Bros. Pictures and [[Columbia Pictures]]. It was released on May 21, 2009 to mixed reviews and grossed $371.4 million. It was written by John Brancato and Michael Ferris, directed by [[McG]],&lt;ref&gt;{{cite news |first=Borys |last=Kit |url=https://www.reuters.com/article/filmNews/idUSN0133618320071202 |title=Bale to segue from 'Dark Knight' to 'Terminator' |work=[[Reuters]] |date=April 14, 2008}}&lt;/ref&gt; and stars [[Christian Bale]] as John Connor and [[Sam Worthington]] (who was personally recommended by James Cameron&lt;ref name=&quot;Variety-Worthington&quot;&gt;{{cite news |first2=Diane |last2=Garrett |first1=Michael |last1=Fleming |title=Worthington to star in 'Terminator' |url=https://variety.com/2008/film/features/worthington-to-star-in-terminator-1117980831/amp/ |work=Variety |date=February 12, 2008 |access-date=November 23, 2021|quote=Worthington will play the role of Marcus, a central figure in a three-picture arc that begins after Skynet has destroyed much of humanity...}}&lt;/ref&gt;) as Marcus Wright.&lt;ref&gt;{{cite news |first=Gina|last=Serpe|title=Bale Goes Batty For Terminator 4|url=http://www.eonline.com/news/article/index.jsp?uuid=9864803c-63b6-42a6-b26f-4c2b3d101a6b|work=[[E! News]]|date=December 2, 2007|access-date=April 14, 2008}}&lt;/ref&gt; Following the events of ''Terminator 3: Rise of the Machines'', after Skynet has destroyed much of humanity in a nuclear holocaust, John struggles to become the leader of humanity as he is destined, while Marcus Wright finds his place in an unfamiliar post-apocalyptic world. In this future, altered by the events of the second film, the [[T-800]] Terminators ([[Roland Kickinger]] with [[Computer-generated imagery|CG]]-rendered facial likeness of Arnold Schwarzenegger&lt;!-- Schwarzenegger did not act in the film due to his duties as Governor of California; a likeness of his face was used. --&gt;&lt;ref name=&quot;Digital&quot;&gt;{{cite news|author=Michael Fleming|title=Digital Governator set for 'Terminator'|work=[[Variety (magazine)|Variety]]|date=April 22, 2009|url=https://variety.com/2009/film/markets-festivals/digital-governator-set-for-terminator-1118002743/amp/|access-date=November 23, 2021}}&lt;/ref&gt;) are coming online sooner than expected. 
The film also stars [[Anton Yelchin]] as Kyle Reese,&lt;ref name=&quot;haggis&quot;&gt;{{cite news|first=Gregg|last=Goldstein|title=Yelchin finds 'Salvation'|work=[[The Hollywood Reporter]]|date=March 19, 2008|url=https://www.hollywoodreporter.com/hr/content_display/film/news/e3ifff588c2bae9eaff4982de057e9344ff|access-date=April 14, 2008 |archive-url = https://web.archive.org/web/20080419002212/http://www.hollywoodreporter.com/hr/content_display/film/news/e3ifff588c2bae9eaff4982de057e9344ff &lt;!-- Bot retrieved archive --&gt; |archive-date = April 19, 2008}}&lt;/ref&gt; [[Bryce Dallas Howard]], [[Moon Bloodgood]], [[Common (rapper)|Common]], [[Michael Ironside]] and [[Helena Bonham Carter]].<br /> <br /> ===''Terminator Genisys'' (2015)===<br /> {{Main|Terminator Genisys}}<br /> &lt;!-- This section should only contain a summary of the film's production and plot as the other films do. --&gt;<br /> ''Terminator Genisys'' is the fifth installment of the franchise and also serves as a [[reboot (fiction)|reboot]] that features the main characters from the first two films created by [[James Cameron]], [[Gale Anne Hurd]] and [[William Wisher, Jr.]], portrayed by a new cast, with the exception of Arnold Schwarzenegger reprising his role as the eponymous character. Additionally, [[J. K. Simmons]] joined the cast as Detective O'Brien, serving as an ally for the film's protagonists. The film was written by [[Laeta Kalogridis]] and [[Patrick Lussier]] and directed by [[Alan Taylor (director)|Alan Taylor]]. It was made by [[Skydance Productions]] and distributed by [[Paramount Pictures]]. The story takes place in an alternate reality resulting from a chain of events related to Skynet's ([[Matt Smith (actor)|Matt Smith]]) actions throughout a previous timeline. Prior to this alteration, on the verge of winning the war against Skynet, John Connor ([[Jason Clarke (actor)|Jason Clarke]]) sends his trusted right-hand officer Kyle Reese ([[Jai Courtney]]) back through time to save his mother's life and ensure his own existence. However, Kyle arrives at an alternate timeline where Skynet had never launched its initial attack in 1997 and Sarah Connor ([[Emilia Clarke]]) was brought up by a reprogrammed Terminator (Schwarzenegger), sent by an unknown party to be her guardian ever since childhood. Now Sarah, Kyle and the Guardian need to escape the T-800 Model 101 (Brett Azar with CG-rendered likeness of Schwarzenegger from the first film), the T-1000 ([[Lee Byung-hun]]) and Skynet's [[T-3000]], in an attempt to stop Judgment Day from ever happening; while trying to uncover the secrets behind Cyberdyne Systems' new [[application software]]: ''Genisys''. Assisting the trio is Detective O'Brien (Simmons), whose investigation into time travelers (especially Terminators) leads him to learn about Skynet and helps the protagonists in their mission to avert Judgment Day. The film was released in the U.S. on July 1, 2015 and grossed $440.6 million worldwide. Its commercial performance was lower than anticipated, resulting in two planned sequels and a spin-off television series being cancelled in favor of ''Terminator: Dark Fate'' (2019).<br /> <br /> ===''Terminator: Dark Fate'' (2019)===<br /> {{Main|Terminator: Dark Fate}}<br /> ''Terminator: Dark Fate'' is the sixth installment of the franchise and a direct sequel to ''Terminator 2: Judgment Day''. It is directed by [[Tim Miller (director)|Tim Miller]] and was released in the U.S. on November 1, 2019. 
It stars Linda Hamilton and Arnold Schwarzenegger, reprising their roles as Sarah Connor and the Terminator, respectively.&lt;ref&gt;{{cite web|url=https://www.hollywoodreporter.com/heat-vision/linda-hamilton-set-return-terminator-franchise-1040948 |title=Linda Hamilton Set to Return to 'Terminator' Franchise (Exclusive) |work=The Hollywood Reporter |date=2017-09-19 |access-date=2017-10-03}}&lt;/ref&gt; The film also stars [[Mackenzie Davis]], [[Natalia Reyes]] and [[Gabriel Luna]].&lt;ref&gt;{{cite web |title=Terminator 6 Gets Blade Runner 2049 Star Mackenzie Davis |url=https://movieweb.com/terminator-6-cast-mackenzie-davis/ |date=March 8, 2018 |website=MovieWeb |access-date=April 13, 2018}}&lt;/ref&gt;&lt;ref&gt;{{cite web |url=https://deadline.com/2018/04/the-terminator-gabriel-luna-tim-miller-james-cameron-skydance-paramount-1202364049/|title=Gabriel Luna is New Terminator, Natalia Rayes &amp; Diego Boneta Set To Star Tim Miller-Jim Cameron Reboot |website=Deadline Hollywood |date=April 13, 2018 |access-date=April 13, 2018}}&lt;/ref&gt; Jude Collie and Brett Azar were also cast as a young John Connor and a younger T-800, respectively.&lt;ref&gt;{{cite web |url=https://bloody-disgusting.com/movie/3502421/photo-shows-return-young-john-connor-terminator-will-take-us-back-90s/ |title=Photo Shows Return of Young John Connor In 'Terminator', Which Will Take Us Back to the '90s! |website=Bloody Disgusting |date=June 5, 2018 |access-date=June 6, 2018}}&lt;/ref&gt;<br /> <br /> The previous film, ''Terminator Genisys'', had been intended as the first in [[#Terminator Genisys trilogy|a new stand-alone film trilogy]], but the planned sequels were canceled following the film's disappointing box-office performance. The producer of that film, David Ellison, recruited James Cameron to produce a new film with him, which would become ''Terminator: Dark Fate''.&lt;ref name=Fox&gt;{{cite news |last=Haas |first=Mariah |title='Terminator: Dark Fate' to be rated R |url=https://www.foxnews.com/entertainment/terminator-dark-fate-rated-r |work=Fox News |agency=Associated Press |date=July 20, 2019 |access-date=July 24, 2019}}&lt;/ref&gt;&lt;ref&gt;{{cite magazine|url=https://deadline.com/2017/01/terminator-james-cameron-deadpool-tim-miller-david-ellison-skydance-1201890848/|title=He's Back! James Cameron To Godfather 'Terminator' With 'Deadpool' Helmer Tim Miller|magazine=Deadline|author=Mike Fleming Jr|date=January 20, 2017}}&lt;/ref&gt;&lt;ref&gt;{{cite web|url=https://www.cinemablend.com/news/1638270/whats-actually-happening-with-the-terminator-franchise-according-to-the-producer|title=What's Actually Happening With The Terminator Franchise, According To The Producer|publisher=Cinema Blend|date=21 March 2017}}&lt;/ref&gt;&lt;ref&gt;{{cite web|author=Alex Leadbeater |url=https://screenrant.com/terminator-6-arnold-schwarzenegger-james-cameron/ |title=Terminator 6: Schwarzenegger Says He's Back |website=Screen Rant |date=2017-05-20 |access-date=2017-10-03}}&lt;/ref&gt;<br /> <br /> In the film, the machines send a Terminator, [[Rev-9]] (Luna), back in time to eliminate Dani Ramos (Reyes), whose destiny is linked to the Human Resistance's war against them. The Resistance sends one of their soldiers Grace (Davis) back to protect her and a chain of events lead Grace and Dani to join forces with Sarah Connor and the T-800.<br /> <br /> The writers' room included [[Josh Friedman]], creator of the television series ''[[Terminator: The Sarah Connor Chronicles]]''. Other writers included [[David S. 
Goyer]], Justin Rhodes and [[Billy Ray (screenwriter)|Billy Ray]].&lt;ref&gt;{{cite magazine|last1=Kroll|first1=Justin|last2=Lang|first2=Brent|url=https://variety.com/2017/film/news/terminator-new-movie-writer-billy-ray-1202617746/|title=''New Terminator Film'' Writers Room Adds Billy Ray To Polish The Script|date=November 17, 2017|magazine=Variety|access-date=November 18, 2017}}&lt;/ref&gt; The creative team stated that the new film would feature a young 18- to 21-year-old, who could potentially lead the franchise should the first film be successful. Miller made mention of creating a theme park attraction akin to ''[[T2 3-D: Battle Across Time]]'' should the film prove successful.&lt;ref&gt;{{cite web|url=https://www.theterminatorfans.com/terminator-6-writers-room/ |title=Terminator 6 Writers Room |publisher=TheTerminatorFans.com |date=2017-09-20 |access-date=2017-10-03}}&lt;/ref&gt; Because the series deals with time-travel, the film ignores the premise of the last three films and the TV series and is not titled ''Terminator 6'', as it is also a direct sequel to ''Terminator 2: Judgment Day''.&lt;ref&gt;{{cite web|url=https://www.theterminatorfans.com/exclusive-schwarzenegger-talks-terminator-6/ |title=EXCLUSIVE: SCHWARZENEGGER Talks TERMINATOR 6 |publisher=TheTerminatorFans.com |date=2017-09-21 |access-date=2017-10-03}}&lt;/ref&gt; Filming began in [[Isleta del Moro]], [[Almería]]&lt;ref&gt;{{cite news |last=Martínez |first=D. |title='Terminator' ya se rueda en la playa del Rinconcillo en la Isleta del Moro |url=http://www.diariodealmeria.es/ocio/Terminator-rueda-Rinconcillo-Isleta-Moro_0_1249975105.html |date=30 May 2018 |access-date=30 May 2018 |newspaper=[[Diario de Almería]] |language=es |publisher=[[Grupo Joly]]}}&lt;/ref&gt;&lt;ref&gt;{{cite news |last=Martínez |first=D. |title=La Isleta acoge unos decorados para el rodaje de la película 'Terminator' |url=http://www.diariodealmeria.es/ocio/Isleta-decorados-rodaje-pelicula-Terminator_0_1246075952.html |date=17 May 2018 |access-date=30 May 2018 |newspaper=[[Diario de Almería]] |language=es |publisher=[[Grupo Joly]]}}&lt;/ref&gt; on June 4, 2018, shooting for a month there, before shooting the rest in the United States.<br /> <br /> This film was intended as the first in [[#Terminator: Dark Fate trilogy|a new trilogy of ''Terminator'' films]],&lt;ref&gt;{{cite web|last=Libbey |first=Dirk |url=https://www.cinemablend.com/news/1685210/where-the-terminator-franchise-is-going-next-according-to-james-cameron |title=Where The Terminator Franchise Is Going Next, According To James Cameron |publisher=Cinemablend.com |date=2017-07-26 |access-date=2017-10-03}}&lt;/ref&gt; but these plans were canceled due to very mixed audience reactions and the film's underperforming box office record.&lt;ref&gt;{{cite web |url=https://www.theguardian.com/film/2019/nov/05/darkest-fate-how-the-terminator-franchise-was-finally-terminated |title=Darkest fate: how the Terminator franchise was finally terminated |work=The Guardian |last=Lee |first=Benjamin |date=5 November 2019 |access-date=21 July 2021}}&lt;/ref&gt;&lt;ref name=Ice/&gt;<br /> <br /> ==Television==<br /> {| class=&quot;wikitable plainrowheaders&quot; style=&quot;text-align:center&quot;<br /> |-<br /> ! scope=&quot;col&quot; | Series<br /> ! scope=&quot;col&quot; | Season<br /> ! scope=&quot;col&quot; | Episodes<br /> ! scope=&quot;col&quot; | First released<br /> ! scope=&quot;col&quot; | Last released<br /> ! scope=&quot;col&quot; | Showrunner(s)<br /> ! 
scope=&quot;col&quot; | Network(s)<br /> |-<br /> ! scope=&quot;row&quot; rowspan=&quot;2&quot; | ''[[Terminator: The Sarah Connor Chronicles]]''<br /> | 1<br /> | 9<br /> | {{Start date|2008|1|13}}<br /> | {{End date|2008|3|3}}<br /> | rowspan=&quot;2&quot;| [[Josh Friedman]]<br /> | rowspan=&quot;2&quot;| [[Fox Broadcasting Company|Fox]]<br /> |-<br /> | 2<br /> | 22<br /> | {{Start date|2008|9|8}}<br /> | {{End date|2009|4|10}}<br /> |-<br /> |}<br /> <br /> ===''Terminator: The Sarah Connor Chronicles'' (2008–2009)===<br /> {{Main|Terminator: The Sarah Connor Chronicles}}<br /> ''Terminator: The Sarah Connor Chronicles'' follows Sarah ([[Lena Headey]]) and John Connor ([[Thomas Dekker (actor)|Thomas Dekker]]) as they try to &quot;live under the radar&quot; after destroying Cyberdyne in ''Terminator 2''. [[Summer Glau]] plays a Terminator named [[Cameron (Terminator)|Cameron]] and [[Brian Austin Green]] plays Derek Reese, the brother of [[Kyle Reese#Terminator: The Sarah Connor Chronicles|Kyle Reese]], both sent back in time to protect the Connors and prevent another Judgment Day.<br /> <br /> ==Future==<br /> In February 2021, [[Netflix]] announced plans to develop a ''Terminator'' [[anime]] series with [[Skydance]], [[Mattson Tomlin]] and [[Production I.G]].&lt;ref&gt;{{cite web|title=Netflix Orders Terminator Anime Series by Production I.G|url=https://www.animenewsnetwork.com/news/2021-02-26/netflix-orders-terminator-anime-series-by-production-i.g/.170023|website=[[Anime News Network]]|last=Loo|first=Egan|date=February 26, 2021|access-date=February 26, 2021}}&lt;/ref&gt;<br /> <br /> In December 2022, while promoting ''[[Avatar: The Way of Water]]'', producer and director of the first two ''Terminator'' films James Cameron revealed that another series reboot was &quot;in discussion, but nothing has been decided&quot;. The reboot would likely feature an entirely new cast and reset the continuity of the entire film series. 
Cameron suggested that in hindsight, bringing back both Arnold Schwarzenegger and Linda Hamilton for ''Terminator: Dark Fate'' had been a mistake.&lt;ref&gt;{{cite web |last1=Sharma |first1=Ruchira |title=James Cameron says another Terminator film is 'in discussion' |url=https://www.gq-magazine.co.uk/culture/article/terminator-reboot-james-cameron |website=[[GQ]] |access-date=22 December 2022 |date=22 December 2022}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last1=Rowan |first1=Iona |title=James Cameron reveals discussions over new Terminator movie |url=https://www.digitalspy.com/movies/a42352527/james-cameron-new-terminator-movie/ |website=[[Digital Spy]] |access-date=28 December 2022 |date=28 December 2022}}&lt;/ref&gt; In May 2023, Schwarzenegger stated in an interview that he would not appear in any future franchise installments after the last few films were &quot;not well-written&quot;.&lt;ref&gt;{{cite web |last1=Bergeson |first1=Samantha |title=Arnold Schwarzenegger Is 'Done' with 'Terminator' Franchise After 'Dark Fate' Flop: 'Not Well-Written' |url=https://www.indiewire.com/news/breaking-news/arnold-schwarzenegger-done-with-the-terminator-franchise-1234863717/ |website=[[IndieWire]] |access-date=29 May 2023 |date=17 May 2023}}&lt;/ref&gt; Later that month, it was reported that Cameron was developing a script for a ''Terminator'' reboot.&lt;ref&gt;{{cite web |last1=Cavanaugh |first1=Patrick |title=Terminator: James Cameron Reportedly Started Writing Script for New Installment |url=https://comicbook.com/movies/news/terminator-new-movie-james-cameron-sequel-script-plans-ai/ |website=ComicBook.com |access-date=29 May 2023 |language=en |date=25 May 2023}}&lt;/ref&gt;<br /> <br /> ==Web series==<br /> {| class=&quot;wikitable plainrowheaders&quot; style=&quot;text-align:center&quot;<br /> |-<br /> ! scope=&quot;col&quot; | Series<br /> ! scope=&quot;col&quot; | Season<br /> ! scope=&quot;col&quot; | Episodes<br /> ! scope=&quot;col&quot; | First released<br /> ! scope=&quot;col&quot; | Last released<br /> ! scope=&quot;col&quot; | Showrunner(s)<br /> ! scope=&quot;col&quot; | Network(s)<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator Salvation: The Machinima Series]]''<br /> | 1<br /> | 6<br /> | {{Start date|2009|5|18}}<br /> | {{End date|2009|6|24}}<br /> | Andy Shapiro<br /> | [[Machinima Inc.|Machinima]]<br /> |-<br /> ! 
scope=&quot;row&quot; | ''Terminator Genisys: The YouTube Chronicles''<br /> | 1<br /> | 3<br /> | colspan=&quot;2&quot; | {{Start date|2015|6|22}}<br /> | [[Jay Bushman]]<br /> | [[YouTube]]<br /> |-<br /> |}<br /> <br /> ===''Terminator Salvation: The Machinima Series'' (2009)===<br /> {{Main|Terminator Salvation: The Machinima Series}}<br /> Set in 2016, years after Judgment Day, Blair Williams (voiced by [[Moon Bloodgood]]) is fighting the war against the machines in downtown Los Angeles, while tracking down the computer hacker named Laz Howard (voiced by [[Cam Clarke]]) and trying to persuade him to join sides with the resistance.<br /> <br /> ===''Terminator Genisys: The YouTube Chronicles'' (2015)===<br /> ''Terminator Genisys: The YouTube Chronicles'' was released in three parts on June 22, 2015 to promote the fifth film, produced by Heresy.&lt;ref&gt;{{cite news|title=YouTube Stars Debut 'Terminator' Web Series With Arnold Schwarzenegger |publisher= Tubefilter |last=Brouwer|first=Bree |date=June 22, 2015|url= https://www.tubefilter.com/2015/06/22/terminator-genisys-youtube-chronicles-arnold-schwarzenegger/ |access-date= June 5, 2019}}&lt;/ref&gt;&lt;ref&gt;{{cite news|title=YouTube Stars Participate In 'Terminator'-Themed Campaign With Arnold Schwarzenegger At YouTube Space LA |publisher= Tubefilter |last= Brouwer|first= Bree |date= April 10, 2015 |url= https://www.tubefilter.com/2015/04/10/youtube-space-la-terminator-campaign-olga-kay-superwoman/ |access-date= June 5, 2019}}&lt;/ref&gt; The web series was directed by Charles Paek and written by Jay Bushman. It features several popular [[List of YouTubers|YouTube stars]] appearing with Arnold Schwarzenegger as the T-800, as they stand together against the T-360 (played by fellow YouTube personality, [[Toby Turner]]).{{cn|date=July 2021}}<br /> <br /> ==Cast and crew==<br /> {{main list|List of Terminator (franchise) characters}}<br /> <br /> ===Principal cast===<br /> {{Cast indicator|appeared=the franchise|A|E|P|O|S|V|Y}}<br /> * {{Cast indicator/note|model|M|a model with the actor or actress's likeness served as a body double}}<br /> * {{Cast indicator/note|likeness|L|the actor or actress lent only their likeness for the film}}<br /> <br /> {| class=&quot;wikitable&quot; style=&quot;text-align:center;&quot;<br /> |-<br /> ! rowspan=&quot;4&quot; style=&quot;width:13.9%;&quot; | Characters<br /> ! colspan=&quot;6&quot; | Films<br /> ! Theme park attraction <br /> ! colspan=&quot;2&quot; | Television series<br /> |-<br /> ! rowspan=&quot;2&quot; style=&quot;width:9%;&quot; | ''[[The Terminator]]''<br /> ! rowspan=&quot;2&quot; style=&quot;width:9%;&quot; | ''[[Terminator 2: Judgment Day|Terminator 2:&lt;br&gt;Judgment Day]]''<br /> ! rowspan=&quot;2&quot; style=&quot;width:9%;&quot; | ''[[Terminator 3: Rise of the Machines|Terminator 3:&lt;br&gt;Rise of the Machines]]''<br /> ! rowspan=&quot;2&quot; style=&quot;width:9%;&quot; | ''[[Terminator Salvation]]''<br /> ! rowspan=&quot;2&quot; | ''[[Terminator Genisys]]''<br /> ! rowspan=&quot;2&quot; style=&quot;width:9%;&quot; | ''[[Terminator: Dark Fate|Terminator:&lt;br&gt;Dark Fate]]''<br /> ! rowspan=&quot;2&quot; style=&quot;width:9%;&quot; | ''[[T2-3D: Battle Across Time|T2-3D:&lt;br&gt;Battle Across Time]]''<br /> ! colspan=&quot;2&quot; | ''[[Terminator: The Sarah Connor Chronicles|Terminator:&lt;br&gt;The Sarah Connor Chronicles]]''<br /> |-<br /> ! [[List of Terminator: The Sarah Connor Chronicles episodes#Season 1 (2008)|Season 1]]<br /> ! 
[[List of Terminator: The Sarah Connor Chronicles episodes#Season 2 (2008–09)|Season 2]]<br /> |-<br /> ! 1984<br /> ! 1991<br /> ! 2003<br /> ! 2009<br /> ! 2015<br /> ! 2019<br /> ! 1996<br /> ! colspan=&quot;2&quot; | 2008–2009<br /> |-<br /> ! colspan=&quot;10&quot; style=&quot;background-color:silver;&quot; | '''Machines'''<br /> |-<br /> ! [[Terminator (character)|Terminator&lt;br&gt;{{small|T-800 Model 101}}]]{{efn|In the first three films, the characters portrayed by [[Arnold Schwarzenegger]] are each credited as '''Terminator'''. In ''[[Terminator 2: Judgment Day]]'', the character briefly uses the alias of '''Uncle Bob''' at the behest of [[John Connor]]. In ''[[Terminator 3: Rise of the Machines]]'', the character refers to itself as a '''T-101''' and is referred to in promotional materials as a '''T-850'''. In ''[[Terminator Genisys]]'', the character is referred to as '''Pops''' and credited as '''Guardian'''. In ''[[Terminator: Dark Fate]]'', the character goes by the name '''Carl'''.}}<br /> | colspan=&quot;3&quot; | [[Arnold Schwarzenegger]]<br /> | Arnold Schwarzenegger{{ref|likeness|L}}{{ref|special|S}}{{efn|[[Arnold Schwarzenegger]]'s facial likeness was utilized via [[computer-generated imagery|CGI]], applied to Kickinger's body performance. The CGI model was made from a mold of his face made in 1984, scanned to create the digital makeup.&lt;ref name=&quot;Digital&quot;/&gt;}}&lt;hr /&gt;[[Roland Kickinger]]{{ref|young|Y}}{{ref|model|M}}<br /> | colspan=&quot;2&quot; | Arnold Schwarzenegger&lt;hr /&gt;Brett Azar{{ref|young|Y}}{{ref|model|M}}<br /> | Arnold Schwarzenegger<br /> | {{N/A| ''[[Computer animation|CGI]] [[endoskeleton]] only''}}<br /> | {{cEmpty}}<br /> |-<br /> ! [[T-1000]]<br /> | {{cEmpty}}<br /> | [[Robert Patrick]]<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | [[Lee Byung-hun]]<br /> | {{cEmpty}}<br /> | Robert Patrick<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> |-<br /> ! [[T-X]]<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | [[Kristanna Loken]]<br /> | colspan=&quot;6&quot; {{cEmpty}}<br /> |-<br /> ! [[Skynet (Terminator)|Skynet]]{{efn|In ''[[Terminator Salvation]]'', Skynet appears on a computer screen using the physical appearance of Dr. Serena Kogan (portrayed by [[Helena Bonham Carter]]). In ''[[Terminator Genisys]]'', Skynet makes a physical appearance under the disguise of a resistance soldier who is credited as '''Alex''' (portrayed by [[Matt Smith (actor)|Matt Smith]]). In the latter film, Skynet, now known as '''Genisys''', makes additional appearances as a holographic human male ranging in age from 10 to 18 years old, and aged again into another form also portrayed by Smith.}}<br /> | {{cEmpty}}<br /> | colspan=&quot;2&quot; {{N/A| ''No physical actor, Network Facility only''}}<br /> | [[Helena Bonham Carter]]<br /> | [[Matt Smith]]&lt;hr /&gt;Ian Etheridge{{ref|young|Y}}&lt;hr /&gt;Seth Meriwether{{ref|young|Y}}&lt;hr /&gt;Nolan Gross{{ref|young|Y}}<br /> | {{cEmpty}}<br /> | colspan=&quot;3&quot; {{N/A| ''No physical actor, Network Facility only''}}<br /> |-<br /> ! Marcus Wright&lt;br /&gt;{{small|Cyborg}}<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> | [[Sam Worthington]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> |-<br /> ! T-600<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> | [[Brian Steele]]<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> | colspan=&quot;2&quot; | Chris Gann<br /> |-<br /> ! 
[[T-3000]]<br /> | colspan=&quot;4&quot; {{cEmpty}}<br /> | [[Jason Clarke]]<br /> | colspan=&quot;4&quot; {{cEmpty}}<br /> |-<br /> ! Grace Harper&lt;ref&gt;{{cite web |title=Terminator Dark Fate Global Premiere - Part 1 |url=https://www.youtube.com/watch?v=FCbcB_ZLluI | archive-url=https://ghostarchive.org/varchive/youtube/20211030/FCbcB_ZLluI| archive-date=2021-10-30|website=YouTube |access-date=February 2, 2020 |date=November 6, 2019 |at=11:00}}{{cbignore}}&lt;/ref&gt;&lt;br /&gt;{{small|Augment}}<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> | [[Mackenzie Davis]]&lt;hr /&gt;Stephanie Gil{{ref|young|Y}}<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> |-<br /> ! [[Rev-9]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> | [[Gabriel Luna]]<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> |-<br /> ! [[Cameron (Terminator)|Cameron&lt;br /&gt;{{small|T-900 TOK715 Terminator}}]]<br /> | colspan=&quot;7&quot; {{cEmpty}}<br /> | colspan=&quot;2&quot; | [[Summer Glau]]<br /> |-<br /> ! Cromartie / John Henry&lt;br /&gt;{{small|T-888}}<br /> | colspan=&quot;7&quot; {{cEmpty}}<br /> | [[Owain Yeoman]]&lt;hr&gt;[[Garret Dillahunt]]<br /> | Garret Dillahunt<br /> |-<br /> ! Catherine Weaver&lt;br /&gt;{{small|T-1001}}<br /> | colspan=&quot;8&quot; {{cEmpty}}<br /> | [[Shirley Manson]]<br /> |-<br /> ! colspan=&quot;10&quot; style=&quot;background-color:silver;&quot; | '''Humans'''<br /> |-<br /> ! [[Sarah Connor (Terminator)|Sarah Connor]]<br /> | [[Linda Hamilton]]<br /> | Linda Hamilton&lt;hr /&gt;[[Linda Hamilton#Early life|Leslie Hamilton]]{{ref|model|M}}<br /> | {{cEmpty}}<br /> | Linda Hamilton{{ref|voice|V}}<br /> | [[Emilia Clarke]]&lt;hr&gt;Willa Taylor{{ref|young|Y}}<br /> | Linda Hamilton&lt;hr&gt;[[Maddy Curley]]{{ref|young|Y}}{{ref|model|M}}<br /> | Linda Hamilton<br /> | colspan=&quot;2&quot; | [[Lena Headey]]<br /> |-<br /> ! [[Kyle Reese]]<br /> | [[Michael Biehn]]<br /> | Michael Biehn{{ref|extend|E}}{{efn|[[Michael Biehn]] reprised his role as Kyle Reese in a cameo scene in which he visits Sarah in a dream of hers. His scene was cut from the theatrical release,&lt;ref&gt;{{cite news|url=https://news.google.com/newspapers?id=fgoxAAAAIBAJ&amp;pg=5536,51262&amp;dq=michael+biehn+terminator+2&amp;hl=en|title=Biehn out of 'Terminator 2'|work=Reading Eagle|date=July 1, 1991|access-date=August 24, 2010}}&lt;/ref&gt; but was later restored when the film was re-released in 1993 and 1997 under the name ''[[Terminator 2: Judgment Day#Extended Edition|Terminator 2: Judgment Day – Special Edition]]''.}}<br /> | {{cEmpty}}<br /> | [[Anton Yelchin]]<br /> | [[Jai Courtney]]&lt;hr&gt;Bryant Prince{{ref|young|Y}}<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | colspan=&quot;2&quot; | [[Jonathan Jackson (actor)|Jonathan Jackson]]&lt;hr&gt;[[Skyler Gisondo]]{{ref|young|Y}}<br /> |-<br /> ! Dr. Peter Silberman<br /> | colspan=&quot;3&quot; | [[Earl Boen]]<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | Earl Boen{{ref|archive|A}}<br /> | {{cEmpty}}<br /> | [[Bruce Davison]]<br /> | {{cEmpty}}<br /> |-<br /> ! Lieutenant Ed Traxler<br /> | [[Paul Winfield]]<br /> | colspan=&quot;8&quot; {{cEmpty}}<br /> |-<br /> ! Vukovich<br /> | [[Lance Henriksen]]<br /> | colspan=&quot;8&quot; {{cEmpty}}<br /> |-<br /> ! 
[[John Connor]]<br /> | {{cEmpty}}<br /> | [[Edward Furlong]]&lt;hr /&gt;Dalton Abbott{{ref|young|Y}}&lt;hr /&gt;[[Michael Edwards (actor)|Michael Edwards]]{{ref|old|O}}<br /> | [[Nick Stahl]]<br /> | [[Christian Bale]]<br /> | [[Jason Clarke]]<br /> | Edward Furlong{{ref|likeness|L}}{{ref|special|S}}&lt;hr /&gt;Jude Collie{{ref|young|Y}}{{ref|model|M}}&lt;hr /&gt;Aaron Kunitz{{ref|voice|V}}<br /> | Edward Furlong<br /> | colspan=&quot;2&quot; | [[Thomas Dekker (actor)|Thomas Dekker]]&lt;hr /&gt;John DeVito{{ref|young|Y}}<br /> |-<br /> ! [[Miles Dyson]]<br /> | {{cEmpty}}<br /> | [[Joe Morton]]<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | [[Courtney B. Vance|Courtney Vance]]<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | [[Phil Morris (actor)|Phil Morris]]<br /> | {{cEmpty}}<br /> |-<br /> ! Danny Dyson<br /> | {{cEmpty}}<br /> | [[DeVaughn Nixon]]<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | [[Dayo Okeniyi]]<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | Shawn Prince<br /> | {{cEmpty}}<br /> |-<br /> ! Tarissa Dyson<br /> | {{cEmpty}}<br /> | [[S. Epatha Merkerson]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> | [[Charlayne Woodard]]<br /> | {{cEmpty}}<br /> |-<br /> ! Enrique Salceda<br /> | {{cEmpty}}<br /> | [[Castulo Guerra]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> | [[Tony Amendola]]<br /> | {{cEmpty}}<br /> |-<br /> ! Kate Connor&lt;br /&gt;(née Brewster)<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | [[Claire Danes]]<br /> | [[Bryce Dallas Howard]]<br /> | colspan=&quot;6&quot; {{cEmpty}}<br /> |-<br /> ! Robert Brewster<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | [[David Andrews (actor)|David Andrews]]<br /> | colspan=&quot;6&quot; {{cEmpty}}<br /> |-<br /> ! Scott Mason<br /> | colspan=&quot;2&quot; {{cEmpty}}<br /> | [[Mark Famiglietti]]<br /> | colspan=&quot;6&quot; {{cEmpty}}<br /> |-<br /> ! Blair Williams<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> | [[Moon Bloodgood]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> |-<br /> ! Lieutenant Barnes<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> | [[Common (rapper)|Common]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> |-<br /> ! General Ashdown<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> | [[Michael Ironside]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> |-<br /> ! Dr. Serena Kogan<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> | [[Helena Bonham Carter]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> |-<br /> ! Star<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> | [[Jadagrace Berry]]<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> |-<br /> ! Detective O'Brien<br /> | colspan=&quot;4&quot; {{cEmpty}}<br /> | [[J. K. Simmons]]&lt;hr /&gt;[[Wayne Bastrup]]{{ref|young|Y}}<br /> | colspan=&quot;4&quot; {{cEmpty}}<br /> |-<br /> ! Lieutenant Matias<br /> | colspan=&quot;4&quot; {{cEmpty}}<br /> | [[Michael Gladis]]<br /> | colspan=&quot;4&quot; {{cEmpty}}<br /> |-<br /> ! Detective Cheung<br /> | colspan=&quot;4&quot; {{cEmpty}}<br /> | [[Sandrine Holt]]<br /> | colspan=&quot;4&quot; {{cEmpty}}<br /> |-<br /> ! Dani Ramos<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> | [[Natalia Reyes]]<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> |-<br /> ! Diego Ramos<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> | [[Diego Boneta]]<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> |-<br /> ! Felipe Gandal<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> | [[Tristán Ulloa]]<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> |-<br /> ! 
Major Dean<br /> | colspan=&quot;5&quot; {{cEmpty}}<br /> | Fraser James<br /> | colspan=&quot;3&quot; {{cEmpty}}<br /> |-<br /> ! Derek Reese<br /> | colspan=&quot;7&quot; {{cEmpty}}<br /> | colspan=&quot;2&quot; | [[Brian Austin Green]]<br /> |-<br /> ! James Ellison<br /> | colspan=&quot;7&quot; {{cEmpty}}<br /> | colspan=&quot;2&quot; | [[Richard T. Jones]]<br /> |-<br /> ! Charley Dixon<br /> | colspan=&quot;7&quot; {{cEmpty}}<br /> | colspan=&quot;2&quot; | [[Dean Winters]]<br /> |-<br /> ! Allison Young<br /> | colspan=&quot;8&quot; {{cEmpty}}<br /> | [[Summer Glau]]<br /> |-<br /> ! Jesse Flores<br /> | colspan=&quot;8&quot; {{cEmpty}}<br /> | [[Stephanie Jacobsen]]<br /> |-<br /> ! Riley Dawson<br /> | colspan=&quot;8&quot; {{cEmpty}}<br /> | [[Leven Rambin]]<br /> |}<br /> <br /> ===Additional crew===<br /> {| class=&quot;wikitable sortable&quot; style=&quot;text-align:center&quot;<br /> |-<br /> ! rowspan=&quot;3&quot; | Crew<br /> ! colspan=&quot;6&quot; | Film<br /> |-<br /> ! width=15%| ''[[The Terminator]]''<br /> ! width=15%| ''[[Terminator 2: Judgment Day|Terminator 2:&lt;br&gt;Judgment Day]]''<br /> ! width=15%| ''[[Terminator 3: Rise of the Machines|Terminator 3:&lt;br&gt;Rise of the Machines]]''<br /> ! width=15%| ''[[Terminator Salvation]]''<br /> ! width=15%| ''[[Terminator Genisys]]''<br /> ! width=15%| ''[[Terminator: Dark Fate|Terminator:&lt;br&gt;Dark Fate]]''<br /> |-<br /> ! 1984<br /> ! 1991<br /> ! 2003<br /> ! 2009<br /> ! 2015<br /> ! 2019<br /> |-<br /> ! Executive Producer(s)<br /> | John Daly, Derek Gibson<br /> | Gale Ann Hurd, Mario F. Kassar<br /> | Moritz Borman, Guy East, Nigel Sinclair, Gale Ann Hurd<br /> | Peter D. Graves, Bahman Naraghi, Mario F. Kassar, [[Andrew G. Vajna]], Joel B. Michaels, [[Dan Lin]], Jeanne Allgood.<br /> | Bill Carraro, [[Robert Cort]], [[Megan Ellison]], Laeta Kalogridis, Patrick Lussier<br /> | Dana Goldberg, [[Don Granger]], Edward Cheng, [[Tim Miller (director)|Tim Miller]], John J. Kelly, [[Bonnie Curtis]], Julie Lynn<br /> |-<br /> ! Composer<br /> | colspan=&quot;2&quot; | [[Brad Fiedel]]<br /> | [[Marco Beltrami]]<br /> | [[Danny Elfman]]<br /> | [[Lorne Balfe]]<br /> | [[Tom Holkenborg]]<br /> |-<br /> ! Cinematography<br /> | colspan=&quot;2&quot; | [[Adam Greenberg (cinematographer)|Adam Greenberg]]<br /> | [[Don Burgess (cinematographer)|Don Burgess]]<br /> | [[Shane Hurlbut]]<br /> | [[Kramer Morgenthau]]<br /> | Ken Seng<br /> |-<br /> ! Editor<br /> | [[Mark Goldblatt]]<br /> | [[Conrad Buff IV]]&lt;br /&gt;Mark Goldblatt&lt;br /&gt;[[Richard A. Harris]]<br /> | Nicolas De Toth&lt;br /&gt;[[Neil Travis]]<br /> | [[Conrad Buff IV|Conrad Buff]]<br /> | [[Roger Barton (film editor)|Roger Barton]]<br /> | [[Julian Clarke]]<br /> |-<br /> ! Production companies<br /> | [[Hemdale]]&lt;br /&gt;[[Valhalla Entertainment|Pacific Western Productions]]&lt;br /&gt;Cinema '84<br /> | [[Carolco Pictures]]&lt;br /&gt;Pacific Western Productions&lt;br /&gt;[[Lightstorm Entertainment]]&lt;br /&gt;[[StudioCanal]]<br /> | [[Intermedia (production company)|Intermedia]]&lt;br /&gt;[[C2 Pictures]]<br /> | [[The Halcyon Company]]&lt;br /&gt;[[Wonderland Sound and Vision]]<br /> | [[Skydance Productions]]<br /> | [[Skydance Media]]&lt;br /&gt;[[20th Century Fox]]&lt;br /&gt;[[Tencent Pictures]]&lt;br /&gt;Lightstorm Entertainment&lt;br /&gt;[[TSG Entertainment]]<br /> |-<br /> ! Distributor<br /> | [[Orion Pictures]]<br /> | [[TriStar Pictures]]<br /> | [[Warner Bros. 
Pictures]]&lt;br /&gt;[[Columbia TriStar Film Distributors International]]<br /> | Warner Bros. Pictures&lt;br /&gt;[[Sony Pictures Releasing International]]<br /> | [[Paramount Pictures]]<br /> | Paramount Pictures&lt;br /&gt;20th Century Fox<br /> |}<br /> <br /> ==Reception==<br /> ===Box office performance===<br /> {| class=&quot;wikitable plainrowheaders&quot; style=&quot;text-align:center;&quot; <br /> |-<br /> ! rowspan=&quot;2&quot; | Film<br /> ! rowspan=&quot;2&quot; | U.S. release date<br /> ! colspan=&quot;3&quot; | Box office revenue<br /> ! colspan=&quot;2&quot; style=&quot;text=&quot;wrap&quot;| Box office ranking<br /> ! rowspan=&quot;2&quot; | Budget<br /> ! scope=&quot;col&quot; rowspan=&quot;2&quot; class=unsortable | {{nowrap|{{Abbr|Ref(s)|References}}}}<br /> |-<br /> ! North America<br /> ! International<br /> ! Worldwide<br /> ! North America<br /> ! Worldwide<br /> |-<br /> ! scope=&quot;row&quot; | ''[[The Terminator]]''<br /> | October 26, 1984<br /> | $38,371,200<br /> | $40,000,000<br /> | $78,371,200<br /> | #1,917<br /> | style=&quot;background:#d3d3d3;&quot;|<br /> | $6.4 million<br /> |&lt;ref&gt;{{cite web|url=https://boxofficemojo.com/movies/?id=terminator.htm|title=The Terminator (1984)|website=[[Box Office Mojo]]|access-date=September 5, 2012}}&lt;/ref&gt;<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator 2: Judgment Day]]''<br /> | July 3, 1991<br /> | $205,881,154<br /> | $312,106,698<br /> | $517,987,852<br /> | #152 (#106)&lt;sup&gt;'''(A)'''&lt;/sup&gt;<br /> | #136<br /> | $94–102 million<br /> |&lt;ref&gt;{{cite web|url=https://boxofficemojo.com/movies/?id=terminator2.htm|title=Terminator 2: Judgment Day (1991)|website=Box Office Mojo|access-date=September 5, 2012}}&lt;/ref&gt;<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator 3: Rise of the Machines]]''<br /> | July 2, 2003<br /> | $150,371,112<br /> | $283,000,000<br /> | $433,371,112<br /> | #288<br /> | #188<br /> | $170–$187.3 million<br /> |&lt;ref&gt;{{cite web |title=Terminator 3: Rise of the Machines (2003) - Financial Information |url=https://www.the-numbers.com/movie/Terminator-3-Rise-of-the-Machines#tab=summary |website=The Numbers}}&lt;/ref&gt;&lt;ref&gt;{{Cite web|url=https://slate.com/culture/2005/05/how-schwarzenegger-raked-in-the-bucks-on-terminator-3.html|title=How Schwarzenegger raked in the bucks on Terminator 3.|last=Epstein|first=Edward Jay|date=2005-05-09|website=Slate Magazine|language=en|access-date=2019-12-17}}&lt;/ref&gt;<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator Salvation]]''<br /> | May 21, 2009<br /> | $125,322,469<br /> | $246,030,532<br /> | $371,353,001<br /> | #418<br /> | #242<br /> | $200 million<br /> |&lt;ref&gt;{{cite web|url=https://boxofficemojo.com/movies/?id=terminatorsalvation.htm|title=Terminator Salvation (2009)|website=Box Office Mojo|access-date=September 5, 2012}}&lt;/ref&gt;<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator Genisys]]''<br /> | July 1, 2015<br /> | $89,760,956<br /> | $350,842,581<br /> | $440,603,537<br /> | #706<br /> | #186<br /> | $155–158 million<br /> |&lt;ref&gt;{{cite web|url=https://www.boxofficemojo.com/movies/?id=terminator2015.htm|title=Terminator: Genisys (2015)|website=Box Office Mojo|access-date=September 23, 2015}}&lt;/ref&gt;&lt;ref&gt;{{cite web|url=https://www.hollywoodreporter.com/news/summer-box-office-whats-behind-790318|title=Summer Box Office: What's Behind Warner Bros.' 
Risky Move to Release Nine Movies|author=Pamela McClintock|work=[[The Hollywood Reporter]]|publisher=([[Prometheus Global Media]])|date=April 25, 2015|quote= David Ellison's Skydance took the lead on the $170 million Terminator reboot|access-date= November 1, 2019}}&lt;/ref&gt;<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator: Dark Fate]]''<br /> | November 1, 2019<br /> | $62,253,077<br /> | $198,866,215<br /> | $261,119,292<br /> | #1,368<br /> | #602<br /> | $185–196 million<br /> |&lt;ref&gt;{{cite web|title=Terminator: Dark Fate (2019)|url=https://www.boxofficemojo.com/title/tt6450804/?ref_=bo_se_r_2|website=[[Box Office Mojo]]|access-date=December 1, 2019}}&lt;/ref&gt;<br /> |-<br /> ! colspan=&quot;2&quot; | Total<br /> ! ${{formatnum:{{#expr:38371200+205881154+150371112+125322469+89760956+62253077}}}}<br /> ! ${{formatnum:{{#expr:40000000+312106698+283000000+246030532+350842581+198866215}}}}<br /> ! $2,102,805,994<br /> ! #30<br /> ! #27<br /> ! $810.4–832.4 million<br /> !&lt;ref name=&quot;num-franchise&quot;&gt;{{cite web |title= Terminator Franchise Box Office History - The Numbers |url= https://www.the-numbers.com/movies/franchise/Terminator#tab=summary |website= [[The Numbers (website)|The Numbers]]}}&lt;/ref&gt;<br /> |-<br /> | colspan=&quot;9&quot; style=&quot;text-align:left;&quot;| {{smalldiv|'''List indicator(s)'''<br /> * A dark gray cell indicates the information is not available for the film.<br /> * &lt;sup&gt;'''(A)'''&lt;/sup&gt; indicates the adjusted totals based on current ticket prices (calculated by [[Box Office Mojo]]).}}<br /> |}<br /> <br /> ===Critical and public response===<br /> {| class=&quot;wikitable plainrowheaders&quot; style=&quot;text-align:center;&quot; <br /> |-<br /> ! Film<br /> ! [[Rotten Tomatoes]]<br /> ! [[Metacritic]]<br /> ! [[CinemaScore]]&lt;ref name=&quot;CinemaScore&quot;&gt;{{cite web |url=https://www.cinemascore.com/ |title=CinemaScore |publisher=[[CinemaScore]] |access-date=April 16, 2022 |archive-url= https://web.archive.org/web/20220413083139/https://www.cinemascore.com/ |archive-date=April 13, 2022 |url-status=live}}&lt;/ref&gt;<br /> |-<br /> ! scope=&quot;row&quot; | ''[[The Terminator]]''<br /> | 100% (8.80/10 average rating) (68 reviews)&lt;ref&gt;{{cite web | url= https://www.rottentomatoes.com/m/terminator/ | title=The Terminator | website= [[Rotten Tomatoes]] | access-date=July 27, 2023}}&lt;/ref&gt;<br /> | 84 (21 reviews)&lt;ref&gt;{{cite web|url= https://www.metacritic.com/video/titles/terminator |title=The Terminator (1984): Reviews | website= [[Metacritic]] |access-date=January 29, 2020}}&lt;/ref&gt;<br /> | {{N/A}}<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator 2: Judgment Day]]''<br /> | 91% (8.50/10 average rating) (86 reviews)&lt;ref name=&quot;rt-t2&quot;&gt;{{cite web | url= https://www.rottentomatoes.com/m/terminator_2_judgment_day/ | title=Terminator 2: Judgment Day | website= [[Rotten Tomatoes]] | access-date=July 27, 2023}}&lt;/ref&gt;<br /> | 75 (22 reviews)&lt;ref name=&quot;mc-t2&quot;&gt;{{cite web |url= https://www.metacritic.com/video/titles/terminator2 |title=Terminator 2: Judgment Day (1991): Reviews|website= Metacritic |access-date=August 24, 2021}}&lt;/ref&gt;<br /> | A+<br /> |-<br /> ! 
scope=&quot;row&quot; | ''[[Terminator 3: Rise of the Machines]]''<br /> | 70% (6.50/10 average rating) (207 reviews)&lt;ref&gt;{{cite web | url= https://www.rottentomatoes.com/m/terminator_3_rise_of_the_machines/ | title=Terminator 3: Rise of the Machines | website= [[Rotten Tomatoes]] | access-date=October 24, 2022}}&lt;/ref&gt;<br /> | 66 (41 reviews)&lt;ref&gt;{{cite web |url= https://www.metacritic.com/video/titles/terminator3 |title=''Terminator 3: Rise of the Machines'' (2003) Reviews|website= Metacritic |access-date=July 10, 2023}}&lt;/ref&gt;<br /> | B+<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator Salvation]]''<br /> | 33% (5.10/10 average rating) (281 reviews)&lt;ref&gt;{{cite web | url= https://www.rottentomatoes.com/m/terminator_4/ | title=Terminator Salvation | website= [[Rotten Tomatoes]] | access-date=July 27, 2023}}&lt;/ref&gt;<br /> | 49 (46 reviews)&lt;ref&gt;{{cite web |url= https://www.metacritic.com/film/titles/terminatorsalvation |title=Terminator Salvation (2009): Reviews|website= Metacritic |access-date=January 29, 2020}}&lt;/ref&gt;<br /> | B+<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator Genisys]]''<br /> | 26% (4.70/10 average rating) (278 reviews)&lt;ref&gt;{{cite web | url= https://www.rottentomatoes.com/m/terminator_genisys/ | title=Terminator Genisys | website= [[Rotten Tomatoes]] | access-date=July 27, 2023}}&lt;/ref&gt;<br /> | 38 (41 reviews)&lt;ref&gt;{{cite web | url= https://www.metacritic.com/movie/terminator-genisys | title=Terminator Genisys | website= [[Metacritic]] | access-date=July 10, 2023}}&lt;/ref&gt;<br /> | B+<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator: Dark Fate]]''<br /> | 70% (6.20/10 average rating) (351 reviews)&lt;ref&gt;{{cite web|url=https://www.rottentomatoes.com/m/terminator_dark_fate|title= Terminator: Dark Fate |website=[[Rotten Tomatoes]]|access-date=July 27, 2023}}&lt;/ref&gt;<br /> | 54 (51 reviews)&lt;ref&gt;{{cite web | url= https://www.metacritic.com/movie/terminator-dark-fate | title= Terminator: Dark Fate Reviews |website= [[Metacritic]] | access-date= July 10, 2023}}&lt;/ref&gt;<br /> | B+<br /> |}<br /> <br /> {| class=&quot;wikitable plainrowheaders&quot; style=&quot;text-align:center;&quot; <br /> |-<br /> ! Television<br /> ! [[Rotten Tomatoes]]<br /> ! [[Metacritic]]<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator: The Sarah Connor Chronicles]]'' &lt;small&gt;(season 1)&lt;/small&gt;<br /> | 76% (6.95/10 average rating) (34 reviews)&lt;ref&gt;{{cite web | url= https://www.rottentomatoes.com/tv/terminator_the_sarah_connor_chronicles/s01 | title=Terminator: The Sarah Connor Chronicles Season 1 | website= [[Rotten Tomatoes]]}}&lt;/ref&gt;<br /> | 74 (24 reviews)&lt;ref name=&quot;meta1&quot;&gt;{{cite web|url= https://www.metacritic.com/tv/shows/terminatorthesarahconnorchronicles|title=Terminator: The Sarah Connor Chronicles: Reviews | website= [[Metacritic]] |access-date= January 29, 2020}}&lt;/ref&gt;<br /> |-<br /> ! 
scope=&quot;row&quot; | ''[[Terminator: The Sarah Connor Chronicles]]'' &lt;small&gt;(season 2)&lt;/small&gt;<br /> | 94% (7.42/10 average rating) (16 reviews)&lt;ref&gt;{{cite web |url= https://www.rottentomatoes.com/tv/terminator_the_sarah_connor_chronicles/s02 | title= Terminator: The Sarah Connor Chronicles Season 2 | website= [[Rotten Tomatoes]]}}&lt;/ref&gt;<br /> | 67 (4 reviews)&lt;ref name=&quot;meta2&quot;&gt;{{cite web |url= https://www.metacritic.com/tv/shows/terminatorthesarahconnorchroniclesseason2 |title= Terminator: The Sarah Connor Chronicles Season Two: Reviews| website= [[Metacritic]] |access-date= January 29, 2020}}&lt;/ref&gt;<br /> |}<br /> <br /> ===Cultural impact===<br /> The ''Terminator'' franchise, most notably James Cameron's original films, ''The Terminator'' and ''Terminator 2: Judgment Day'', has had a significant impact on popular culture. The film franchise placed 17th on [[IGN]]'s list of the top 25 greatest film franchises&lt;ref&gt;{{cite web|url=https://www.ign.com/articles/2006/12/04/top-25-movie-franchises-of-all-time-17|title=Top 25 Movie Franchises of All Time: #17|work=IGN|date=December 4, 2006|accessdate=November 23, 2021}}&lt;/ref&gt; and is also among the top 30 highest-grossing franchises. According to [[Rotten Tomatoes]], the ''Terminator'' franchise is the sixth-highest-rated franchise on the site, behind the [[Toy Story (franchise)|''Toy Story'' franchise]], the ''[[Dollars Trilogy]]'', [[The Lord of the Rings (film series)|''The Lord of the Rings'' film trilogy]], the [[Mad Max (franchise)|''Mad Max'' franchise]] and the [[Star Wars|original ''Star Wars'' trilogy]], and ahead of the [[Indiana Jones|''Indiana Jones'' franchise]].<br /> <br /> In 2008, ''The Terminator'' was selected for preservation in the [[National Film Registry]] by the [[Library of Congress]] as being &quot;culturally, historically or aesthetically significant&quot;.&lt;ref&gt;{{cite news | url=https://www.sfgate.com/entertainment/amp/Library-of-Congress-adds-Terminator-to-archive-3178098.php | archive-url=https://archive.today/20120912022824/http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2008/12/30/DDB91515MB.DTL | archive-date=September 12, 2012 | work=The San Francisco Chronicle | title=Library of Congress adds 'Terminator' to archive | date=December 31, 2008|accessdate=November 23, 2021}}&lt;/ref&gt; The [[American Film Institute]] (AFI) has also recognized both films on a number of occasions: the line &quot;[[I'll be back]]&quot; from ''The Terminator'' was ranked the 37th-best movie quote, while &quot;[[Hasta la vista, baby]]&quot; from ''Terminator 2'' ranked 76th on the same list. The Terminator character from ''The Terminator'' was voted the 22nd-greatest villain; meanwhile, the T-800 (of the same likeness) in ''Terminator 2: Judgment Day'' was voted the 48th-greatest hero, making it the only time the same character has appeared on the two opposing lists. In the 100 Years...100 series list, the ''Terminator'' franchise was voted the 42nd most thrilling. 
In addition, ''Terminator 2: Judgment Day'' ranked 8th on AFI's top 10 list in the science fiction genre.&lt;ref&gt;{{cite web |url=http://connect.afi.com/site/PageServer?pagename=100YearsList |title=American Film Institute |publisher=Connect.afi.com |access-date=January 14, 2012 |url-status=dead |archive-url=https://web.archive.org/web/20110716070826/http://connect.afi.com/site/PageServer?pagename=100YearsList |archive-date=July 16, 2011 |df=mdy-all}}&lt;/ref&gt;<br /> <br /> Both films are the source of numerous pop culture references, such as the use of &quot;I'll be back&quot; in countless other media, including variations of the phrase delivered by Schwarzenegger himself in many of his subsequent films, as well as cameo appearances by Robert Patrick as the T-1000 in ''[[Last Action Hero]]'' and ''[[Wayne's World (film)|Wayne's World]]''. ''[[The Simpsons]]'' has also spoofed both films, and the T-1000 in particular, on a number of occasions.&lt;ref&gt;Season 5 Episode 16 [[Homer Loves Flanders]]&lt;/ref&gt;&lt;ref&gt;Season 2 Episode 14 [[Principal Charming]]&lt;/ref&gt;&lt;ref&gt;{{Cite web|url=http://www.duffzone.org/content.php?title=reft2|title=The Simpsons Gallery|website=www.duffzone.org|access-date=2019-12-17}}&lt;/ref&gt;<br /> <br /> ''Terminator 2'' is the only film in the series to garner attention at the Academy Awards, with six nominations and four wins,&lt;ref&gt;{{cite news| url= https://www.imdb.com/title/tt0103064/awards |title=Terminator 2: Judgment Day: Awards |access-date=March 4, 2017}}&lt;/ref&gt; and is rated highly among critics.&lt;ref name=&quot;rt-t2&quot; /&gt;&lt;ref name=&quot;mc-t2&quot; /&gt; In 2006, the readers of ''[[Total Film]]'' rated ''The Terminator'' as cinema's 72nd-best film and ''Terminator 2: Judgment Day'' the 33rd.&lt;ref&gt;{{cite web| url= https://lifevsfilm.blogspot.com/p/total-film-top-100.html | title= Total Film Top 100|via=Blogspot}}&lt;/ref&gt;<br /> <br /> The first five ''Terminator'' films performed respectably at the box office, though the series saw diminishing returns after James Cameron's departure. ''The Terminator'' made $78 million worldwide, far surpassing its $6 million budget and becoming a major [[sleeper hit]]. ''Terminator 2: Judgment Day'' grossed approximately $520 million globally, becoming a major blockbuster and the top-grossing film of 1991. ''Terminator 3: Rise of the Machines'' earned $433 million, making it the seventh highest-grossing film of 2003. ''Terminator Salvation'' grossed an estimated $371 million worldwide, a figure below industry expectations. ''Terminator Genisys'' grossed $440 million. ''Terminator: Dark Fate'' grossed approximately $261 million worldwide with an estimated loss of $130 million, becoming both the least successful film in the franchise and a [[box office bomb]] in its own right.&lt;ref name=&quot;mojo-franchise&quot;&gt;{{cite web|url=https://boxofficemojo.com/franchises/chart/?id=terminator.htm|title=Terminator |website=Box Office Mojo|publisher=[[Amazon.com]]|archive-url=https://web.archive.org/web/20191022183257/https://www.boxofficemojo.com/franchises/chart/?id=terminator.htm |archive-date=October 22, 2019}}&lt;/ref&gt;<br /> <br /> ==Music==<br /> ===Soundtracks===<br /> {| class=&quot;wikitable plainrowheaders&quot; style=&quot;text-align:center;&quot; <br /> ! Title<br /> ! U.S. release date<br /> ! Length<br /> ! Composer(s)<br /> ! Label<br /> |-<br /> ! 
scope=&quot;row&quot; | ''[[The Terminator: Original Soundtrack]]''<br /> | style=&quot;text-align:left&quot;| 1984<br /> | style=&quot;text-align:left&quot;| 35:32<br /> | rowspan=&quot;2&quot;| [[Brad Fiedel]]<br /> | [[Enigma Records]]<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator 2: Judgment Day (score)|Terminator 2: Judgment Day (Original Motion Picture Soundtrack)]]''<br /> | style=&quot;text-align:left&quot;| July 1, 1991<br /> | style=&quot;text-align:left&quot;| 53:01<br /> | rowspan=&quot;2&quot;| [[Varèse Sarabande]]<br /> |-<br /> ! scope=&quot;row&quot; | ''Terminator 3: Rise of the Machines – Original Motion Picture Soundtrack''<br /> | style=&quot;text-align:left&quot;| June 24, 2003<br /> | style=&quot;text-align:left&quot;| 51:22<br /> | [[Marco Beltrami]]<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator: The Sarah Connor Chronicles – Original Television Soundtrack]]''<br /> | style=&quot;text-align:left&quot;| December 23, 2008<br /> | style=&quot;text-align:left&quot;| 63:54<br /> | [[Bear McCreary]]<br /> | La-La Land Records<br /> |-<br /> ! scope=&quot;row&quot; | ''Terminator Salvation: Original Soundtrack''<br /> | style=&quot;text-align:left&quot;| May 19, 2009<br /> | style=&quot;text-align:left&quot;| 50:27<br /> | [[Danny Elfman]]<br /> | [[Reprise Records]]<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator Genisys (soundtrack)|Terminator Genisys: Music from the Motion Picture]]''<br /> | style=&quot;text-align:left&quot;| June 24, 2015<br /> | style=&quot;text-align:left&quot;| 75:05<br /> | [[Lorne Balfe]]<br /> | [[Skydance Media]]<br /> |-<br /> ! scope=&quot;row&quot; | ''[[Terminator: Dark Fate (soundtrack)|Terminator: Dark Fate (Music from the Motion Picture)]]''<br /> | style=&quot;text-align:left&quot;| November 1, 2019<br /> | style=&quot;text-align:left&quot;| 58:00<br /> | [[Tom Holkenborg]]<br /> | [[Paramount Music]]<br /> |}<br /> <br /> ==Other media==<br /> ===Video games===<br /> {{Main|List of Terminator video games}}<br /> Various video games have been released since 1991.<br /> <br /> ===Novels===<br /> {{Main|T2 (novel series)}}<br /> A series of novels were released from 2001 to 2004, under the name ''T2''.<br /> <br /> ===Comics===<br /> {{See also|List of Terminator comics}}<br /> <br /> ====''The Terminator'' spin-off comics====<br /> In 1988, [[NOW Comics]] published an ongoing series with John Connor as the main character in 2031, after sending Kyle Reese back to 1984 to protect his mother. The Terminators in this canon had more human-like endoskeletons and some issues would deal with subordinates of Connor's in the ruins of certain geographic areas. The seventeen issue series was followed by two limited series.&lt;ref&gt;{{cite comic | title = The Terminator | publisher = [[NOW Comics]] | date = 1988–1989 | issue = #1–17}}&lt;/ref&gt;&lt;ref&gt;{{cite comic | writer = [[Ron Fortier]] | penciller = [[Alex Ross]] | title = [[Terminator: The Burning Earth]] | publisher = [[NOW Comics]] | date = March–July 1990 | issue = #1–5}}&lt;/ref&gt;&lt;ref&gt;{{cite comic | title = Terminator: All My Futures Past | publisher = [[NOW Comics]] | date = 1990 | issue = #1–2}}&lt;/ref&gt;<br /> <br /> [[Dark Horse Comics]] acquired the rights in 1990. 
In ''The Terminator'' (with ''Tempest'' added in [[trade paperback (comics)|trade paperbacks]] to distinguish itself from other comics), a group of human soldiers and four Terminators come to the present, to stop Skynet in differing ways.&lt;ref&gt;{{cite comic | writer = [[John Arcudi]] | penciller = [[Chris Warner (comics)|Chris Warner]]| title = The Terminator | publisher = [[Dark Horse Comics]] | issue = 4 issues | date = August–November 1990}}&lt;/ref&gt; In the sequel, ''Secondary Objectives'', the surviving Terminator is reprogrammed to destroy another Terminator sent to aid him and kill Sarah Connor.&lt;ref&gt;{{cite comic | writer = [[James Dale Robinson]] | penciller=[[Paul Gulacy]]| title = The Terminator: Secondary Objectives | publisher = [[Dark Horse Comics]] | issue = 4 issues | date = July–October 1991}}&lt;/ref&gt; In its sequel, ''The Enemy Within'', a team of human assassins attempt to return to the past and kill a Skynet developer.&lt;ref&gt;{{cite comic | writer = [[Ian Edginton]] | artist = [[Vincent Giarrano]] | title = The Terminator: The Enemy Within | publisher = [[Dark Horse Comics]] | issue = 4 issues | date = November 1991 to February 1992}}&lt;/ref&gt; The 1992 ''Endgame'' concludes this arc. Human colonel Mary Randall protects Sarah Connor as she goes into labor.&lt;ref&gt;{{cite comic | writer = [[James Dale Robinson]] | penciller=[[Jackson Guice]] | title = The Terminator: Endgame | publisher = [[Dark Horse Comics]] | issue = 3 issues | date = September–November 1992}}&lt;/ref&gt;<br /> <br /> Dark Horse published a 1992 [[one-shot (comics)|one-shot]] written by [[James Dale Robinson]] and drawn by [[Matt Wagner]]. Here, a female Terminator and a resistance fighter battle for the life of a woman named Sarah Connor, but not the correct one.&lt;ref&gt;{{cite comic | writer = [[James Dale Robinson]] | artist = [[Matt Wagner]] | title = The Terminator | publisher = [[Dark Horse Comics]] |date=July 1991}}&lt;/ref&gt; The comic book had the unusual feature of a physical &quot;[[pop up book|pop-up]]&quot; in one scene.<br /> <br /> A 1993 limited series ''Hunters and Killers'', set during the war, has special Terminators created to impersonate leaders in the Russian resistance.&lt;ref&gt;{{cite comic | writer = [[Toren Smith]] | cowriters = [[Adam Warren (comics)|Adam Warren]], [[Chris Warner (comics)|Chris Warner]] | penciller = [[Bill Jaaska]] | title = The Terminator: Hunters and Killers | publisher = [[Dark Horse Comics]] | issue = 3 issues | date = March–May 1992}}&lt;/ref&gt; Another limited series, published in 1998, follows the misadventures of two malfunctioning Terminators in [[Death Valley]].&lt;ref&gt;{{cite comic | writer = [[Alan Grant (writer)|Alan Grant]] | artist = [[Guy Davis (comics)|Guy Davis]] | title = The Terminator: Death Valley | publisher = [[Dark Horse Comics]] | issue = 5 issues | date = August–December 1998}}&lt;/ref&gt; This set up the following year's comic ''The Dark Years'', set in 2030. 
In ''The Dark Years'', a Terminator is sent to eliminate John Connor and his mother in 1999.&lt;ref&gt;{{Cite comic | writer = [[Alan Grant (writer)|Alan Grant]] | penciller = [[Mel Rubi]]| copencillers = [[Trevor McCarthy]] | title = The Terminator: The Dark Years | publisher = [[Dark Horse Comics]] | issue = #1–4 | date = September–December 1999}}&lt;/ref&gt; In 2013, Dark Horse released a sequel comic based on the 2009 film ''Terminator Salvation'', entitled ''Terminator Salvation: The Final Battle''.&lt;ref&gt;{{Cite web|url=http://www.comicbookresources.com/?page=article&amp;id=46690|title=SDCC EXCLUSIVE: JMS Explores Skynet in &quot;Terminator: The Final Battle&quot;|website=Comic Book Resources|date=July 17, 2013|access-date=2016-03-11}}&lt;/ref&gt;<br /> <br /> [[Malibu Comics]] published twin series in 1995. One was a sequel to ''Terminator 2: Judgment Day'', in which Sarah and John encounter two Terminators. The other was a prequel that explains the scenario. The conclusions to the series were published in one issue.&lt;ref&gt;{{cite comic | title = Terminator 2: Judgment Day – Cybernetic Dawn | publisher = [[Malibu Comics]] | date = November 1995 to February 1996, April 1996 | issue = #1–5}}&lt;/ref&gt;&lt;ref&gt;{{cite comic | title = Terminator 2: Judgment Day – Nuclear Twilight | publisher = [[Malibu Comics]] | date = November 1995 to February 1996, April 1996 | issue = #1–5}}&lt;/ref&gt;<br /> <br /> [[Beckett Comics]] published three series to promote ''Terminator 3: Rise of the Machines'', each consisting of two issues.&lt;ref&gt;{{cite comic | writer = [[Ivan Brandon]] | penciller = [[Goran Parlov]] | title = Terminator 3: Before the Rise | publisher = [[Beckett Comics]] | issue = 2 issues | date = July and August 2003}}&lt;/ref&gt;&lt;ref&gt;{{cite comic | writer = Miles Gunter | penciller = Mike Hawthone | title = Terminator 3: Eyes of the Rise | publisher = [[Beckett Comics]] | issue = 2 issues | date = September and October 2003}}&lt;/ref&gt;&lt;ref&gt;{{cite comic | writer = Miles Gunter | penciller = [[Kieron Dwyer]] | title = Terminator 3: Fragmented | publisher = [[Beckett Comics]] | issue = 2 issues | date = November and December 2003}}&lt;/ref&gt;<br /> <br /> ''Terminator 2: Infinity'' (2007), later known simply as ''Terminator Infinity'', was a comic book series by [[Dynamite Entertainment]] set in 2033. For two issues, it was tied into another of Dynamite's publications, ''[[Painkiller Jane]]''.&lt;ref&gt;{{Comic book reference | writer = [[Simon Furman]] | title = [[Terminator 2: Infinity]] | publisher = [[Dynamite Entertainment]] | issue = #1–5 | date = July–November 2005}}&lt;/ref&gt;<br /> <br /> Dynamite's continuation, ''Terminator: Revolution'', and [[IDW Publishing]]'s ''Salvation'' tie-in comic book were both legally possible because the former was specifically based on the ''Terminator 2'' license.&lt;ref&gt;[http://www.comicbookresources.com/?page=article&amp;id=18502 Furman on Making Dynamite's Terminator Revolutionary], [[Comic Book Resources]], October 20, 2008&lt;/ref&gt;<br /> <br /> ====Crossover comics====<br /> Terminators have crossed over with ''[[RoboCop]]'', ''[[Superman]]'' and ''[[Alien vs. Predator]]''. In ''[[RoboCop versus The Terminator (comic book)|RoboCop versus The Terminator]]'' (1992) and ''[[Superman vs. 
The Terminator: Death to the Future]]'' (2000), the heroes must prevent the war-ravaged future.&lt;ref&gt;{{cite comic | writer = [[Frank Miller]] | artist = [[Walt Simonson]] | title = [[RoboCop versus The Terminator (comic book)|RoboCop versus The Terminator]] | publisher = [[Dark Horse Comics]] | issue = 4 issues | date = May–August 1992}}&lt;/ref&gt;&lt;ref&gt;{{cite comic | writer = [[Alan Grant (writer)|Alan Grant]] | penciller = [[Steve Pugh]] | title = [[Superman vs. The Terminator: Death to the Future]] | publisher = [[Dark Horse Comics]] | issue = 4 issues | date = January–March 2000}}&lt;/ref&gt;<br /> <br /> In 2000's ''[[Alien versus Predator versus The Terminator]]'' from Dark Horse, Skynet has reactivated [[Far future in fiction|farther in the future]] and is creating an [[Alien (creature in Alien franchise)|Alien]]-Terminator hybrid. [[Ellen Ripley]]'s clone (from ''[[Alien Resurrection]]'') and the [[Predator (fictional species)|Predators]] join forces to stop Skynet.&lt;ref&gt;{{cite comic | writer = [[Mark Schultz (comics)|Mark Schultz]] | penciller = [[Mel Rubi]] | title = [[Alien versus Predator versus The Terminator]] | publisher = [[Dark Horse Comics]] | issue = 4 issues | date = April–July 2000}}&lt;/ref&gt;<br /> <br /> In 2020, Dark Horse and [[IDW Publishing]] published ''[[Transformers vs. The Terminator]]'', in which the [[Autobot|Autobots]] and the [[Decepticon|Decepticons]] are antagonized by the T-800 as Skynet sends the Terminator back through time to destroy the [[Autobot|Cybertronians]] and restore the future timeline.<br /> <br /> ===Collectible card game===<br /> {{Main|The Terminator Collectible Card Game}}<br /> The Terminator Collectible Card Game was released in 2000 by [[Precedence Entertainment|Precedence]].&lt;ref name=&quot;MILLER2&quot;&gt;{{Citation |last=Miller |first=John Jackson |title=Scrye Collectible Card Game Checklist &amp; Price Guide, Second Edition | year=2003 |pages=596–597 |postscript=.}}&lt;/ref&gt;<br /> <br /> ===Role-playing game===<br /> First announced in 2020 by [[Nightfall Games]], creators of [[SLA Industries]], The Terminator RPG was released in PDF form in summer 2022, with a physical version following later in the year. The game is based on the first film and the Dark Horse Comics line of graphic novels and comics.<br /> <br /> ===Theme park attractions===<br /> {{main|T2-3D: Battle Across Time}}<br /> <br /> ''[[T2-3D: Battle Across Time]]'', a film ride based on the franchise, opened at [[Universal Studios Florida]] in 1996. The ride is presented as an alternate sequel to ''[[Terminator 2: Judgment Day]]'' and features Arnold Schwarzenegger, Linda Hamilton, Edward Furlong and Robert Patrick reprising their roles as The Terminator, Sarah Connor, John Connor and The T-1000, respectively. James Cameron was one of three directors involved with the attraction; it marked the last time he had any direct involvement with the Terminator name until ''Terminator: Dark Fate''.{{cn|date=May 2023}}<br /> <br /> [[Terminator X: A Laser Battle for Salvation]] operated at various locations beginning in 2009. [[Terminator Salvation: The Ride]] operated at California's [[Six Flags Magic Mountain]] from 2009 to 2010.{{cn|date=May 2023}}<br /> <br /> ==Canceled projects==<br /> ===''Terminator Salvation'' trilogy===<br /> In May 2007, the production rights to the ''Terminator'' series passed from the feuding [[Andrew G. Vajna]] and [[Mario Kassar]] to [[The Halcyon Company]]. 
The producers of the company hoped to start a new trilogy based on the franchise.&lt;ref&gt;{{cite news|author=B. Alan Orange|title=There Will Be a Terminator 4!|url=https://movieweb.com/there-will-be-a-terminator-4/|work=[[MovieWeb]]|date=May 9, 2007|access-date=2007-05-09}}&lt;/ref&gt; However, due to the box office failure of the fourth film and legal troubles, the ''Salvation'' trilogy was ultimately cancelled. [[William Wisher]], who co-wrote the first two films, had written material for a potential ''Terminator 5'' and ''Terminator 6'' that would follow on from the events of ''Terminator Salvation''. The two-part story would involve an element of time travel that brings back the deceased character of [[Sarah Connor (Terminator)|Sarah Connor]], allowing her to interact with [[Kyle Reese]] beyond their initial meeting in the first film. Schwarzenegger would also reprise his role for the sixth film. The films would also include new Terminator villains from Skynet. Wisher had written a 24-page [[film treatment]] for ''Terminator 5'' and a four-page concept outline for ''Terminator 6''.&lt;ref&gt;{{cite web |last=Fleming |first=Mike Jr. |title=Exclusive: Wisher's Take On 'Terminator' |url=https://deadline.com/2010/02/exclusive-wishers-take-on-terminator-5-6-25147/ |website=Deadline |access-date=November 1, 2019 |date=February 11, 2010}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last=Cresswell |first=Jackson |title=Terminator 5 and 6 Ideas From Terminator/T2 Scribe William Wisher |url=https://collider.com/terminator-5-and-6-ideas-from-terminator-t2-scribe-william-wisher/ |website=Collider |access-date=November 1, 2019 |date=February 11, 2010}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last=Eisenberg |first=Eric |title=How Terminator: Salvation's Sequels Could Have Gone Down, According To The Writer |url=https://www.cinemablend.com/news/1690829/how-terminator-salvations-sequels-could-have-gone-down-according-to-the-writer |website=CinemaBlend |access-date=November 1, 2019 |date=August 11, 2017}}&lt;/ref&gt;<br /> <br /> ===''Terminator Genisys'' trilogy===<br /> By December 2013, there were plans for ''Terminator Genisys'' to be the start of a new trilogy of films.&lt;ref name=THR&gt;{{cite news|url= https://www.hollywoodreporter.com/live-feed/new-terminator-tv-series-works-663447|title= New 'Terminator' TV series in the works|author= Lesley Goldberg|work= [[The Hollywood Reporter]]|publisher= [[Guggenheim Partners]]|date= December 6, 2013|access-date= August 1, 2019}}&lt;/ref&gt;&lt;ref&gt;{{cite web|url= https://screenrant.com/terminator-new-tv-series-movie-reboot-tie-in/|title= New 'Terminator' TV Series To Tie-In With Movie Reboot Trilogy|author= Ben Kendrick|website= Screen Rant|date= December 6, 2013|access-date= August 1, 2019}}&lt;/ref&gt; In September 2014, Paramount announced release dates for the two ''Genisys'' sequels: May 19, 2017 and June 29, 2018.&lt;ref name=Deadline2014&gt;{{cite news |title=Paramount Carves Out Dates For Next Two 'Terminator' Pics, Sets 'The Gambler' Redo For Oscar-Qualifying Run |url= https://deadline.com/2014/09/terminator-sequels-release-dates-set-830260/ |access-date= August 1, 2019 |website=Deadline Hollywood |date= September 5, 2014}}&lt;/ref&gt; ''Terminator Genisys'' producer David Ellison described the film and its intended trilogy as standalone projects based on Cameron's original ''Terminator'' films. 
Ellison stated that ''Terminator Genisys'' was neither a sequel nor a prequel to the previous films, saying: &quot;For us this is Terminator 1, this is not Terminator 5&quot;.&lt;ref name=io9&gt;{{cite news |last=Woerner |first=Meredith |title=I Stared Into The Red Eye Of The T-800 On The Terminator: Genisys Set |url=https://gizmodo.com/i-stared-into-the-red-eye-of-the-t-800-on-the-terminato-1693524469 |work=Gizmodo |date=March 25, 2015 |access-date=November 23, 2021}}&lt;/ref&gt; The sequels to ''Genisys'' were tentatively known as ''Terminator 2'' and ''Terminator 3''.&lt;ref name=io9/&gt;&lt;ref name=Deadline2014/&gt;&lt;ref name=THR2016/&gt; The two sequels were to be filmed back to back during nine months of continuous shooting.&lt;ref name=SlashVisit&gt;{{cite web |last=Sciretta |first=Peter |title=40 Things We Learned on the Set of 'Terminator: Genisys' |url=https://www.slashfilm.com/terminator-gensys-set-visit/ |website=/Film |access-date=August 1, 2019 |date=March 25, 2015}}&lt;/ref&gt;<br /> <br /> The storylines for the two sequels were devised by ''Genisys'' writers Kalogridis and Lussier.&lt;ref&gt;{{cite news |title='Terminator Genisys': Jason Clarke is a Different Type of John Connor |url=https://screenrant.com/terminator-5-genisys-jason-clarke-interview/ |work=ScreenRant |date=March 25, 2015 |access-date=August 1, 2019}}&lt;/ref&gt;&lt;ref name=io9/&gt; The trilogy was being planned out before ''Terminator Genisys'' began filming, as producers David Ellison and Dana Goldberg wanted the full storyline finished ahead of time rather than having to &quot;figure it out as you go along&quot;, stating: &quot;We spent a lot of time breaking that down, and we do know what the last line of the third movie is, should we be lucky enough to get to make it&quot;.&lt;ref name=Sciretta&gt;{{cite web |last=Sciretta |first=Peter |title=Why a Trilogy Was Planned Before Making 'Terminator Genisys;' How Will the Terminator TV Series Connect? 
|url=https://www.slashfilm.com/new-terminator-tv-series/ |website=/Film |access-date=August 1, 2019 |date=June 26, 2015}}&lt;/ref&gt; Production on the sequels was contingent on whether ''Terminator Genisys'' would be successful;&lt;ref name=Sciretta/&gt; development of the trilogy stalled in 2015 after the film's disappointing box-office performance.&lt;ref&gt;{{cite web|url=https://www.hollywoodreporter.com/news/dangers-financiers-think-they-can-827843|title=The Dangers When Financiers Think They Can Produce Movies, Too|author=Kim Masters|work=The Hollywood Reporter|date=October 2015|access-date=August 1, 2019}}&lt;/ref&gt;&lt;ref&gt;{{cite web|url=https://www.thewrap.com/thegrill-2015-terminator-genisys-producer-on-franchises-future-not-on-hold-but-re-adjusting/|title=TheGrill 2015: 'Terminator: Genisys' Producer on Franchise's Future: Not on Hold but 'Re-Adjusting' (Video)|work=TheWrap|date=October 6, 2015|access-date=August 1, 2019}}&lt;/ref&gt;&lt;ref&gt;{{cite web|url=https://deadline.com/2015/09/mission-impossible-rogue-nation-china-record-1201544867/|title='Mission: Impossible&amp;nbsp;– Rogue Nation' Becomes Highest Grossing 2D Film in China|author=Anthony D'Alessandro, Nancy Tartaglione|website=Deadline Hollywood|date=September 23, 2015|access-date=August 1, 2019}}&lt;/ref&gt; The planned sequels were ultimately cancelled,&lt;ref name=Slash/&gt; with ''Terminator 2'' being removed from Paramount's release schedule in January 2016.&lt;ref name=THR2016&gt;{{cite web|url=https://www.hollywoodreporter.com/heat-vision/paramount-takes-terminator-sequel-release-857746|title=Paramount Takes 'Terminator' Sequel Off Release Schedule|author=Pamela McClintock|date=January 20, 2016|work=The Hollywood Reporter|access-date=August 1, 2019}}&lt;/ref&gt;<br /> <br /> The new trilogy would have explained who sent Pops back in time to protect Sarah Connor.&lt;ref name=Verge&gt;{{cite news |last=Bishop |first=Brian |title=How the director of Terminator Genisys recreated James Cameron's 1984; &quot;The best compliment would be a lawsuit&quot;. |url=https://www.theverge.com/2015/7/2/8874735/terminator-genisys-movie-director-alan-taylor-interview |work=The Verge |date=July 2, 2015 |access-date=August 1, 2019}}&lt;/ref&gt; In February 2015, Schwarzenegger stated he would reprise his role as Pops for the second film in the trilogy, with filming set to begin in 2016.&lt;ref&gt;{{cite news|title=Arnold Schwarzenegger Says He'll be Back for Terminator Genisys Sequel|url=https://www.comingsoon.net/movies/news/413433-arnold-schwarzenegger-says-hell-be-back-for-terminator-genisys-sequel#/slide/1|access-date=August 1, 2019|publisher=ComingSoon.net|date=February 24, 2015}}&lt;/ref&gt; [[Jai Courtney]] and [[Matt Smith (actor)|Matt Smith]] would also reprise their respective roles as Kyle Reese and Skynet.&lt;ref&gt;{{cite web |last=Chitwood |first=Adam |title=Jai Courtney Calls Terminator: Genisys a &quot;Reset&quot; for the Franchise; Says You Don't Have to Have Seen the Previous Films to Understand It |url=https://collider.com/terminator-genisys-sequels-jai-courtney/ |website=Collider |access-date=January 2, 2020 |date=October 13, 2014}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last=Reynolds |first=Simon |title=Doctor Who's Matt Smith cast in Terminator 5, role expands in sequels |url=https://www.digitalspy.com/movies/a568512/doctor-whos-matt-smith-cast-in-terminator-5-role-expands-in-sequels/ |website=Digital Spy |access-date=January 2, 2020 |date=May 2, 2014}}&lt;/ref&gt; J. K. 
Simmons would have had further involvement in the new trilogy,&lt;ref name=Verge/&gt; and [[Dayo Okeniyi]] would have a significant role reprising his character Danny Dyson in the second film,&lt;ref name=Verge/&gt;&lt;ref name=SlashVisit/&gt; which would have focused on John Connor's life after becoming part machine. Jason Clarke said about the cancelled ''Genisys'' sequel:&lt;ref name=Slash&gt;{{Cite news|url=https://www.slashfilm.com/terminator-genisys-sequel/|title=Abandoned 'Terminator Genisys' Sequel Plot Focused on Cyborg John Connor|date=April 6, 2018|work=/Film|access-date=August 1, 2019}}&lt;/ref&gt;<br /> {{cquote|What I remember was that second one was going to be about John's journey after he was taken by Skynet…like going down to what he became; half machine, half man. That's where the second one was going to start, and that's about all I knew. It's such a bummer we didn't get to do that.}}<br /> <br /> ===''Terminator Genisys''–connected television series===<br /> By December 2013, Skydance Productions and [[Annapurna Pictures]] were developing a new ''Terminator'' television series. [[Ashley Miller (screenwriter)|Ashley Miller]] and [[Zack Stentz]], who had worked together previously on ''Terminator: The Sarah Connor Chronicles'', were named as writers and executive producers. The series was to deviate from the franchise's history at a critical moment in 1984's ''The Terminator'' and would also integrate with the then-projected film series' direct sequels to ''Terminator Genisys''.&lt;ref name=THR/&gt;&lt;ref name=Sciretta/&gt;&lt;ref&gt;{{cite web |url=https://collider.com/terminator-tv-show-details-13-episode-cable-series/ |title=TERMINATOR: TV Show Still in Development; Skydance Heads Hint at 13-Episode Cable Series |first=Matt |last=Goldberg |work=[[Collider (website)|Collider]] |publisher=[[Complex (magazine)|Complex]] |date=June 26, 2015 |access-date=July 27, 2015 |archive-url=https://web.archive.org/web/20150731024819/http://collider.com/terminator-tv-show-details-13-episode-cable-series/ |archive-date=July 31, 2015 |url-status=live}}&lt;/ref&gt;<br /> <br /> ===''Terminator: Dark Fate'' trilogy===<br /> Plans for a new ''Terminator'' film trilogy were announced in July 2017.&lt;ref&gt;{{Cite news |url=http://www.news.com.au/entertainment/movies/titanic-and-avatar-director-james-cameron-hoping-to-make-new-terminator-trilogy/news-story/27e00d478eeaf4c8b31c681a29bde13c |title=Titanic and Avatar director James Cameron hoping to make new Terminator trilogy |first=James |last=Wigney |date=July 23, 2017 |work=News Corp Australia Network |access-date=November 14, 2017 |archive-url=https://web.archive.org/web/20170922002134/http://www.news.com.au/entertainment/movies/titanic-and-avatar-director-james-cameron-hoping-to-make-new-terminator-trilogy/news-story/27e00d478eeaf4c8b31c681a29bde13c |archive-date=September 22, 2017 |url-status=live}}&lt;/ref&gt; While working on the story for ''Terminator: Dark Fate'' that year, Cameron and the writers envisioned the film as the first in the new trilogy. 
They also worked out the basic storylines for each planned film.&lt;ref name=Deadline&gt;{{cite news |last=Boucher |first=Geoff |title='Terminator: Dark Fate': James Cameron On Rewired Franchise, Possible New Trilogy |url=https://deadline.com/2019/08/terminator-dark-fate-james-cameron-on-re-wired-franchise-possible-new-trilogy-1202707063/ |access-date=September 1, 2019 |work=Deadline Hollywood |date=August 29, 2019 |archive-url=https://web.archive.org/web/20190831031105/https://deadline.com/2019/08/terminator-dark-fate-james-cameron-on-re-wired-franchise-possible-new-trilogy-1202707063/ |archive-date=August 31, 2019 |url-status=live}}&lt;/ref&gt;&lt;ref name=Chitwood&gt;{{cite web |last=Chitwood |first=Adam |title=James Cameron Says 'Terminator: Dark Fate' Begins a New Trilogy; Talks R-Rating |url=https://collider.com/terminator-dark-fate-new-trilogy/ |website=Collider |access-date=September 1, 2019 |date=August 30, 2019 |archive-url=https://web.archive.org/web/20190831045259/http://collider.com/terminator-dark-fate-new-trilogy/ |archive-date=August 31, 2019 |url-status=live}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last=Anderton |first=Ethan |title='Terminator: Dark Fate' Intended to Kick Off a New Trilogy, Because That Worked So Well Before |url=https://www.slashfilm.com/new-terminator-trilogy/ |website=/Film |access-date=September 1, 2019 |date=August 31, 2019 |archive-url=https://web.archive.org/web/20190901160939/https://www.slashfilm.com/new-terminator-trilogy/ |archive-date=September 1, 2019 |url-status=live}}&lt;/ref&gt;&lt;ref name=IGN/&gt;<br /> <br /> In October 2019, Cameron stated that sequels to ''Terminator: Dark Fate'' would further explore the relationship between humans and artificial intelligence, while stating that a resolution between the two feuding sides would be the ultimate outcome.&lt;ref name=&quot;IGN&quot;&gt;{{cite web |last=Vejvoda |first=Jim |title=How Terminator: Dark Fate Sets Up Two Sequels |url=https://www.ign.com/articles/2019/10/22/terminator-dark-fate-sequels-james-cameron-interview |website=[[IGN]] |access-date=October 25, 2019 |date=October 22, 2019 |archive-url=https://web.archive.org/web/20191025164234/https://www.ign.com/articles/2019/10/22/terminator-dark-fate-sequels-james-cameron-interview |archive-date=October 25, 2019 |url-status=live}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last=Weintraub |first=Steve 'Frosty' |title=James Cameron Reveals What Future 'Terminator' Sequels Will Explore |url=https://collider.com/what-future-terminator-sequels-will-explore/ |website=[[Collider (website)|Collider]] |access-date=October 25, 2019 |date=October 22, 2019 |archive-url=https://web.archive.org/web/20191025164235/https://collider.com/what-future-terminator-sequels-will-explore/ |archive-date=October 25, 2019 |url-status=live}}&lt;/ref&gt; That month, Schwarzenegger stated that Cameron would write the ''Terminator: Dark Fate'' sequels and that Cameron would begin work on the next film in early 2020, for release in 2022.&lt;ref&gt;{{cite web |title=Arnold Schwarzenegger |url=https://www.pressreader.com/france/la-manche-libre-saint-lo/20191019/page/139 |website=La Manche Libre |access-date=October 25, 2019 |location=[[Saint-Lô]] |language=fr |date=October 19, 2019 |archive-url=https://web.archive.org/web/20191025164235/https://www.pressreader.com/france/la-manche-libre-saint-lo/20191019/page/139 |archive-date=October 25, 2019 |url-status=live}}&lt;/ref&gt;<br /> <br /> Although the events of ''Terminator: Dark Fate'' erase Schwarzenegger's T-800 character from 
existence, Cameron did not rule out the possibility of Schwarzenegger reprising the character: &quot;Look, if we make a shit ton of money with this film [''Terminator: Dark Fate''] and the cards say that they like Arnold, I think Arnold can come back. I'm a writer. I can think of scenarios. We don't have a plan for that right now, let me put it that way&quot;.&lt;ref&gt;{{cite web |last=Weintraub |first=Steve 'Frosty' |title=Why James Cameron Didn't Want to Be on Set While 'Terminator: Dark Fate' Was Filming |url=https://collider.com/james-cameron-terminator-dark-fate-interview/ |website=[[Collider (website)|Collider]] |access-date=October 25, 2019 |date=October 22, 2019 |archive-url=https://web.archive.org/web/20191023150036/https://collider.com/james-cameron-terminator-dark-fate-interview/ |archive-date=October 23, 2019 |url-status=live}}&lt;/ref&gt; Natalia Reyes was to reprise her role for a sequel.&lt;ref&gt;{{cite web |last=Lussier |first=Germain |title=Terminator: Dark Fate's 'New Sarah Connor' Is Ready to Lead |url=https://gizmodo.com/terminator-dark-fates-new-sarah-connor-is-ready-to-lea-1839400130|website=Gizmodo |access-date=January 31, 2020 |date=October 28, 2019}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last=Davids |first=Brian |title='Terminator' Star Natalia Reyes on Training in Tom Cruise's &quot;Pain Cave&quot; and Earning Linda Hamilton's Blessing |url=https://www.hollywoodreporter.com/heat-vision/terminator-star-natalia-reyes-training-tom-cruises-pain-cave-1252127 |website=The Hollywood Reporter |access-date=January 31, 2020 |date=November 4, 2019}}&lt;/ref&gt; Hamilton stated in October that she would probably reprise her role as well,&lt;ref&gt;{{cite web |last=Shepherd |first=Jack |title=Linda Hamilton discusses revisiting Sarah Connor in Terminator: Dark Fate: &quot;It was very painful&quot; |url=https://www.gamesradar.com/terminator-dark-fate-interview-linda-hamilton-arnold-schwarzenegger/ |website=[[GamesRadar]] |access-date=October 25, 2019 |date=October 24, 2019 |at=4:55 |archive-url=https://web.archive.org/web/20191025164234/https://www.gamesradar.com/terminator-dark-fate-interview-linda-hamilton-arnold-schwarzenegger/ |archive-date=October 25, 2019 |url-status=live}}&lt;/ref&gt; although she joked that she would fake her own death to avoid appearing in it, saying that making ''Terminator: Dark Fate'' &quot;really was hard&quot; because of the physical training she had to undergo.&lt;ref&gt;{{cite web |last=Napoli |first=Jessica |title=Linda Hamilton reveals she lost so much weight for 'Terminator' sequel, production had to 'build her a new butt' |url=https://www.foxnews.com/entertainment/linda-hamilton-weight-loss-terminator-movie |website=[[Fox News]] |access-date=October 25, 2019 |date=October 16, 2019 |archive-url=https://web.archive.org/web/20191025064556/https://www.foxnews.com/entertainment/linda-hamilton-weight-loss-terminator-movie |archive-date=October 25, 2019 |url-status=live}}&lt;/ref&gt;&lt;ref&gt;{{cite web |last=Holmes |first=Adam |title=Would Linda Hamilton Return For A Terminator: Dark Fate Sequel? 
She'd Rather Fake Her Death |url=https://www.cinemablend.com/news/2483006/would-linda-hamilton-return-for-a-terminator-dark-fate-sequel-shed-rather-fake-her-death |website=CinemaBlend |access-date=October 25, 2019 |date=October 23, 2019 |archive-url=https://web.archive.org/web/20191024081024/https://www.cinemablend.com/news/2483006/would-linda-hamilton-return-for-a-terminator-dark-fate-sequel-shed-rather-fake-her-death |archive-date=October 24, 2019 |url-status=live}}&lt;/ref&gt; Hamilton later said that she would be happy not to star in another ''Terminator'' film, but she kept the possibility open, with a potential exception being that a sequel be done on a smaller scale and budget.&lt;ref&gt;{{cite news |last=Davids |first=Brian |title=Linda Hamilton &quot;Would Be Quite Happy to Never Return&quot; to 'Terminator' |url=https://www.hollywoodreporter.com/heat-vision/linda-hamilton-would-be-happy-never-return-terminator-1274687 |access-date=January 31, 2020 |work=The Hollywood Reporter |date=January 29, 2020}}&lt;/ref&gt;<br /> <br /> ''Dark Fate'' director [[Tim Miller (director)|Tim Miller]] stated in November that he did not expect to return for a sequel.&lt;ref&gt;{{cite web |last=Schager |first=Nick |title=''Terminator: Dark Fate'' Director Tim Miller Explains the Film's Massive Twist and What Is Next for the Franchise |url= https://www.esquire.com/entertainment/movies/a29656397/terminator-dark-fate-director-tim-miller-john-connor-twist-franchise-sequel/ |work=Esquire |access-date=January 31, 2020 |date=November 1, 2019}}&lt;/ref&gt; Production of a sequel was contingent on whether ''Dark Fate'' was a box-office success.&lt;ref&gt;{{cite news |last=Wakeman |first=Gregory |title=Will there be a sequel to 'Terminator: Dark Fate?' Here's what its cast and director told us |url=https://www.metro.us/entertainment/movies/will-there-be-sequel-terminator-dark-fate |access-date=January 31, 2020 |work=[[Metro New York]] |date=October 29, 2019}}&lt;/ref&gt; Following the underwhelming performance of ''Dark Fate'' at the box-office (with an estimated loss of at least $120 million), sources close to [[Skydance]] told ''[[The Hollywood Reporter]]'' that there are no plans for further films, effectively cancelling the planned ''Dark Fate'' trilogy.&lt;ref name=Ice&gt;{{cite news |first=Pamela |last=McClintock |url= https://www.hollywoodreporter.com/news/terminator-dark-fate-puts-franchise-ice-faces-120m-loss-1251926? 
|title=''Terminator: Dark Fate'' Puts Franchise on Ice, Faces $120M-plus Loss |work=[[The Hollywood Reporter]] |date=November 3, 2019 |access-date=November 3, 2019 |archive-url= https://web.archive.org/web/20191104145044/https://www.hollywoodreporter.com/news/terminator-dark-fate-puts-franchise-ice-faces-120m-loss-1251926 |archive-date=November 4, 2019 |url-status=live}}&lt;/ref&gt;<br /> <br /> ==See also==<br /> * [[Grandfather paradox]]<br /> * [[Time travel in fiction]]<br /> * [[List of the highest-grossing media franchises]]<br /> <br /> ==Notes==<br /> {{noteslist}}<br /> <br /> ==References==<br /> {{Reflist}}<br /> <br /> ==External links==<br /> * {{Amg movie|281909}}<br /> * {{cite web | url = http://www.scifiscripts.com/scripts_n_z.html | title = Scripts N-Z | publisher = (Includes Terminator-franchise scripts) SciFiScripts.com | access-date = 2014-02-15 | url-status=dead | archive-url = https://web.archive.org/web/20131229121316/http://scifiscripts.com/scripts_n_z.html | archive-date = December 29, 2013 | df = mdy-all}}<br /> * {{cite web | url = https://gizmodo.com/a-whiteboard-that-explains-terminators-entire-history-5192446| title= A Whiteboard That Explains Terminator's Entire History | publisher = Gizmodo | first= Charlie Jane | last= Anders | date= March 31, 2009 | access-date= November 23, 2021}}<br /> <br /> {{Terminator}}<br /> {{James Cameron}}<br /> {{Skydance Media}}<br /> {{Arnold Schwarzenegger}}<br /> {{Portal bar|United States|Film|Television|Comics|Video games|Science Fiction|War|1980s|1990s}}<br /> <br /> {{DEFAULTSORT:Terminator (Franchise)}}<br /> [[Category:Terminator (franchise)| ]]<br /> [[Category:Action film franchises]]<br /> [[Category:American film series]]<br /> [[Category:Apocalyptic fiction]]<br /> [[Category:Post-apocalyptic fiction]]<br /> [[Category:Fiction about artificial intelligence]]<br /> [[Category:Malware in fiction]]<br /> [[Category:Cyborgs in fiction]]<br /> [[Category:Fiction about robots]]<br /> [[Category:Biorobotics in fiction]]<br /> [[Category:Genetic engineering in fiction]]<br /> [[Category:Nanotechnology in fiction]]<br /> [[Category:Fiction about assassinations]]<br /> [[Category:Fiction about time travel]]<br /> [[Category:Science fiction film franchises]]<br /> [[Category:Sony Pictures franchises]]<br /> [[Category:Metro-Goldwyn-Mayer franchises]]<br /> [[Category:Lionsgate franchises]]<br /> [[Category:Warner Bros. 
franchises]]<br /> [[Category:Paramount Pictures franchises]]<br /> [[Category:20th Century Studios franchises]]<br /> [[Category:Mass media franchises introduced in 1984]]<br /> [[Category:Temporal war fiction]]<br /> [[Category:Film series introduced in 1984]]<br /> [[Category:Dystopian fiction]]<br /> [[Category:Post-apocalyptic literature]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Talk:Bob_Murdoch_(ice_hockey,_born_1946)&diff=1168757671 Talk:Bob Murdoch (ice hockey, born 1946) 2023-08-04T20:32:11Z <p>205.189.94.9: /* BIOGRAPHY */ what needs to be added here.</p> <hr /> <div>{{Talk header}}<br /> {{WikiProject banner shell|blp=yes|1=<br /> {{WikiProject Biography|class=Start|living=yes|sports-priority=|sports-work-group=yes|listas=Murdoch, Bob}}<br /> {{WikiProject Ice Hockey|class=Start|bio=yes|sjs=yes|unref=yes}}<br /> {{WikiProject Canada|class=Start|importance=Low|on=yes|sport=yes}}<br /> }}<br /> <br /> ==BIOGRAPHY==<br /> BOB MURDOCH has just died.<br /> 1) AS obits and tributes come in, this page should be greatly expanded to provide a fitting bio and tribute of his life in hockey.<br /> 2) Early life...From Kirkland Lake...to?? <br /> 3) Junior Hockey?<br /> 4) Academics?<br /> 5) Post-NHL Coaching--Europe?<br /> Is there anything on these teams that need to be translated and shared from another language on wiki accounts?<br /> 6) Final years, and personal life. I just viewed a very touching video from his widow and daughters on his final years in Canmore in a nursing home. Ken Dryden wrote about it in 2022.<br /> <br /> == birthday ==<br /> <br /> wondering why the 1946 birthdate was chosen over may 17, 1947. How do we know which one is correct?[[User:18abruce|18abruce]] ([[User talk:18abruce|talk]]) 12:32, 1 May 2020 (UTC)<br /> hopefully this will be clarified as his obituary is published.</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Murray_Murdoch&diff=1168752006 Murray Murdoch 2023-08-04T19:46:07Z <p>205.189.94.9: /* Personal life */ Manitoba Bisons accomplishments,</p> <hr /> <div>{{Short description|Canadian ice hockey player (1904–2001)}}<br /> {{Infobox ice hockey player<br /> | image = MurdochMurray.png<br /> | image_size = 230px<br /> | caption = Murdoch in a 1935 newspaper<br /> | position = [[Winger (ice hockey)|Left wing]]<br /> | shoots = Left<br /> | height_ft = 5<br /> | height_in = 10<br /> | weight_lb = 178<br /> | played_for = [[New York Rangers]] <br /> | birth_date = {{birth date|1904|5|19|mf=y}}<br /> | birth_place = [[Lucknow, Ontario|Lucknow]], [[Ontario]], Canada<br /> | death_date = {{death date and age|2001|5|17|1904|5|19|mf=y}}<br /> | death_place = [[Georgetown, South Carolina|Georgetown]], [[South Carolina]], U.S.<br /> | career_start = 1925<br /> | career_end = 1938<br /> }}<br /> <br /> '''John Murray Murdoch''' (May 19, 1904 – May 17, 2001) was a Canadian professional ice hockey player and coach. He played for the [[New York Rangers]] of the [[National Hockey League]] from 1926 to 1937, never missing a game in his career. With the Rangers Murdoch won the [[Stanley Cup]] twice, in [[1928 Stanley Cup Finals|1928]] and in [[1933 Stanley Cup Finals|1933]]. After his playing career he coached [[Yale Bulldogs men's ice hockey|Yale University]] from 1938 to 1965.<br /> <br /> ==Personal life==<br /> Murdoch was born in [[Lucknow, Ontario|Lucknow]], [[Ontario]] and raised in [[Edgerton, Alberta|Edgerton]], [[Alberta]]. His parents were Walter Dryden Murdoch (b. 1875) and Jennie Bell &quot;Jane&quot; Murray (b. 
1878). He received a Bachelor's degree in mathematics from the [[University of Manitoba]] where he played hockey for the [[Manitoba Bisons]] from 1921 to 1924. The Bisons won four consecutive [[Turnbull Cup]] Provincial Junior Championships, and in 1923, with Murdoch as captain, also won the [[Abbott Cup]] (Western Canada), [[Memorial Cup]] and [[Allan Cup]] for amateur hockey national titles, and were inducted into the [[Manitoba Sports Hall of Fame and Museum]] as a team in 2004.<br /> <br /> ==Hockey career==<br /> He played [[Winger (ice hockey)|left wing]] for the [[New York Rangers]] in 508 games with 84 goals and 108 assists from the Rangers' first season in the [[1926–27 NHL season]] until the [[1936–37 NHL season]]. From 1938 to 1965, he was the sixth head coach of [[Yale University]] hockey team. In 1974, he was awarded the [[Lester Patrick Trophy]] for his contribution to hockey in the [[United States]].<br /> <br /> He was the last living player from the inaugural Rangers team in 1925.&lt;ref name=&quot;Ranger Greats&quot;&gt;{{cite book|url= https://www.goodreads.com/book/show/6860998-100-ranger-greats#bookDetails |title=100 Ranger Greats: Superstars, Unsung Heroes and Colorful Characters |first1=Russ |last1=Cohen |first2=John |last2=Halligan |first3=Adam |last3=Raider |publisher=[[John Wiley &amp; Sons]] |isbn= 978-0470736197 |date=2009 |access-date=February 3, 2020|page=134}}&lt;/ref&gt;<br /> <br /> ==Awards and achievements==<br /> *[[Memorial Cup]] Championship (1923)<br /> *[[Stanley Cup]] Championships (1928 &amp; 1933)<br /> *[[Lester Patrick Trophy]] Winner (1974)<br /> *[[Hobey Baker Legends of College Hockey Award]] (1987)<br /> *Honoured Member of the [[Manitoba Hockey Hall of Fame]]<br /> * In the 2009 book ''100 Ranger Greats'', was ranked No. 39 all-time of the [[List of New York Rangers players|901 New York Rangers]] who had played during the team's first [[2008–09 New York Rangers season|82 seasons]]&lt;ref name=&quot;Ranger Greats&quot;/&gt;<br /> <br /> ==Family Links==<br /> John Murray Murdoch has several relationships with NHL players:<br /> <br /> [[Dave Dryden]] and [[Ken Dryden]] are his first cousins twice removed. J. Murray Murdoch's parents were Jane Murray and Walter Murdoch (b 1875). Walter's half sister Maggie Murdoch (1855-1926) married Andrew Dryden (1849-1922). Their great grandsons are Dave and Ken Dryden.<br /> <br /> [[Mark Messier]] and [[Paul Messier (ice hockey)|Paul Messier]] are related by marriage through Murray Murdoch's wife, Marie Heinrich. Marie was the daughter of George Heinrich and Ina Dea (d 1936). Ina's brother John Dea (d 1943 in World War II) married Alice Dodd Stiles (1911-1999). Their grandsons are the Messier brothers.<br /> <br /> [[Billy Dea]] is also related by marriage. Ina Dea and John Dea's brother [[Howard Dea]] is [[Billy Dea]]'s father, and also played professional hockey. Another one of Dea's siblings, Christine, married Murray Murdoch's uncle (his father, Walter's brother), Lovell Steele Murdoch (1881-1963) - their children being Murray Murdoch's cousins. 
Former Ranger [[Don Murdoch]] and former California Golden Seal, Cleveland Baron and St Louis Blue [[Bob Murdoch (ice hockey, born 1954)]] are grandsons of Lovell Steele Murdoch and Christine Dea Murdoch, and distant cousins of Murray, .&lt;ref name=&quot;Don Murdoch&quot;&gt;{{cite web|title=Don Murdoch|url=http://rangers.nhl.com/club/atrplayer.htm?id=8449735|publisher=New York Rangers|accessdate=12 December 2015|archive-url=https://web.archive.org/web/20151222092918/http://rangers.nhl.com/club/atrplayer.htm?id=8449735|archive-date=22 December 2015|url-status=dead}}&lt;/ref&gt;<br /> <br /> ==Career statistics==<br /> <br /> ===Regular season and playoffs===<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; style=&quot;text-align:center; width:60em&quot;<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; bgcolor=&quot;#ffffff&quot;|<br /> ! rowspan=&quot;100&quot; bgcolor=&quot;#ffffff&quot;|<br /> ! colspan=&quot;5&quot;|[[Regular season]]<br /> ! rowspan=&quot;100&quot; bgcolor=&quot;#ffffff&quot;|<br /> ! colspan=&quot;5&quot;|[[Playoffs]]<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! [[Season (sports)|Season]]<br /> ! Team<br /> ! League<br /> ! GP !! [[Goal (ice hockey)|G]] !! [[Assist (ice hockey)|A]] !! [[Point (ice hockey)|Pts]] !! [[Penalty (ice hockey)|PIM]]<br /> ! GP !! G !! A !! Pts !! PIM<br /> |-<br /> | 1921–22<br /> | [[Manitoba Bisons|University of Manitoba]]<br /> | WJrHL<br /> | — || — || — || — || —<br /> | — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1922 Memorial Cup|1921–22]]<br /> | University of Manitoba<br /> | [[Memorial Cup|M-Cup]]<br /> | — || — || — || — || —<br /> | 2 || 2 || 0 || 2 || 0<br /> |-<br /> | 1922–23<br /> | University of Manitoba<br /> | WJrHL<br /> | — || — || — || — || —<br /> | — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1923 Memorial Cup|1922–23]]<br /> | University of Manitoba<br /> | M-Cup<br /> | — || — || — || — || —<br /> | 8 || 26 || 4 || 30 || 2<br /> |-<br /> | 1923–24<br /> | University of Manitoba<br /> | [[Manitoba Hockey League|MHL]]<br /> | 8 || 9 || 5 || 14 || 0<br /> | 1 || 0 || 1 || 1 || 0<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | 1924–25<br /> | Winnipeg Tiger Falcons<br /> | MHL<br /> | 18 || 12 || 2 || 14 || 2<br /> | — || — || — || — || —<br /> |-<br /> | 1925–26<br /> | [[Winnipeg Maroons (ice hockey)|Winnipeg Maroons]]<br /> | [[American Hockey Association (1926–1942)|CHL]]<br /> | 34 || 9 || 2 || 11 || 12<br /> | 5 || 0 || 1 || 1 || 0<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1926–27 NHL season|1926–27]]<br /> | [[New York Rangers]]<br /> | [[National Hockey League|NHL]]<br /> | 44 || 6 || 4 || 10 || 12<br /> | 2 || 0 || 0 || 0 || 0<br /> |-<br /> | [[1927–28 NHL season|1927–28]]<br /> | New York Rangers<br /> | NHL<br /> | 44 || 7 || 3 || 10 || 16<br /> | 9 || 2 || 1 || 3 || 12<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1928–29 NHL season|1928–29]]<br /> | New York Rangers<br /> | NHL<br /> | 44 || 8 || 6 || 14 || 22<br /> | 6 || 0 || 0 || 0 || 2<br /> |-<br /> | [[1929–30 NHL season|1929–30]]<br /> | New York Rangers<br /> | NHL<br /> | 44 || 13 || 12 || 25 || 22<br /> | 4 || 3 || 0 || 3 || 6<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1930–31 NHL season|1930–31]]<br /> | New York Rangers<br /> | NHL<br /> | 44 || 7 || 7 || 14 || 6<br /> | 4 || 0 || 2 || 2 || 0<br /> |-<br /> | [[1931–32 NHL season|1931–32]]<br /> | New York Rangers<br /> | NHL<br /> | 48 || 5 || 17 || 22 || 32<br /> | 7 || 0 || 2 || 2 || 
2<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1932–33 NHL season|1932–33]]<br /> | New York Rangers<br /> | NHL<br /> | 48 || 5 || 11 || 16 || 23<br /> | 8 || 3 || 4 || 7 || 2<br /> |-<br /> | [[1933–34 NHL season|1933–34]]<br /> | New York Rangers<br /> | NHL<br /> | 48 || 17 || 10 || 27 || 29<br /> | 2 || 0 || 0 || 0 || 0<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1934–35 NHL season|1934–35]]<br /> | New York Rangers<br /> | NHL<br /> | 48 || 14 || 11 || 25 || 6<br /> | 4 || 0 || 2 || 2 || 4<br /> |-<br /> | [[1935–36 NHL season|1935–36]]<br /> | New York Rangers<br /> | NHL<br /> | 48 || 2 || 9 || 11 || 9<br /> | — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1936–37 NHL season|1936–37]]<br /> | New York Rangers<br /> | NHL<br /> | 48 || 0 || 14 || 14 || 16<br /> | 9 || 1 || 1 || 2 || 0<br /> |-<br /> | [[1937–38 AHL season|1937–38]]<br /> | [[Philadelphia Ramblers]]<br /> | [[American Hockey League|IAHL]]<br /> | 44 || 4 || 9 || 13 || 4<br /> | 5 || 0 || 1 || 1 || 4<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; | NHL totals<br /> ! 508 !! 84 !! 104 !! 188 !! 193<br /> ! 55 !! 9 !! 12 !! 21 !! 28<br /> |}<br /> <br /> ==Head coaching record==<br /> {{CBB Yearly Record Start<br /> |type=coach<br /> |conference=<br /> |postseason=<br /> |poll=no<br /> }}<br /> {{CIH yearly record subhead<br /> |name = [[Yale Bulldogs men's ice hockey|{{color|white|Yale Bulldogs}}]]<br /> |color = color:white; background:#00356B<br /> |startyear = 1938<br /> |conflong = NCAA Division I independent schools (ice hockey)<br /> |conference = Independent<br /> |endyear = 1961<br /> |}}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1938–39 United States collegiate men's ice hockey season|1938–39]]<br /> | name = Yale<br /> | overall = 9-10-1<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = national<br /> | season = [[1939–40 United States collegiate men's ice hockey season|1939–40]]<br /> | name = Yale<br /> | overall = 10-6-4<br /> | conference = <br /> | confstanding = <br /> | postseason = East Intercollegiate Champion<br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1940–41 United States collegiate men's ice hockey season|1940–41]]<br /> | name = Yale<br /> | overall = 11-4-2<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1941–42 United States collegiate men's ice hockey season|1941–42]]<br /> | name = Yale<br /> | overall = 13-4-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1942–43 United States collegiate men's ice hockey season|1942–43]]<br /> | name = Yale<br /> | overall = 8-5-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1943–44 United States collegiate men's ice hockey season|1943–44]]<br /> | name = Yale<br /> | overall = 3-2-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1944–45 United States collegiate men's ice hockey season|1944–45]]<br /> | name = Yale<br /> | overall = 2-4-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record 
Entry<br /> | championship = national<br /> | season = [[1945–46 United States collegiate men's ice hockey season|1945–46]]<br /> | name = Yale<br /> | overall = 6-2-0<br /> | conference = <br /> | confstanding = <br /> | postseason = East Intercollegiate co-Champion<br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1946–47 United States collegiate men's ice hockey season|1946–47]]<br /> | name = Yale<br /> | overall = 15-6-1<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1947–48 NCAA Division I men's ice hockey season|1947–48]]<br /> | name = Yale<br /> | overall = 8-11-1<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1948–49 NCAA Division I men's ice hockey season|1948–49]]<br /> | name = Yale<br /> | overall = 9-13-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1949–50 NCAA Division I men's ice hockey season|1949–50]]<br /> | name = Yale<br /> | overall = 12-6-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1950–51 NCAA Division I men's ice hockey season|1950–51]]<br /> | name = Yale<br /> | overall = 14-2-1<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1951–52 NCAA Division I men's ice hockey season|1951–52]]<br /> | name = Yale<br /> | overall = 17-8-0<br /> | conference = <br /> | confstanding = <br /> | postseason = [[1952 NCAA Division I Men's Ice Hockey Tournament|NCAA Consolation Game (Win)]]<br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1952–53 NCAA Division I men's ice hockey season|1952–53]]<br /> | name = Yale<br /> | overall = 12-8-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1953–54 NCAA Division I men's ice hockey season|1953–54]]<br /> | name = Yale<br /> | overall = 11-5-3<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1954–55 NCAA Division I men's ice hockey season|1954–55]]<br /> | name = Yale<br /> | overall = 8-12-2<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1955–56 NCAA Division I men's ice hockey season|1955–56]]<br /> | name = Yale<br /> | overall = 9-9-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1956–57 NCAA Division I men's ice hockey season|1956–57]]<br /> | name = Yale<br /> | overall = 10-15-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1957–58 NCAA Division I men's ice hockey season|1957–58]]<br /> | name = Yale<br /> | overall = 8-12-2<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1958–59 NCAA Division I men's ice hockey 
season|1958–59]]<br /> | name = Yale<br /> | overall = 11-9-1<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1959–60 NCAA Division I men's ice hockey season|1959–60]]<br /> | name = Yale<br /> | overall = 10-15-0<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1960–61 NCAA Division I men's ice hockey season|1960–61]]<br /> | name = Yale<br /> | overall = 12-12-1<br /> | conference = <br /> | confstanding = <br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Subtotal<br /> | name = Yale<br /> | overall = 228-180-19<br /> | confrecord = <br /> }}<br /> {{CIH yearly record subhead<br /> |name = {{color|white|Yale Bulldogs}}<br /> |color = color:white; background:#00356B<br /> |startyear = 1961<br /> |conference = ECAC Hockey<br /> |endyear = 1965<br /> |}}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1961–62 NCAA Division I men's ice hockey season|1961–62]]<br /> | name = Yale<br /> | overall = 8-16-0<br /> | conference = 7-14-0<br /> | confstanding = 21st<br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1962–63 NCAA Division I men's ice hockey season|1962–63]]<br /> | name = Yale<br /> | overall = 12-9-1<br /> | conference = 11-9-0<br /> | confstanding = 11th<br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1963–64 NCAA Division I men's ice hockey season|1963–64]]<br /> | name = Yale<br /> | overall = 4-18-0<br /> | conference = 4-16-0<br /> | confstanding = 27th<br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Entry<br /> | championship = <br /> | season = [[1964–65 NCAA Division I men's ice hockey season|1964–65]]<br /> | name = Yale<br /> | overall = 11-12-0<br /> | conference = 8-12-0<br /> | confstanding = t-9th<br /> | postseason = <br /> }}<br /> {{CBB Yearly Record Subtotal<br /> | name = Yale<br /> | overall = 35-55-1<br /> | confrecord = 30-51-0<br /> }}<br /> {{CBB Yearly Record End<br /> |overall = 263-235-20<br /> |confrecord = 241-181-19<br /> }}<br /> &lt;ref&gt;{{cite news|title=Yale Bulldogs Men's Ice Hockey|url=http://www.yalebulldogs.com/sports/m-hockey/index|publisher=Yale Bulldogs|accessdate=2014-08-04}}&lt;/ref&gt;<br /> <br /> ==References==<br /> {{reflist|2}}<br /> <br /> ==External links==<br /> *{{icehockeystats|legends=13804}}<br /> <br /> {{s-start}}<br /> {{s-ach}}<br /> {{succession box | before = [[Amo Bessone]] | title = [[Hobey Baker Legends of College Hockey Award]] | years = 1987 | after = [[Fido Purpur]]}}<br /> {{s-end}}<br /> <br /> {{Yale Bulldogs men's ice hockey navbox}}<br /> <br /> {{DEFAULTSORT:Murdoch, Murray}}<br /> [[Category:1904 births]]<br /> [[Category:2001 deaths]]<br /> [[Category:Canadian ice hockey coaches]]<br /> [[Category:Canadian ice hockey left wingers]]<br /> [[Category:Ice hockey people from Alberta]]<br /> [[Category:Lester Patrick Trophy recipients]]<br /> [[Category:Manitoba Bisons ice hockey players]]<br /> [[Category:New York Rangers players]]<br /> [[Category:Philadelphia Ramblers players]]<br /> [[Category:Stanley Cup champions]]<br /> [[Category:University of Manitoba alumni]]<br /> [[Category:Winnipeg Maroons players]]<br /> [[Category:Yale Bulldogs men's ice hockey coaches]]<br /> [[Category:Canadian expatriate ice hockey players in the United States]]</div> 
205.189.94.9 https://en.wikipedia.org/w/index.php?title=Manitoba_Bisons&diff=1168749682 Manitoba Bisons 2023-08-04T19:27:49Z <p>205.189.94.9: /* Men's ice hockey */ Murray Murdoch quite a bio on wikipedia!!</p> <hr /> <div>{{short description|Athletic teams that represent the University of Manitoba}}<br /> {{refimprove|date=September 2016}}<br /> {{Infobox college athletics<br /> | name = Manitoba Bisons<br /> | logo = Manitoba Bisons Logo.svg<br /> | logo_width = 175<br /> | university = [[University of Manitoba]]<br /> | association = [[U Sports]]<br /> | conference = [[Canada West Universities Athletic Association]]<br /> | director = Gene Muller<br /> | location = [[Winnipeg]], [[Manitoba]]<br /> | stadium = [[IG Field]]<br /> | arena = [[Max Bell Centre (Winnipeg)|Max Bell Centre]]<br /> | arena2 = [[Investors Group Athletic Centre]]<br /> | othersite label = Other stadiums<br /> | othersite = [[University Stadium (Winnipeg)]]<br /> | mascot = Billy the Bison<br /> | nickname = Bisons<br /> | colour1= Brown<br /> | colour2= Gold <br /> | hex1= 562E18<br /> | hex2= B6985E<br /> | pageurl = http://gobisons.ca<br /> | fightsong = &quot;[[University_of_Manitoba#School_song|Brown and Gold]]&quot;<br /> }}<br /> <br /> The '''Manitoba Bisons''' are the athletic teams that represent the [[University of Manitoba]] in [[Winnipeg, Manitoba]], Canada. The football team plays their games at [[Investors Group Field]]. The soccer team play their home games at the University of Manitoba Soccer Fields while the track and field teams use the [[University Stadium (Winnipeg)|University Stadium]] as their home track. The University has 18 different teams in 10 sports: basketball, curling, cross country running, Canadian football, golf, ice hockey, soccer, swimming, track &amp; field, and volleyball.<br /> <br /> ==Varsity sports==<br /> ===Ice hockey===<br /> ====Men's ice hockey====<br /> The Bisons iced a [[junior ice hockey]] team in the [[Manitoba Junior Hockey League]]. The Bisons won four consecutive [[Turnbull Cup]]s as Manitoba junior champions in 1922, 1923, 1924, and 1925.{{cn|date=November 2019}}<br /> <br /> The 1923 Bisons team won the [[Allan Cup]], [[Memorial Cup]] and [[Abbott Cup]], and were inducted into the [[Manitoba Hockey Hall of Fame]].{{cn|date=November 2019}} The roster included J.A. Wise (Forward), C.E. Williams (Sub Forward), C.S. Doupe (Sub Goal), F. Robertson (Sub Defence), R.E. Moulden (Forward), A.I. Chapman (Defence), [[Blake Watson]] (Forward), [[Murray Murdoch]] (Captain &amp; Centre), A.T. Puttee (Goal), J. Mitchell (Forward), A. Johnson (Defence), S.B. Field (Secretary/Treasurer), R.L. Bruce (Manager), H. Andrews (President), Hal Moulden (Coach), Walter Robertson (Trainer).{{cn|date=November 2019}}<br /> <br /> The school's [[senior ice hockey]] team won the [[1931 World Ice Hockey Championships]] playing as the [[University of Manitoba Grads]], and were inducted into the [[Manitoba Hockey Hall of Fame]] in the team category.{{cn|date=November 2019}} The roster included Sammy McCallum, Gordon MacKenzie, [[Blake Watson]], Art Puttee, Frank Morris, George Hill, Ward McVey, Jack Pidcock, Guy &quot;Weary&quot; Williamson.{{cn|date=November 2019}}<br /> <br /> In December 1934, the university appealed to [[W. A. 
Fry]] and the [[Amateur Athletic Union of Canada]] regarding a decision by the [[Hockey Manitoba|Manitoba Amateur Hockey Association]] (MAHA) which did not require university students be released from a private club team to play for the school team.&lt;ref&gt;{{cite news|title=Varsity Will Appeal Case to Amateur Body|date=December 10, 1934|newspaper=Winnipeg Tribune|location=Winnipeg, Manitoba|page=10|url=https://newspaperarchive.com/sports-clipping-dec-10-1934-1361537/}}{{free access}}&lt;/ref&gt; Fry agreed with the university, stating that students are under the jurisdiction of the school unless released by the school to play for a club team. He also stated that AAU of C rulings should be respected by affiliated organizations, such as the MAHA.&lt;ref&gt;{{cite news|title=Fry States Rulings Must Be Respected|date=December 12, 1934|newspaper=Winnipeg Tribune|location=Winnipeg, Manitoba|page=12|url=https://newspaperarchive.com/sports-clipping-dec-12-1934-1361544/}}{{free access}}&lt;/ref&gt;<br /> <br /> The 1965 Bisons won the [[David Johnston University Cup]] as the [[U Sports|Canadian Interuniversity Athletics Union]] champions, and were also inducted into the [[Manitoba Hockey Hall of Fame]].{{cn|date=November 2019}}<br /> <br /> =====NHL alumni=====<br /> List of National Hockey League alumni of the Bisons:{{cn|date=February 2020}}<br /> <br /> {{columns-list|colwidth=20em|<br /> *[[Clint Albright]]<br /> *[[Andy Blair (ice hockey)|Andy Blair]]<br /> *[[Art Chapman]]<br /> *[[Tom Cook (ice hockey)|Tom Cook]]<br /> *[[Jimmy Creighton]]<br /> *[[Stu Grimson]]<br /> *[[George Maneluk]]<br /> *[[Morris Mott]]<br /> *[[Murray Murdoch]]<br /> *[[Don Raleigh]]<br /> *[[Mike Ridley]]<br /> *[[Gus Rivers]]<br /> *[[Jack Ruttan]]<br /> *[[Wilfie Starr]]<br /> *[[Ron Talakoski]]<br /> }}<br /> <br /> =====Other notable people=====<br /> *[[Wayne Fleming]], National Hockey League coach, and Manitoba Bisons coach{{cn|date=February 2020}}<br /> *[[Bob Lowes]], Two-time Canadian Hockey League Coach of the Year&lt;ref&gt;{{cite web|url=http://news.umanitoba.ca/u-of-ms-golden-knights/|title=U of M's Golden Knights|last=Reid|first=Chris|date=2018-05-11|website=UM Today|access-date=2018-08-24}}&lt;/ref&gt;<br /> *[[Claude C. Robinson]], Canadian ice hockey and sports executive, inductee into the [[Hockey Hall of Fame]] and the [[Canadian Olympic Hall of Fame]]&lt;ref&gt;{{cite news|title=Has Control of Allan Cup Games|date=March 3, 1917|newspaper=[[The Winnipeg Tribune]]|location=Winnipeg, Manitoba|page=25|url=https://newspaperarchive.com/sports-clipping-mar-03-1917-3355908/}}{{free access}}&lt;/ref&gt;<br /> *[[Barry Trotz]], 1994 [[Calder Cup]] and [[2018 Stanley Cup playoffs|2018 Stanley Cup champion]] head coach, two-time [[Jack Adams Award]] winner<br /> <br /> ====Women's ice hockey====<br /> {{main|Manitoba Bisons women's ice hockey}}<br /> <br /> ===Football===<br /> {{main|Manitoba Bisons football}}<br /> <br /> The Bisons football program includes one of only four [[U Sports football]] teams to have won back-to-back [[Vanier Cup]] championships, having won in 1969 and 1970. 
In total, the Bisons have won three Vanier Cup national championships and 11 [[Hardy Trophy]] conference championships.{{cn|date=February 2020}}<br /> <br /> '''Notable players'''<br /> *[[Israel Idonije]], Nigerian-Canadian professional American football defensive end, primarily for the Chicago Bears of the National Football League.<br /> *[[David Onyemata]], Nigerian-Canadian professional American football defensive tackle for the New Orleans Saints of the National Football League (NFL 2016)&lt;ref&gt;{{cite web |title=Former Manitoba Bison David Onyemata still turning heads in NFL – Winnipeg |url=https://www.startribunemag.com/former-manitoba-bison-david-onyemata-still-turning-heads-in-nfl-winnipeg/ |website=Startribunemag |publisher=Global News |access-date=26 November 2016}}&lt;/ref&gt;<br /> <br /> === Soccer ===<br /> The Manitoba Bisons women's soccer team plays in the [[Canada West Universities Athletic Association]].{{cn|date=February 2020}}<br /> <br /> == Notable alumni ==<br /> {{sort list|asc|2=<br /> *[[Garth Pischke]], head coach of Manitoba Bisons men's volleyball{{cn|date=February 2020}}<br /> *[[Taylor Pischke]], Canadian beach volleyball player{{cn|date=February 2020}}<br /> *[[Venla Hovi]], Finnish Olympic medallist{{cn|date=February 2020}}<br /> }}<br /> *[[Dalima Chhibber]], Indian soccer player&lt;ref&gt;{{cite web|url=https://gobisons.ca/news/2022/8/29/general-after-helping-her-home-country-india-national-team-star-dalima-chhibber-back-with-bisons-soccer-in-2022.aspx|title=After helping her home country, India national team star Dalima Chhibber back with Bisons soccer in 2022|publisher=Manitoba Bisons Soccer ([[University of Manitoba]])|website=gobisons.ca|date=29 August 2022|access-date=3 September 2022|first1=Mike|last1=Still|first2=Braedan|last2=Willis|location=Winnipeg, Manitoba|language=en-CA|archive-url=https://ghostarchive.org/archive/3YjEF|archive-date=3 September 2022}}&lt;/ref&gt;<br /> * [[Gordon Orlikow]] (b.
1960), [[decathlon]], [[heptathlon]], and hurdles competitor, [[Athletics Canada]] Chairman, [[Canadian Olympic Committee]] member, [[Korn/Ferry International]] partner; competed for the Manitoba Bisons in track and field, and is honored on the Bisons Walkway of Honour.&lt;ref&gt;{{Cite web|url=https://gobisons.ca/sports/2021/5/5/bisons-walkway-of-honour.aspx|title=Bisons Walkway of Honour|website=University of Manitoba Athletics}}&lt;/ref&gt;<br /> <br /> ==Awards and honours==<br /> *2020 [[Lieutenant Governor Athletic Awards]]: Kelsey Wog, Swimming&lt;ref&gt;{{Cite news|url=https://news.umanitoba.ca/kelsey-wog-wins-2020-u-sports-female-athlete-of-the-year/|title=Kelsey Wog wins 2020 U SPORTS Female Athlete of the Year|website=umanitoba.ca/|date=2020-06-26|access-date=2021-07-23|language=en}}&lt;/ref&gt;<br /> ===Athletes of the Year===<br /> {|class=&quot;wikitable sortable&quot; width=&quot;50%&quot;<br /> |- align=&quot;center&quot; style=&quot; background:#B6985E;color:#562E18;&quot;<br /> | '''Year''' || '''Female Athlete''' || '''Sport''' || '''Male Athlete''' || '''Sport''' <br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> | 2008-09||Stacey Corfield || Hockey || Quin Ferguson||Track and Field<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> | 2009-10||[[Desiree Scott]] || Soccer || Steve Christie ||Hockey<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |2011-12 || Addie Miles|| Hockey || Dane Pischke ||Volleyball<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |2012-13 ||Rachel Cockrell || Volleyball || Blair Macaulay || Hockey<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |2013-14||Brittany Habing ||Volleyball|| Anthony Coombs ||Football<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |2014-15 ||Rachel Cockrell || Volleyball || Al-Haji Mansaray || Track and Field<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |2016-17|| Lauryn Keen || Hockey ||Devren Dear||Volleyball<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |2017-18 || [[Venla Hovi]]&lt;ref&gt;{{Cite news|url=https://gobisons.ca/news/2018/3/24/general-venla-hovi-and-justus-alleyn-selected-as-the-2017-18-bison-sports-athletes-of-the-year.aspx|title=Venla Hovi and Justus Alleyn selected as the 2017-18 Bison Sports Athletes of the Year|website=gobisons.ca/|date=March 24, 2018|access-date=May 13, 2021|language=en}}&lt;/ref&gt; || Ice hockey || Justus Alleyn || Basketball <br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |2018-19||[[Kelsey Wog]]||Swimming|| Simon Bérubé ||Track and Field<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |2019-20&lt;ref&gt;{{Cite news|url=https://gobisons.ca/sports/2020/3/27/2020-brown-and-gold-awards.aspx?path=general|title=2020 Brown and Gold Awards|website=gobisons.ca/| date=2020-03-27|access-date=2021-07-24|language=en}}&lt;/ref&gt; || [[Kelsey Wog]] ||Swimming||Rashawn Browne||Basketball<br /> |- align=&quot;center&quot; bgcolor=&quot;&quot;<br /> |}<br /> <br /> ===Canada West Hall of Fame===<br /> *Colleen Dufresne, Basketball Coach: [[Canada West Universities Athletic Association|Canada West]] Hall of Fame - 2019 Inductee &lt;ref&gt;{{Cite news|url=https://www.canadawesthalloffame.org/post/coleen-dufresne-wbb-coach|title=Coleen Dufresne WBB Coach|website=canadawesthalloffame.org/|date=2019-10-31|access-date=2021-07-23|language=en}}&lt;/ref&gt;<br /> *[[Desiree Scott]], Soccer: [[Canada West Universities Athletic Association|Canada West]] Hall of Fame - 2019 
Inductee &lt;ref&gt;{{Cite news|url= https://www.canadawesthalloffame.org/post/desiree-scott-wsoc-student-athlete|title=Desiree Scott (WSOC Student-athlete)|website=canadawesthalloffame.org/|date=2019-09-03|access-date=2021-08-25|language=en}}&lt;/ref&gt;<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==External links==<br /> * {{Official website|http://gobisons.ca/}}<br /> <br /> {{Navboxes<br /> |titlestyle = {{CollegePrimaryStyle|Manitoba Bisons|color=white}}<br /> |list =<br /> {{Manitoba Sports}}<br /> {{Canada West Universities Athletic Association}}<br /> {{U Sports men's ice hockey}}<br /> {{U Sports soccer}}<br /> {{U Sports volleyball}}<br /> }}<br /> <br /> [[Category:Manitoba Bisons| ]]<br /> [[Category:U Sports teams]]<br /> [[Category:Ice hockey teams representing Canada internationally]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Lawnie_Wallace&diff=1168748995 Lawnie Wallace 2023-08-04T19:23:19Z <p>205.189.94.9: </p> <hr /> <div>{{short description|Canadian country singer (born 1977)}}<br /> {{BLP sources|date=January 2017}}<br /> {{Infobox musical artist<br /> | name = Lawnie Wallace<br /> | image = <br /> | background = solo_singer<br /> | birth_name = <br /> | alias = <br /> | birth_date = {{birth date and age|1977|4|15}}<br /> | origin = [[Whitchurch-Stouffville|Stouffville]], Ontario, Canada<br /> | instrument = [[Vocals]]<br /> | genre = [[Country music|Country]]<br /> | occupation = Singer, songwriter<br /> | years_active = 1994-1996; 2015–present<br /> | label = [[MCA Records|MCA Canada]]<br /> | associated_acts = <br /> | website = {{url|www.lawniewallace.com}}<br /> }}<br /> '''Lawnie Wallace''' is a Canadian [[Country music|country]] singer. Wallace recorded one studio album for [[MCA Records|MCA Canada]], 1995's ''Thought I Was Dreaming''. Four singles from the album charted on the ''[[RPM (magazine)|RPM]]'' Country Tracks chart in Canada, including the number 8-peaking title track.&lt;ref&gt;{{Cite web |url=http://www.collectionscanada.gc.ca/rpm/028020-110.01-e.php?PHPSESSID=qcl0ndp69of2tclt1feufsnbj4&amp;q1=lawnie+wallace&amp;q2=Country+Singles&amp;interval=24&amp;x=0&amp;y=0 |title=RPM: Lawnie Wallace |website=www.collectionscanada.gc.ca |publisher=Library and Archives Canada}}&lt;/ref&gt;<br /> <br /> ==Biography==<br /> {{BLP unsourced section|date=January 2017}}<br /> Wallace was born in [[Stouffville, Ontario|Stouffville]], [[Ontario]], Canada. At age seven she began performing and writing songs. At the age of fifteen Wallace signed a publishing deal with TMP Publishing, and an artist development deal with Warner Chappell. After relocating to Nashville she began co-writing songs.{{Who|date=January 2017}}<br /> <br /> Before the age of seventeen Wallace had signed a major record deal with MCA Records and began touring to promote her first album ''Thought I was Dreaming''. Following this tour she took a break from the stage, working with students wanting to get into the Radio and TV business. In 2015 Wallace released her second album, ''The Lost Years'', and has returned to performing live with her band The Chosen Ones. (Lawnie is currently promoting their art at https://www.instagram.com/art_by_lawnie/?hl=en)<br /> <br /> 
==Discography==<br /> <br /> ===Albums===<br /> {| class=&quot;wikitable plainrowheaders&quot; style=&quot;text-align:center;&quot;<br /> |-<br /> ! style=&quot;width:14em;&quot;| Title<br /> ! style=&quot;width:18em;&quot;| Album details<br /> |-<br /> ! scope=&quot;row&quot;| ''Thought I Was Dreaming''<br /> |<br /> *Release date: 1995<br /> *Label: [[MCA Records|MCA Canada]]<br /> |-<br /> ! scope=&quot;row&quot;| ''The Lost Years''<br /> |<br /> *Release date: 2015<br /> |}<br /> <br /> ===Singles===<br /> {| class=&quot;wikitable plainrowheaders&quot; style=text-align:center;<br /> ! scope=&quot;col&quot; rowspan=&quot;2&quot; | Year<br /> ! scope=&quot;col&quot; rowspan=&quot;2&quot; style=&quot;width:14em;&quot; | Title<br /> ! scope=&quot;col&quot;| Peak positions<br /> ! scope=&quot;col&quot; rowspan=&quot;2&quot; | Album<br /> |- style=&quot;font-size:smaller;&quot;<br /> ! style=&quot;width:6.5em&quot;|[[RPM (magazine)|CAN Country]]<br /> |-<br /> | 1995<br /> ! scope=&quot;row&quot;| &quot;Little Lies, Big Trouble&quot;<br /> | 57<br /> | align=&quot;left&quot; rowspan=&quot;4&quot;| ''Thought I Was Dreaming''<br /> |-<br /> | rowspan=&quot;3&quot;| 1996<br /> ! scope=&quot;row&quot;| &quot;[[Thought I Was Dreaming]]&quot;<br /> | 8<br /> |-<br /> ! scope=&quot;row&quot;| &quot;A Fine Line&quot;<br /> | 20<br /> |-<br /> ! scope=&quot;row&quot;| &quot;The Heartache&quot;<br /> | 46<br /> |}<br /> <br /> ===Music videos===<br /> {| class=&quot;wikitable plainrowheaders&quot;<br /> ! Year<br /> ! style=&quot;width:14em;&quot;| Title<br /> ! Director<br /> |-<br /> | 1995<br /> ! scope=&quot;row&quot;| &quot;Little Lies, Big Trouble&quot;<br /> | <br /> |-<br /> | rowspan=&quot;2&quot;| 1996<br /> ! scope=&quot;row&quot;| &quot;Thought I Was Dreaming&quot;<br /> | Paul Fox<br /> |-<br /> ! 
scope=&quot;row&quot;| &quot;The Heartache&quot;<br /> | [[Margaret Malandruccolo]]<br /> |}<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> {{Authority control}}<br /> <br /> {{DEFAULTSORT:Wallace, Lawnie}}<br /> [[Category:Canadian women country singers]]<br /> [[Category:Canadian country singer-songwriters]]<br /> [[Category:Living people]]<br /> [[Category:MCA Records artists]]<br /> [[Category:1977 births]]<br /> [[Category:Musicians from Ontario]]<br /> [[Category:21st-century Canadian women singers]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Ken_Dryden&diff=1168748639 Ken Dryden 2023-08-04T19:21:19Z <p>205.189.94.9: /* Personal life */ Father Murray has a wiki entry; present tense!</p> <hr /> <div>{{Short description|Canadian ice hockey goaltender and politician}}<br /> {{Use Canadian English|date=January 2023}}<br /> {{Infobox officeholder<br /> | name = Ken Dryden<br /> | image = Ken Dryden 2011.jpg<br /> | caption = Dryden in 2011<br /> | honorific_prefix = [[The Honourable]]<br /> | honorific_suffix = [[King's Privy Council for Canada|PC]] [[Order of Canada|OC]]<br /> | office1 = [[Minister of Families, Children and Social Development|Minister of Social Development]]<br /> | term_start1 = July 20, 2004<br /> | term_end1 = February 5, 2006<br /> | predecessor1 = [[Liza Frulla]]<br /> | successor1 = [[Diane Finley]]<br /> | primeminister1 = [[Paul Martin]]<br /> | riding2 = [[York Centre]]<br /> | parliament2 = Canadian<br /> | term_start2 = June 28, 2004<br /> | term_end2 = May 1, 2011<br /> | predecessor2 = [[Art Eggleton]]<br /> | successor2 = [[Mark Adler (politician)|Mark Adler]]<br /> | birth_name = Kenneth Wayne Dryden<br /> | birth_date = {{birth date and age|1947|8|8}}<br /> | birth_place = [[Hamilton, Ontario|Hamilton]], [[Ontario]], Canada<br /> | profession = {{hlist|Athlete|lawyer|teacher|writer|politician|sports commentator|businessperson}}<br /> | alma_mater = [[Cornell University]] ([[Bachelor of Arts|BA]])&lt;br /&gt;[[McGill University]] ([[Bachelor of Laws|LLB]])<br /> | party = [[Liberal Party of Canada|Liberal]]<br /> | religion = <br /> | residence = <br /> | footnotes = <br /> | spouse = Lynda Dryden<br /> | module = {{Infobox ice hockey player<br /> | embed = yes<br /> | halloffame = 1983<br /> | image_size = <br /> | height_ft = 6<br /> | height_in = 4<br /> | weight_lb = 205<br /> | position = [[Goaltender]]<br /> | catches = Left<br /> | played_for = [[Montreal Canadiens]]<br /> | ntl_team = CAN<br /> | draft = 14th overall<br /> | draft_year = 1964<br /> | draft_team = [[Boston Bruins]]<br /> | career_start = 1970<br /> | career_end = 1979<br /> }}<br /> }}<br /> <br /> '''Kenneth Wayne Dryden''' {{Post-nominals|country=CAN|PC|OC}} (born August 8, 1947) is a [[Canadians|Canadian]] politician, lawyer, businessman, author, and former [[National Hockey League]] (NHL) [[goaltender]] and executive. He is an [[Officer of the Order of Canada]]&lt;ref&gt;{{cite web |title=Appointments to the Order of Canada |url=http://www.gg.ca/document.aspx?id=14904&amp;lan=eng |publisher=Governor General of Canada |access-date=December 31, 2012}}&lt;/ref&gt; and a member of the [[Hockey Hall of Fame]]. He was a Liberal [[Member of Parliament (Canada)|Member of Parliament]] from 2004 to 2011 and [[Minister of Families, Children and Social Development|Minister of Social Development]] from 2004 to 2006. 
In 2017, the league counted him in history's [[100 Greatest NHL Players]].&lt;ref&gt;{{cite web|title=100 Greatest NHL Players|url=https://www.nhl.com/fans/nhl-centennial/100-greatest-nhl-players|website=NHL.com|access-date=January 27, 2017|date=January 27, 2017}}&lt;/ref&gt;&lt;ref name=&quot;:0&quot;&gt;{{Citation|last=NHL|title=Ken Dryden won Conn Smythe before he won Calder|date=2017-03-22|url=https://www.youtube.com/watch?v=UCIEwsLagTM&amp;list=PL1NbHSfosBuHEp2Bphcgz16OKz0kjnCH6&amp;index=50 |archive-url=https://ghostarchive.org/varchive/youtube/20211212/UCIEwsLagTM| archive-date=2021-12-12 |url-status=live|access-date=2017-04-25}}{{cbignore}}&lt;/ref&gt; He received the [[Order of Hockey in Canada]] in 2020.&lt;ref name=&quot;2020-recipients&quot;&gt;{{cite web|url=https://www.hockeycanada.ca/en-ca/news/2020-oohic-class-of-2020-named-to-order|title=Class of 2020 unveiled for Order of Hockey in Canada|date=2020-02-11|website=Hockey Canada|access-date=2020-02-11}}&lt;/ref&gt;<br /> <br /> ==Early life and education==<br /> Dryden was born in [[Hamilton, Ontario]], in 1947.&lt;ref name=CHATLAS&gt;{{cite book |title=The Canadian Hockey Atlas |url=https://archive.org/details/canadianhockeyat0000cole |url-access=registration |first=Stephen |last=Cole |publisher=Doubleday Canada |year=2006 |isbn=978-0-385-66093-8 }}&lt;/ref&gt; His parents were [[Murray Dryden]] (1911–2004) and Margaret Adelia Campbell (1912-1985). He has a sister, Judy, and a brother, [[Dave Dryden|Dave]], who was also an NHL goaltender. Dryden was raised in [[Islington-City Centre West|Islington]], [[Ontario]], then a suburb of Toronto. He played with the [[Etobicoke Indians]] of the [[Metro Junior B Hockey League]] as well as [[Humber Valley Packers]] of the [[Greater Toronto Hockey League|Metro Toronto Hockey League]].<br /> <br /> Dryden was drafted fourteenth overall by the [[Boston Bruins]] in the [[1964 NHL Amateur Draft]]. Days later, June 28,&lt;ref&gt;{{cite web|title=Trader Sam's Greatest Trades|url=http://www.habsworld.net/article.php?id=1472|website=HabsWorld| date=15 August 2007 |access-date=February 28, 2015}}&lt;/ref&gt; Boston traded Dryden, along with [[Alex Campbell (ice hockey)|Alex Campbell]], to the [[Montreal Canadiens]] for Paul Reid and Guy Allen. Dryden was told by his agent that he had been drafted by the Canadiens and did not find out until the mid-1970s that he had been drafted by the Bruins.&lt;ref&gt;{{cite web |url=http://www.habseyesontheprize.com/2009/8/21/997368/habs-robbed-bruins-of-dryden |title=Canadiens blog English translation of Canoe article |publisher=Sportsblog Inc. |date=August 21, 2009 |archive-url=https://web.archive.org/web/20090825030756/http://www.habseyesontheprize.com/2009/8/21/997368/habs-robbed-bruins-of-dryden |archive-date=August 25, 2009}}&lt;/ref&gt;<br /> <br /> Rather than play for the Canadiens in 1964, Dryden pursued a [[Bachelor of Arts|B.A.]] degree in [[Cornell University Department of History|History]] at [[Cornell University]], where he also played hockey until his graduation in 1969. He backstopped the [[Cornell Big Red men's ice hockey|Cornell Big Red]] to the 1967 [[National Collegiate Athletic Association]] championship and to three consecutive [[ECAC Hockey League|ECAC]] tournament championships, and won 76 of his 81 varsity starts. 
At Cornell, he was a member of the [[Quill and Dagger]] society.&lt;ref name='sun5-68'&gt;''The Cornell Daily Sun'', 9 May 1968&lt;/ref&gt; He also was a member of the Canadian amateur national team at the [[1969 World Ice Hockey Championships]] tournament in [[Stockholm]].<br /> <br /> Dryden took a break from the NHL for the 1973–74 season to [[Articled clerk|article]] for a Toronto law firm, and to complete the [[Bachelor of Laws|LL.B.]] degree he received from [[McGill University]] in 1973.&lt;ref&gt;{{cite news |title=Dryden Quits Hockey for Law Clerk Job |url=https://www.nytimes.com/1973/09/15/archives/dryden-quits-hockey-for-law-clerk-job.html |newspaper=[[The New York Times]] |access-date=22 July 2018 |language=en |date=15 September 1973}}&lt;/ref&gt;&lt;ref&gt;[https://www.mcgill.ca/about/alumni Notable alumni] - website of [[McGill University]]&lt;/ref&gt; During this time Dryden interned with [[Ralph Nader|Ralph Nader's]] [[Public Citizen]] organization. Inspired by Nader's call in [[Action for a Change]] for establishing [[Public Interest Research Group]]s, Dryden tried to establish the [[Ontario Public Interest Research Group]] in the Province of Ontario.&lt;ref&gt;{{cite magazine |title=PIRG Power |url=http://qpirgmcgill.org/wp-content/uploads/2014/07/PIRG-Power-.pdf |author1=[[Karen Farbridge]] |author2=Peter Cameron |magazine=[[A\J: Alternatives Journal|Alternatives Journal]]|date=Summer 1998}}&lt;/ref&gt;<br /> <br /> Dryden's jersey number 1 was retired by the [[Cornell Big Red]] on February 25, 2010; along with [[Joe Nieuwendyk]], he is one of only two players to have their numbers retired by Cornell's hockey program.&lt;ref&gt;{{cite web |url=https://news.cornell.edu/stories/2010/02/dryden-nieuwendyks-hockey-numbers-be-retired |title=Big Red to retire Dryden, Nieuwendyk's hockey numbers |first=Kevin |last=Zeisse |publisher=Cornell Chronicle |date=February 25, 2010 |access-date=May 2, 2023}}&lt;/ref&gt;<br /> <br /> ==Playing career==<br /> Dryden made his NHL debut on Sunday, March 14, 1971, against the Pittsburgh Penguins in Pittsburgh. The Canadiens won the game 5-1, with the only goal against Dryden scored by John Stewart. Later, on March 20, 1971, he played in a home game against his brother [[Dave Dryden]], a fellow backup goaltender for the [[Buffalo Sabres]], when Canadiens starter [[Rogie Vachon]] suffered an injury;&lt;ref&gt;{{Citation|last=SwissHabs|title=Legends of hockey : Ken Dryden|date=2012-04-27|url=https://www.youtube.com/watch?v=dZFOmrHbLw4|access-date=2019-03-27}}&lt;/ref&gt; this still stands, as of 2021, as the only time a pair of brothers faced each other as goaltenders. He was called up from the minors late in the season and played only six regular-season games, but posted a 1.65 goals-against average. This earned him the starting goaltending job for the playoffs ahead of veteran [[Rogie Vachon]], and he helped the Canadiens to win the Stanley Cup. He also won the [[Conn Smythe Trophy]] as the most valuable player in the playoffs. He helped the Habs win five more [[Stanley Cup]]s in 1973, 1976, 1977, 1978, and 1979.<br /> <br /> The following year Dryden won the [[Calder Memorial Trophy|Calder Trophy]] as the rookie of the year; he was not eligible for it the previous year because he did not play enough regular-season games. 
He is the only player to win the Conn Smythe Trophy before winning the rookie of the year award, and the only goaltender to win both the Conn Smythe and the Stanley Cup before losing a regular-season game.&lt;ref name=&quot;:0&quot; /&gt; In the autumn of 1972 Dryden played for Team Canada in the [[1972 Summit Series]] against the [[Soviet national ice hockey team]].<br /> <br /> Dryden played from 1971 to 1979, with a break during the entire [[1973–74 NHL season|1973–74 season]]; he was unhappy with the contract that the Canadiens offered him, which he considered less than his market worth, given that he had won the Stanley Cup and Vezina Trophy. He announced on September 14, 1973, that he was joining the Toronto law firm of Osler, Hoskin and Harcourt as a legal clerk for the year, for $135 a week. He skipped training camp and held out that season. The Canadiens still had a good year, going 45-24-9, but lost in the first round of the playoffs to the [[New York Rangers]] in six games. The Canadiens allowed 56 more goals in the 1973–74 season than they had the year before with Dryden.&lt;ref name=&quot;athletics.mcgill.ca&quot;&gt;{{cite web |url=http://www.athletics.mcgill.ca/alumni_news_details.ch2?article_id=454 |title=McGill Athletics |access-date=2008-09-28 |url-status=dead |archive-url=https://web.archive.org/web/20080515141224/http://www.athletics.mcgill.ca/alumni_news_details.ch2?article_id=454 |archive-date=2008-05-15 }}&lt;/ref&gt; Dryden used that year to fulfill the requirements for his law degree at [[McGill University|McGill]] and article for a law firm. He retired for the last time on July 9, 1979.&lt;ref&gt;{{cite news |title=From the archives: Dryden announces retirement |url=https://www.theglobeandmail.com/sports/from-the-archives-dryden-announces-retirement/article1069554/ |access-date=4 August 2020 |work=Globe &amp; Mail |date=10 July 1979}}&lt;/ref&gt;<br /> <br /> Compared to those of most other great hockey players, Dryden's NHL career was very short: just over seven full seasons. Thus he did not amass record totals in most statistical categories. As he played all his years with a dynasty and retired before he passed his prime, his statistical percentages are unparalleled. His regular-season totals include a 74.3 winning percentage, a 2.24 goals-against average, a 92.2 save percentage, 46 shutouts, and 258 wins, only 57 losses and 74 ties in just 397 NHL games. He won the [[Vezina Trophy]] five times as the goaltender on the team who allowed the fewest goals and in the same years was selected as a First Team All-Star. In 1998, he was ranked number 25 on ''[[The Hockey News]]''' list of the 100 Greatest Hockey Players, a remarkable achievement for a player with a comparatively brief career.<br /> <br /> At 6 feet, 4 inches, Dryden was so tall that during stoppages in play he struck what became his trademark pose: leaning upon his stick. He was known as the &quot;four-storey goalie,&quot; and was once referred to as &quot;that thieving giraffe&quot; by Boston Bruins superstar [[Phil Esposito]], in reference to Dryden's skill and height. Unbeknownst to him, his pose was exactly the same as the one struck by fellow Canadiens goaltender, [[Georges Vézina]], 60 years prior.&lt;ref&gt;{{Citation|last=SwissHabs|title=Legends of hockey : Ken Dryden|date=2012-04-27|url=https://www.youtube.com/watch?v=dZFOmrHbLw4&amp;t=18m9s|access-date=2019-03-27}}&lt;/ref&gt;<br /> <br /> Dryden was inducted into the [[Hockey Hall of Fame]] in 1983, as soon as he was eligible. 
His jersey number 29 was retired by the Canadiens on January 29, 2007. He was inducted into the [[Ontario Sports Hall of Fame]] in 2011.&lt;ref&gt;{{cite web |title=Ken Dryden |url=http://oshof.ca/index.php/honoured-members/item/13-ken-dryden |website=oshof.ca |publisher=[[Ontario Sports Hall of Fame]] |date=2011 |access-date=2018-04-10 |archive-date=2018-07-23 |archive-url=https://web.archive.org/web/20180723034300/https://oshof.ca/index.php/honoured-members/item/13-ken-dryden |url-status=dead }}&lt;/ref&gt;<br /> <br /> ==Post playing==<br /> <br /> ===Writing===<br /> Dryden wrote one book during his hockey career: ''Face-Off at the Summit''. It was a diary about Team Canada in the [[Summit Series|Canada vs. Soviet Union series of 1972]]. The book has been out of print for many years.<br /> <br /> After retiring from hockey, Dryden wrote several more books. ''[[The Game (Ken Dryden)|The Game]]'' was a commercial and critical success, and was nominated for a [[1983 Governor General's Awards|Governor General's Award]] in 1983. His next book, ''Home Game: Hockey and Life in Canada'' (1990), written with [[Roy MacGregor]], was developed into an award-winning [[Canadian Broadcasting Corporation]] six-part documentary series for television. His fourth book was ''The Moved and the Shaken: The Story of One Man's Life'' (1993). His fifth book, ''In School: Our Kids, Our Teachers, Our Classrooms'' (1995), written with Roy MacGregor, was about Canada’s educational system. ''Becoming Canada'' (2010) argued for a new definition of Canada and its unique place in the world.<br /> <br /> In 2019, he published ''Scotty: A Hockey Life Like No Other'', his biography of his former Canadiens coach [[Scotty Bowman]]. Dryden says at the beginning that he ''needed'' to write the book, because &quot;Scotty had lived a truly unique life. He has experienced almost ''everything'' in hockey, up close, for the best part of a century – and his is a life that no one else will live again. It's a life that had to be captured. And it needs to be captured now, because time is moving on.&quot;&lt;ref name=&quot;A Hockey Life like no Other&quot;&gt;{{cite book |last=Dryden |first= Ken |author-link=Ken Dryden|date=2019 |title=[[The Game (Dryden book)|The Game]]|publisher=[[Penguin Random House]] |isbn=978-0-7710-2750-5}}&lt;/ref&gt;<br /> <br /> Feeling that Bowman was &quot;too practical and focused&quot; to be a natural storyteller, Dryden instead asked him to think like a coach: to select the eight greatest teams of all time (but only one per dynasty) and to explain what he thought of them, how he would coach against them, and what was happening in his own life at the time. Through that process, Bowman's story would be told.&lt;ref name=&quot;A Hockey Life like no Other&quot;/&gt;<br /> <br /> ===Commentator===<br /> Dryden worked as a television hockey commentator at the [[1980 Winter Olympics|1980]], [[1984 Winter Olympics|1984]] and [[1988 Winter Olympics]]. He served as a colour commentator with play-by-play announcer [[Al Michaels]] for [[ABC Olympic broadcasts|ABC]]'s coverage of the &quot;[[Miracle on Ice]].&quot; Immediately before [[Mike Eruzione]]'s game-winning goal for the US, Dryden expressed his concern that the team was &quot;depending a little bit too much&quot; on goaltender [[Jim Craig (ice hockey)|Jim Craig]] after Craig had just made &quot;too many good saves.&quot;<br /> <br /> ===Sports executive===<br /> In 1997, Dryden was hired as president of the [[Toronto Maple Leafs]] by minority owner [[Larry Tanenbaum]].
[[Pat Quinn (ice hockey)|Pat Quinn]] became head coach in 1998, and there were reports that the two men had a frosty relationship. A few months after joining the Leafs, Quinn also became general manager, a move some believed was made to prevent Dryden from hiring former Canadiens teammate [[Bob Gainey]].&lt;ref name=&quot;athletics.mcgill.ca&quot;/&gt;<br /> <br /> Dryden spoke at the [[Open Ice Summit]] in 1999 to discuss improvements needed in ice hockey in Canada. He wanted delegates to accept that progress made at the lower levels and off the ice was important in achieving international results.&lt;ref&gt;{{cite book|last=McKinley|first=Michael|title=It's Our Game: Celebrating 100 Years Of Hockey Canada|publisher=Viking|date=2014 |location=Toronto, Ontario|isbn=978-0-670-06817-3|pages=314–316}}&lt;/ref&gt; He cautioned that change would come slowly and be costly, but felt the summit was an important step in making progress.&lt;ref&gt;{{cite news|title=11 ideas from Open summit|last=Colbourn|first=Glen|date=August 28, 1999|newspaper=Medicine Hat News|location=Medicine Hat, Alberta|page=9 |url=https://newspaperarchive.com/sports-clipping-aug-28-1999-1717170/}}{{free access}}&lt;/ref&gt; He also urged an end to the persistent abuse of [[Official (ice hockey)|on-ice officials]], warning that otherwise Canada would lose 10,000 referees each year. As a result of the summit, Hockey Canada began education efforts on the importance of respect for game officials.&lt;ref&gt;{{cite news|title=CHA wants fans to layoff referees|last=Beacon|first=Bill|date=November 5, 1999|newspaper=Winnipeg Free Press|location=Winnipeg, Manitoba|page=50|url=https://newspaperarchive.com/sports-clipping-nov-05-1999-1717195/}}{{free access}}&lt;/ref&gt;<br /> <br /> On August 29, 2003, with the hiring of [[John Ferguson, Jr.]] as general manager, there was a major management shakeup. Majority owner [[Steve Stavro]] was bought out by the [[Ontario Teachers' Pension Plan]] and stepped down as chairman in favour of [[Larry Tanenbaum]]. Quinn continued as head coach. Dryden's position was abolished in favour of having both the Leafs' and [[Toronto Raptors|Raptors]]' managers reporting directly to [[Maple Leaf Sports &amp; Entertainment Ltd.|MLSE]] President and CEO [[Richard Peddie]]. Dryden was shuffled to the less important role of vice-chairman and given a spot on MLSE's board of directors. This was described by commentators as &quot;sitting outside the loop&quot;, as Dryden did not report directly to Leafs ownership.&lt;ref name=&quot;athletics.mcgill.ca&quot;/&gt;&lt;ref&gt;{{cite web |url=http://www.athletics.mcgill.ca/alumni_news_details.ch2?article_id=453 |title=McGill Athletics |access-date=2006-12-05 |url-status=dead |archive-url=https://web.archive.org/web/20060208124054/http://www.athletics.mcgill.ca/alumni_news_details.ch2?article_id=453 |archive-date=2006-02-08 }}&lt;/ref&gt; He stayed on until 2004, when he resigned to enter politics.<br /> <br /> ===Teaching===<br /> In January 2012, Dryden was appointed a &quot;Special Visitor&quot; at his alma mater [[McGill University]]'s Institute for the Study of Canada.
He taught a [[Canadian Studies]] course entitled &quot;Thinking the Future to Make the Future,&quot; which focused on issues facing Canada in the future and possible solutions to them.&lt;ref&gt;{{Cite news|url=https://reporter.mcgill.ca/q-a-ken-dryden-thinks-the-future/|title=Q &amp; A: Ken Dryden thinks the future|date=January 17, 2012|work=McGill Reporter|access-date = May 2, 2023|first=Cynthia|last=Lee}}&lt;/ref&gt;<br /> <br /> ==Political career==<br /> Dryden joined the [[Liberal Party of Canada]] and ran for the [[House of Commons of Canada|House of Commons]] in the [[2004 Canadian federal election|2004 federal election]]. He was selected by party leader and [[Prime Minister of Canada|Prime Minister]] Paul Martin as a &quot;[[star candidate]]&quot; in the Toronto [[Electoral district (Canada)|riding]] of [[York Centre]], then considered a [[safe seat|safe]] Liberal riding.&lt;ref name=&quot;The Star&quot;&gt;{{cite news| url=https://www.thestar.com/news/canada/2011/05/03/dryden_goes_down_to_defeat.html |title=Dryden goes down to defeat |first=Paul |last=Moloney |newspaper=The Toronto Star}}&lt;/ref&gt;<br /> <br /> Dryden was elected by a margin of over 11,000 votes.&lt;ref name=&quot;2004 results&quot;&gt;{{cite news |title=Election results...riding by riding |newspaper=The Globe and Mail |date=June 29, 2004 |page=A14}}&lt;/ref&gt; He was named to [[Cabinet of Canada|Cabinet]] as [[Minister of Social Development (Canada)|Minister of Social Development]].&lt;ref&gt;{{cite news |title=Who does what in the new federal cabinet |newspaper=The Hamilton Spectator |date=July 21, 2004 |page=A10}}&lt;/ref&gt; He made headlines on February 16, 2005, as the target of a remark by [[Conservative Party of Canada|Conservative]] [[Member of Parliament (Canada)|Member of Parliament]] [[Rona Ambrose]] who said about Dryden, &quot;working women want to make their own choices, we don't need old white guys telling us what to do.&quot; Ambrose made the remarks after Dryden commented on a poll that analyzed child care choices by Canadian families.&lt;ref&gt;{{cite news |title=A Verbal Slapshot; MP tells child-care minister Ken Dryden: 'We don't need old white guys telling us what to do' |last=Dugas |first=Dan |newspaper=The Hamilton Spectator |date=February 16, 2005 |page=A10}}&lt;/ref&gt; Dryden won generally favourable reviews for his performance in Cabinet.<br /> <br /> Dryden was re-elected in the [[2006 Canadian federal election|2006 federal election]], while the Liberals were defeated and [[Paul Martin]] resigned the party leadership.&lt;ref name=&quot;2006 results&quot;&gt;{{cite news |title=Election results...riding by riding |newspaper=The Globe and Mail |date=January 24, 2006 |page=A16}}&lt;/ref&gt; Interim party and opposition leader [[Bill Graham (Canadian politician)|Bill Graham]] named Dryden to his shadow cabinet as health critic.&lt;ref&gt;{{cite news |title=Six Liberals named to shadow cabinet |last=O'Neill |first=Juliet |newspaper=The Vancouver Sun |date=February 23, 2006 |page=A6}}&lt;/ref&gt;<br /> <br /> Dryden's margin of victory in [[York Centre]] dwindled in the 2006 and 2008 elections.&lt;ref&gt;{{cite news |title=Hockey legend Ken Dryden loses bid for fourth term |url=http://toronto.ctvnews.ca/hockey-legend-ken-dryden-loses-bid-for-fourth-term-1.638553 |publisher=CTV News |date=May 2, 2011}}&lt;/ref&gt; In the 2011 federal election, he focused his efforts on his own re-election instead of campaigning for other candidates as he did in the past, and he received a visit from former 
Prime Minister [[Jean Chrétien]]. Still, Dryden lost his seat to Conservative candidate [[Mark Adler (politician)|Mark Adler]] by nearly 6,000 votes.&lt;ref name=&quot;The Star&quot;/&gt;&lt;ref&gt;{{cite news |title=Israel a key election issue in York Centre |url=http://www.cbc.ca/news/canada/toronto/story/2011/04/25/cv-election-york-centre.html |publisher=CBC News |date=April 25, 2011}}&lt;/ref&gt;<br /> <br /> ===Leadership bid===<br /> <br /> On April 28, 2006, Dryden announced that he would run for the leadership of the [[Liberal Party of Canada]], which would be choosing a successor to Paul Martin at a [[2006 Liberal Party of Canada leadership election|convention in Montreal on December 2, 2006]].&lt;ref&gt;{{cite news|url=http://www.cbc.ca/news/canada/and-then-there-were-10-ken-dryden-is-in-1.587809|work=CBC News|title=And then there were 10 ... Ken Dryden is in|date=April 28, 2006}}&lt;/ref&gt;<br /> <br /> A poll&lt;ref&gt;[http://www.gandalfgroup.ca/downloads/Liberal_Leadership_The_Publics_Choice.pdf September 2006 poll]&lt;/ref&gt; found that Dryden's potential pool of support exceeded that of his opponents, due mainly to his former NHL career. His fundraising fell well below that of top leadership contenders ([[Michael Ignatieff]], [[Gerard Kennedy]], [[Stéphane Dion]] and [[Bob Rae]]). A variety of media pundits criticized his ponderous speaking style and limited French. Supporters argued that few people were strongly opposed to him and that if he ran he could attract more support on later ballots as a consensus candidate.<br /> <br /> At the convention, Dryden came in fifth place on the first ballot with 238 delegates, 4.9% of the vote. On the second ballot, he came in last place with 219 votes (4.7%) and was eliminated. He initially threw his support to Bob Rae, but after Rae was eliminated in the third ballot and released all of his delegates, Dryden endorsed Stéphane Dion, who went on to win the leadership.<br /> <br /> According to Elections Canada filings, as of 2013 Dryden's campaign still owed $225,000.&lt;ref&gt;{{cite news| url=https://www.theglobeandmail.com/news/politics/liberal-leadership-candidates-with-outstanding-loans-wont-be-taken-to-court/article13496625/ | location=Toronto | work=The Globe and Mail | title=Liberal leadership candidates remain off the hook for outstanding debts | date=July 30, 2013}}&lt;/ref&gt;<br /> <br /> ==Personal life==<br /> Ken is the son of Margaret and [[Murray Dryden]]. <br /> Dryden and his wife Lynda have two children and four grandchildren.&lt;ref&gt;{{cite web|url=https://penguinrandomhouse.ca/authors/7480/ken-dryden|title=Ken Dryden|publisher=Penguin Random House Canada|access-date=November 30, 2017}}&lt;/ref&gt; He is a first cousin, twice removed, of [[Murray Murdoch]], another former NHL player and a longtime coach of [[Yale University]]'s hockey team. There is another distant relationship with the [[Syl Apps|Apps]] family. 
<br /> His older brother Dave also played in the NHL and the WHA as a goalie, from 1961 to 1980.<br /> <br /> ==Bibliography==<br /> <br /> ===Non-fiction===<br /> *''Face-Off at the Summit'' (1973)<br /> *''[[The Game (Ken Dryden)|The Game]]'' (1983)<br /> *''Home Game: Hockey and Life in Canada'' (with Roy MacGregor, 1990)<br /> *''In School: Our Kids, Our Teachers, Our Classrooms'' (with Roy MacGregor, 1995)<br /> *''[[The Moved and the Shaken]]'' (1993)<br /> *''[[Becoming Canada]]'' (2010)<br /> *''[[Game Change (Ken Dryden)|Game Change]]'' (2017)&lt;ref&gt;[https://beta.theglobeandmail.com/arts/books-and-media/book-reviews/review-ken-drydens-game-change-is-a-deep-piece-of-investigative-journalism/article36674963 Review: Ken Dryden’s Game Change is a deep piece of investigative journalism] The Globe and Mail, 20 October 2017&lt;/ref&gt;<br /> *''Scotty: A Hockey Life Like No Other'' (2019)<br /> <br /> ==Awards and honors==<br /> Dryden's hockey awards and honours are numerous and include:<br /> {| class=&quot;wikitable&quot;<br /> ! Award<br /> ! Year<br /> ! Remark<br /> |-<br /> | All-[[ECAC Hockey|ECAC]] [[List of All-ECAC Hockey Teams#First Team|First Team]]<br /> | [[1966–67 NCAA Division I men's ice hockey season|1966–67]], [[1967–68 NCAA Division I men's ice hockey season|1967–68]], [[1968–69 NCAA Division I men's ice hockey season|1968–69]]<br /> | <br /> |-<br /> | [[American Hockey Coaches Association|AHCA]] [[List of Division I AHCA All-American Teams|East All-American]]<br /> | [[1966–67 NCAA Division I men's ice hockey season|1966–67]], [[1967–68 NCAA Division I men's ice hockey season|1967–68]], [[1968–69 NCAA Division I men's ice hockey season|1968–69]]<br /> | <br /> |-<br /> | [[ECAC Hockey]] [[List of ECAC Hockey All-Tournament Team|All-Tournament First Team]]<br /> | [[1967 ECAC Hockey Men's Ice Hockey Tournament|1967]], [[1968 ECAC Hockey Men's Ice Hockey Tournament|1968]], [[1969 ECAC Hockey Men's Ice Hockey Tournament|1969]]<br /> | <br /> |-<br /> | [[NCAA Men's Ice Hockey Championship|NCAA]] [[List of NCAA Division I Men's Ice Hockey All-Tournament Teams|All-Tournament First Team]]<br /> | [[1967 NCAA Division I Men's Ice Hockey Tournament|1967]]<br /> | &lt;ref name = ncaa&gt;{{cite news|title=NCAA Frozen Four Records|url=http://fs.ncaa.org/Docs/stats/frozen_4/2009/f4recs.pdf|publisher=NCAA.org|access-date=2013-06-19}}&lt;/ref&gt;<br /> |-<br /> | [[NCAA Men's Ice Hockey Championship|NCAA]] [[List of NCAA Division I Men's Ice Hockey All-Tournament Teams|All-Tournament Second Team]]<br /> | [[1968 NCAA Division I Men's Ice Hockey Tournament|1968]], [[1969 NCAA Division I Men's Ice Hockey Tournament|1969]]<br /> | &lt;ref name=&quot;ncaa&quot; /&gt;<br /> |-<br /> | [[Conn Smythe Trophy]] winner<br /> | [[1970-71 NHL season|1971]]<br /> | <br /> |-<br /> | [[Calder Memorial Trophy]] winner<br /> | [[1971-72 NHL season|1972]]<br /> | <br /> |-<br /> | [[Vezina Trophy]] winner<br /> | [[1972-73 NHL season|1973]], [[1975-76 NHL season|1976]], [[1976-77 NHL season|1977]]*, [[1977-78 NHL season|1978]]*, [[1978-79 NHL season|1979]]*<br /> | &lt;small&gt;&lt;nowiki&gt;* Shared with &lt;/nowiki&gt;[[Michel Larocque (ice hockey, born 1952)|Michel Larocque]].&lt;/small&gt;<br /> |-<br /> | [[Stanley Cup]] champion<br /> | [[1971 Stanley Cup Finals|1971]], [[1973 Stanley Cup Finals|1973]], [[1976 Stanley Cup Finals|1976]], [[1977 Stanley Cup Finals|1977]], [[1978 Stanley Cup Finals|1978]], [[1979 Stanley Cup Finals|1979]]<br /> | <br /> |-<br /> | Playing [[NHL All-Star 
Game]]s.<br /> | 1972, [[28th National Hockey League All-Star Game|1975]], [[29th National Hockey League All-Star Game|1976]], [[30th National Hockey League All-Star Game|1977]], [[31st National Hockey League All-Star Game|1978]]<br /> | <br /> |-<br /> | Selected to [[NHL First All-Star Team]]<br /> | 1973, 1976, 1977, 1978, 1979.<br /> |<br /> |-<br /> | Selected to [[NHL Second All-Star Team]]<br /> | 1972<br /> | <br /> |-<br /> | Inducted into the [[Hockey Hall of Fame]]<br /> | 1983<br /> | <br /> |-<br /> | Number 25 on ''The Hockey News''' list of the 100 Greatest Hockey Players<br /> | 1998<br /> | <br /> |-<br /> | Number 29 was retired by the [[Montreal Canadiens]] <br /> | January 29, 2007<br /> | <br /> |-<br /> | His number 1 was retired by the [[Cornell Big Red]]<br /> | February 25, 2010<br /> | &lt;small&gt;One of only two players to have his number retired by the Cornell hockey program;&lt;br&gt;the other being [[Joe Nieuwendyk]].&lt;/small&gt;<br /> |-<br /> | Recipient of the [[Order of Hockey in Canada]]<br /> | 2020<br /> | &lt;ref name=&quot;2020-recipients&quot; /&gt;<br /> |-<br /> |}<br /> <br /> Dryden does not have a substantive doctorate, but has received [[honorary doctoral degree]]s from several universities,&lt;ref&gt;[https://www.collectionscanada.gc.ca/eppp-archive/100/205/300/liberal-ef/06-01-30/www.liberal.ca/bio_e.aspx@&amp;id=35103 Biography: Ken Dryden] - website of the [[Library and Archives Canada]] of the [[Government of Canada]]&lt;/ref&gt; including:<br /> <br /> {| class=&quot;wikitable&quot;<br /> ! Honorary degree<br /> ! University<br /> ! Year<br /> ! Remark<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[University of British Columbia]]<br /> | 1992<br /> | &lt;ref&gt;[https://www.library.ubc.ca/archives/hdcites/hdcites10.html The Title and Degree of Doctor of Laws (honoris causa) conferred at congregation, May 26, 1992] - website of the [[University of British Columbia]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[York University]]<br /> | 1996<br /> | &lt;ref&gt;[https://www.yorku.ca/secretariat/senate/sub-committee-on-honorary-degrees-and-ceremonials/honorary-degree-recipients/ Honorary Degree Recipients] - website of the [[York University]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[University of Windsor]]<br /> | 1997<br /> | &lt;ref&gt;[https://www.uwindsor.ca/secretariat/sites/uwindsor.ca.secretariat/files/honorary_degree_by_convocation_1.pdf Honorary degrees conferred (Chronological)] - website of the [[University of Windsor]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Doctor of the University|D.Univ.]] degree<br /> | [[University of Ottawa]]<br /> | 2000<br /> | &lt;ref&gt;[https://www.uottawa.ca/president/bio/dryden-ken Office of the president: Honorary Doctorates - Ken Dryden] - website of the [[University of Ottawa]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[Toronto Metropolitan University|Ryerson University]]<br /> | 2013<br /> | &lt;ref&gt;[https://www.ryerson.ca/calendar/2021-2022/about-ryerson/honorary-doctorates/ Ryerson Honorary Doctorates and Fellowships] - website of the [[Toronto Metropolitan University|Ryerson University]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[McGill University]]<br /> | 2018<br /> | &lt;ref&gt;[https://www.mcgill.ca/newsroom/channels/news/naomi-azrieli-ken-dryden-receive-honorary-degrees-291005 Naomi Azrieli, Ken Dryden to receive honorary 
degrees News] - website of the [[McGill University]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[University of Winnipeg]]<br /> | 2018<br /> | &lt;ref&gt;[https://www.uwinnipeg.ca/awards-distinctions/honorary-doctorate/dryden.html Honorary Doctorate: Ken Dryden] - website of the [[University of Winnipeg]]&lt;/ref&gt;<br /> |}<br /> <br /> ==Career statistics==<br /> ===Regular season and playoffs===<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; style=&quot;width:95%; text-align:center;&quot;<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! colspan=&quot;9&quot; bgcolor=&quot;#e0e0e0&quot; | [[Regular season]]<br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! colspan=&quot;8&quot; bgcolor=&quot;#e0e0e0&quot; | [[Playoffs]]<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! [[Season (sports)|Season]]<br /> ! Team<br /> ! League<br /> ! GP !! W !! L !! T !! MIN !! GA !! [[Shutout#Ice hockey|SO]] !! [[Goals against average|GAA]] !! [[save percentage|SV%]]<br /> ! GP !! W !! L !! MIN !! GA !! SO !! GAA !! SV%<br /> |-<br /> | 1963–64<br /> | Humber Valley Packers<br /> | MTHL<br /> | — || — || — || — || — || — || — || — || — <br /> | — || — || — || — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | 1964–65<br /> | Etobicoke Indians<br /> | [[Metro Junior A Hockey League|MetJHL]]<br /> | — || — || — || — || — || — || — || — || — <br /> | — || — || — || — || — || — || — || —<br /> |-<br /> | [[1966–67 NCAA University Division men's ice hockey season|1966–67]]<br /> | [[Cornell Big Red men's ice hockey|Cornell University]]<br /> | [[ECAC Hockey|ECAC]]<br /> | 27 || 26 || 0 || 1 || 1646 || 40 || 4 || 1.46 || .945<br /> | — || — || — || — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1967–68 NCAA University Division men's ice hockey season|1967–68]]<br /> | Cornell University<br /> | ECAC<br /> | 29 || 25 || 2 || 0 || 1620 || 41 || 6 || 1.52 || .938<br /> | — || — || — || — || — || — || — || —<br /> |-<br /> | [[1968–69 NCAA University Division men's ice hockey season|1968–69]]<br /> | Cornell University<br /> | ECAC<br /> | 27 || 25 || 2 || 0 || 1578 || 47 || 3 || 1.79 || .936<br /> | — || — || — || — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1970–71 AHL season|1970–71]]<br /> | [[Montreal Voyageurs]]<br /> | [[American Hockey League|AHL]]<br /> | 33 || 16 || 7 || 8 || 1899 || 84 || 3 || 2.68 || —<br /> | — || — || — || — || — || — || — || —<br /> |-<br /> | [[1970–71 NHL season|1970–71]]*<br /> | [[Montreal Canadiens]]<br /> | [[National Hockey League|NHL]]<br /> | 6 || 6 || 0 || 0 || 327 || 9 || 0 || 1.65 || .957<br /> | 20 || 12 || 8 || 1221 || 61 || 0 || 3.00 || .914<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1971–72 NHL season|1971–72]]<br /> | Montreal Canadiens<br /> | NHL<br /> | 64 || 39 || 8 || 15 || 3800 || 142 || 8 || 2.24 || .930<br /> | 6 || 2 || 4 || 360 || 17 || 0 || 2.83 || .911<br /> |-<br /> | [[1972–73 NHL season|1972–73]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 54 || 33 || 7 || 13 || 3165 || 119 || 6 || 2.26 || .926<br /> | 17 || 12 || 5 || 1039 || 50 || 1 || 2.89 || .908<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1974–75 NHL season|1974–75]]<br /> | Montreal Canadiens<br /> | NHL<br /> | 56 || 30 || 9 || 16 || 3320 || 149 || 4 || 2.69 || .906<br /> | 11 || 6 || 5 || 688 || 29 || 2 || 2.53 || .916<br 
/> |-<br /> | [[1975–76 NHL season|1975–76]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 62 || 42 || 10 || 8 || 3580 || 121 || 8 || 2.03 || .927<br /> | 13 || 12 || 1 || 780 || 25 || 1 || 1.92 || .929<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1976–77 NHL season|1976–77]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 56 || 41 || 6 || 8 || 3275 || 117 || 10 || 2.14 || .920<br /> | 14 || 12 || 2 || 849 || 22 || 4 || 1.55 || .932<br /> |-<br /> | [[1977–78 NHL season|1977–78]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 52 || 37 || 7 || 7 || 3071 || 105 || 5 || 2.05 || .921<br /> | 15 || 12 || 3 || 919 || 29 || 2 || 1.89 || .920<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1978–79 NHL season|1978–79]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 47 || 30 || 10 || 7 || 2814 || 108 || 5 || 2.30 || .909<br /> | 16 || 12 || 4 || 990 || 41 || 0 || 2.48 || .900<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; | NHL totals<br /> ! 397 !! 258 !! 57 !! 74 !! 23,330 !! 870 !! 46 !! 2.24 !! .922<br /> ! 112 !! 80 !! 32 !! 6,846 !! 274 !! 10 !! 2.40 !! .915<br /> |}<br /> &lt;nowiki&gt;*&lt;/nowiki&gt; [[Stanley Cup]] Champion.<br /> <br /> ===International===<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; ID=&quot;Table3&quot; style=&quot;text-align:center; width:40em&quot;<br /> |- ALIGN=&quot;center&quot; bgcolor=&quot;#e0e0e0&quot;<br /> ! Year<br /> ! Team<br /> ! Event<br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! GP<br /> ! W<br /> ! L<br /> ! T<br /> ! MIN<br /> ! GA<br /> ! SO<br /> ! GAA<br /> |-<br /> | [[1969 World Ice Hockey Championships|1969]]<br /> | [[Canada men's national ice hockey team|Canada]]<br /> | [[Ice Hockey World Championships|WC]]<br /> | 2<br /> | 1<br /> | 1<br /> | 0<br /> | 120<br /> | 4<br /> | 1<br /> | 2.00<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1972 Summit Series|1972]]<br /> | Canada<br /> | SS<br /> | 4<br /> | 2<br /> | 2<br /> | 0<br /> | 240<br /> | 19<br /> | 0<br /> | 4.75<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; | Senior totals<br /> ! 6<br /> ! 3<br /> ! 3<br /> ! 0<br /> ! 360<br /> ! 23<br /> ! 1<br /> ! 
3.83<br /> |}<br /> <br /> {{cite web |url=http://hockeygoalies.org/bio/drydenk.html|title = Dryden's stats |publisher=The Goaltender Home Page|access-date=2017-08-07}}<br /> <br /> ==References==<br /> {{reflist|2}}<br /> <br /> ==External links==<br /> *{{icehockeystats|legendsm=P198301}}<br /> * [http://hockeygoalies.org/bio/drydenk.html Ken Dryden biography] at [http://hockeygoalies.org hockeygoalies.org] - advanced statistics and game logs<br /> *{{IMDb name | id=1105047 | name=Ken Dryden}}<br /> *[https://web.archive.org/web/20060215194756/http://www.howdtheyvote.ca/member.php?id=93 How'd They Vote?: Ken Dryden's voting history and quotes]<br /> *{{Canadian Parliament links|ID=16970}}<br /> <br /> {{s-start}}<br /> {{Canadian federal ministry navigational box header |ministry=27}}<br /> {{ministry box cabinet posts<br /> | post1 = [[Minister of Social Development (Canada)|Minister of Social Development]]<br /> | post1years = 2004–2006<br /> | post1note = <br /> | post1preceded = [[Liza Frulla]]<br /> | post1followed = position abolished<br /> }}<br /> {{s-ach}}<br /> {{succession box | before = [[Doug Ferguson (ice hockey)|Doug Ferguson]]| title = [[List of ECAC Hockey Most Outstanding Player in Tournament|ECAC Hockey Most Outstanding Player in Tournament]]| years = [[1968 ECAC Hockey Men's Ice Hockey Tournament|1968]], [[1969 ECAC Hockey Men's Ice Hockey Tournament|1969]]| after = [[Bruce Bullock]]}}<br /> {{succession box | before = [[Wayne Small]]| title = [[List of ECAC Hockey Player of the Year|ECAC Hockey Player of the Year]]| years = [[1968–69 NCAA Division I men's ice hockey season|1968–69]]| after = [[Timothy Sheehy (ice hockey)|Tim Sheehy]]}}<br /> {{succession box | before = [[Gilbert Perreault]] | title = Winner of the [[Calder Memorial Trophy]] | years = 1972 | after = [[Steve Vickers (ice hockey)|Steve Vickers]] }}<br /> {{succession box | before = [[Bobby Orr]] | title = Winner of the [[Conn Smythe Trophy]] | years = 1971 | after = [[Bobby Orr]]}}<br /> {{succession box | before = [[Tony Esposito]] | title = Winner of the [[Vezina Trophy]] | years = [[1972–73 NHL season|1973]] | after = [[Tony Esposito]] and [[Bernie Parent]] ''(tied)''}}<br /> {{succession box | before = [[Bernie Parent]] | title = Winner of the [[Vezina Trophy]] &lt;br /&gt;''with [[Michel Larocque (ice hockey, born 1952)|Michel Larocque]] (1977, 1978, 1979)''| years = [[1975–76 NHL season|1976]], [[1976–77 NHL season|1977]], [[1977–78 NHL season|1978]], [[1978–79 NHL season|1979]] | after = [[Don Edwards (ice hockey)|Don Edwards]] and [[Bob Sauvé]]}}<br /> {{s-sports}}<br /> {{succession box | before = [[Bob Pulford]] | title = [[National Hockey League Players Association|NHLPA President]] | years = 1972–74 | after = [[Pit Martin]]}}<br /> {{succession box | before = [[Cliff Fletcher]] | title = [[List of Toronto Maple Leafs general managers|General Manager of the Toronto Maple Leafs]] | years = [[1997–98 NHL season|1997]]–[[1998–99 NHL season|99]] | after = [[Pat Quinn (ice hockey)|Pat Quinn]]}}<br /> {{s-end}}<br /> <br /> {{CA-Ministers of Labour}}<br /> <br /> {{Authority control}}<br /> <br /> {{DEFAULTSORT:Dryden, Ken}}<br /> [[Category:1947 births]]<br /> [[Category:Living people]]<br /> [[Category:Boston Bruins draft picks]]<br /> [[Category:Calder Trophy winners]]<br /> [[Category:Canadian ice hockey goaltenders]]<br /> [[Category:Canadian non-fiction writers]]<br /> [[Category:Canadian people of Scottish descent]]<br /> [[Category:Canadian sportsperson-politicians]]<br /> [[Category:Conn 
Smythe Trophy winners]]<br /> [[Category:Cornell Big Red men's ice hockey players]]<br /> [[Category:Hockey Hall of Fame inductees]]<br /> [[Category:Ice hockey people from Toronto]]<br /> [[Category:Lawyers in Ontario]]<br /> [[Category:Liberal Party of Canada leadership candidates]]<br /> [[Category:Liberal Party of Canada MPs]]<br /> [[Category:Members of the House of Commons of Canada from Ontario]]<br /> [[Category:Members of the King's Privy Council for Canada]]<br /> [[Category:Montreal Canadiens players]]<br /> [[Category:National Hockey League All-Stars]]<br /> [[Category:National Hockey League general managers]]<br /> [[Category:National Hockey League players with retired numbers]]<br /> [[Category:National Hockey League team presidents]]<br /> [[Category:Officers of the Order of Canada]]<br /> [[Category:Olympic Games broadcasters]]<br /> [[Category:Order of Hockey in Canada recipients]]<br /> [[Category:Politicians from Hamilton, Ontario]]<br /> [[Category:Politicians from Toronto]]<br /> [[Category:Ice hockey people from Hamilton, Ontario]]<br /> [[Category:Stanley Cup champions]]<br /> [[Category:Toronto Maple Leafs executives]]<br /> [[Category:Vezina Trophy winners]]<br /> [[Category:Writers from Hamilton, Ontario]]<br /> [[Category:Writers from Toronto]]<br /> [[Category:Members of the 27th Canadian Ministry]]<br /> [[Category:McGill University Faculty of Law alumni]]<br /> [[Category:NCAA men's ice hockey national champions]]<br /> [[Category:Hockey writers]]<br /> [[Category:AHCA Division I men's ice hockey All-Americans]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Ken_Dryden&diff=1168748311 Ken Dryden 2023-08-04T19:18:45Z <p>205.189.94.9: /* Personal life */ Father Murray has a wiki entry; related to Apps family.</p> <hr /> <div>{{Short description|Canadian ice hockey goaltender and politician}}<br /> {{Use Canadian English|date=January 2023}}<br /> {{Infobox officeholder<br /> | name = Ken Dryden<br /> | image = Ken Dryden 2011.jpg<br /> | caption = Dryden in 2011<br /> | honorific_prefix = [[The Honourable]]<br /> | honorific_suffix = [[King's Privy Council for Canada|PC]] [[Order of Canada|OC]]<br /> | office1 = [[Minister of Families, Children and Social Development|Minister of Social Development]]<br /> | term_start1 = July 20, 2004<br /> | term_end1 = February 5, 2006<br /> | predecessor1 = [[Liza Frulla]]<br /> | successor1 = [[Diane Finley]]<br /> | primeminister1 = [[Paul Martin]]<br /> | riding2 = [[York Centre]]<br /> | parliament2 = Canadian<br /> | term_start2 = June 28, 2004<br /> | term_end2 = May 1, 2011<br /> | predecessor2 = [[Art Eggleton]]<br /> | successor2 = [[Mark Adler (politician)|Mark Adler]]<br /> | birth_name = Kenneth Wayne Dryden<br /> | birth_date = {{birth date and age|1947|8|8}}<br /> | birth_place = [[Hamilton, Ontario|Hamilton]], [[Ontario]], Canada<br /> | profession = {{hlist|Athlete|lawyer|teacher|writer|politician|sports commentator|businessperson}}<br /> | alma_mater = [[Cornell University]] ([[Bachelor of Arts|BA]])&lt;br /&gt;[[McGill University]] ([[Bachelor of Laws|LLB]])<br /> | party = [[Liberal Party of Canada|Liberal]]<br /> | religion = <br /> | residence = <br /> | footnotes = <br /> | spouse = Lynda Dryden<br /> | module = {{Infobox ice hockey player<br /> | embed = yes<br /> | halloffame = 1983<br /> | image_size = <br /> | height_ft = 6<br /> | height_in = 4<br /> | weight_lb = 205<br /> | position = [[Goaltender]]<br /> | catches = Left<br /> | played_for = [[Montreal Canadiens]]<br /> | ntl_team = 
CAN<br /> | draft = 14th overall<br /> | draft_year = 1964<br /> | draft_team = [[Boston Bruins]]<br /> | career_start = 1970<br /> | career_end = 1979<br /> }}<br /> }}<br /> <br /> '''Kenneth Wayne Dryden''' {{Post-nominals|country=CAN|PC|OC}} (born August 8, 1947) is a [[Canadians|Canadian]] politician, lawyer, businessman, author, and former [[National Hockey League]] (NHL) [[goaltender]] and executive. He is an [[Officer of the Order of Canada]]&lt;ref&gt;{{cite web |title=Appointments to the Order of Canada |url=http://www.gg.ca/document.aspx?id=14904&amp;lan=eng |publisher=Governor General of Canada |access-date=December 31, 2012}}&lt;/ref&gt; and a member of the [[Hockey Hall of Fame]]. He was a Liberal [[Member of Parliament (Canada)|Member of Parliament]] from 2004 to 2011 and [[Minister of Families, Children and Social Development|Minister of Social Development]] from 2004 to 2006. In 2017, the league counted him in history's [[100 Greatest NHL Players]].&lt;ref&gt;{{cite web|title=100 Greatest NHL Players|url=https://www.nhl.com/fans/nhl-centennial/100-greatest-nhl-players|website=NHL.com|access-date=January 27, 2017|date=January 27, 2017}}&lt;/ref&gt;&lt;ref name=&quot;:0&quot;&gt;{{Citation|last=NHL|title=Ken Dryden won Conn Smythe before he won Calder|date=2017-03-22|url=https://www.youtube.com/watch?v=UCIEwsLagTM&amp;list=PL1NbHSfosBuHEp2Bphcgz16OKz0kjnCH6&amp;index=50 |archive-url=https://ghostarchive.org/varchive/youtube/20211212/UCIEwsLagTM| archive-date=2021-12-12 |url-status=live|access-date=2017-04-25}}{{cbignore}}&lt;/ref&gt; He received the [[Order of Hockey in Canada]] in 2020.&lt;ref name=&quot;2020-recipients&quot;&gt;{{cite web|url=https://www.hockeycanada.ca/en-ca/news/2020-oohic-class-of-2020-named-to-order|title=Class of 2020 unveiled for Order of Hockey in Canada|date=2020-02-11|website=Hockey Canada|access-date=2020-02-11}}&lt;/ref&gt;<br /> <br /> ==Early life and education==<br /> Dryden was born in [[Hamilton, Ontario]], in 1947.&lt;ref name=CHATLAS&gt;{{cite book |title=The Canadian Hockey Atlas |url=https://archive.org/details/canadianhockeyat0000cole |url-access=registration |first=Stephen |last=Cole |publisher=Doubleday Canada |year=2006 |isbn=978-0-385-66093-8 }}&lt;/ref&gt; His parents were [[Murray Dryden]] (1911–2004) and Margaret Adelia Campbell (1912-1985). He has a sister, Judy, and a brother, [[Dave Dryden|Dave]], who was also an NHL goaltender. Dryden was raised in [[Islington-City Centre West|Islington]], [[Ontario]], then a suburb of Toronto. He played with the [[Etobicoke Indians]] of the [[Metro Junior B Hockey League]] as well as [[Humber Valley Packers]] of the [[Greater Toronto Hockey League|Metro Toronto Hockey League]].<br /> <br /> Dryden was drafted fourteenth overall by the [[Boston Bruins]] in the [[1964 NHL Amateur Draft]]. Days later, June 28,&lt;ref&gt;{{cite web|title=Trader Sam's Greatest Trades|url=http://www.habsworld.net/article.php?id=1472|website=HabsWorld| date=15 August 2007 |access-date=February 28, 2015}}&lt;/ref&gt; Boston traded Dryden, along with [[Alex Campbell (ice hockey)|Alex Campbell]], to the [[Montreal Canadiens]] for Paul Reid and Guy Allen. Dryden was told by his agent that he had been drafted by the Canadiens and did not find out until the mid-1970s that he had been drafted by the Bruins.&lt;ref&gt;{{cite web |url=http://www.habseyesontheprize.com/2009/8/21/997368/habs-robbed-bruins-of-dryden |title=Canadiens blog English translation of Canoe article |publisher=Sportsblog Inc. 
|date=August 21, 2009 |archive-url=https://web.archive.org/web/20090825030756/http://www.habseyesontheprize.com/2009/8/21/997368/habs-robbed-bruins-of-dryden |archive-date=August 25, 2009}}&lt;/ref&gt;<br /> <br /> Rather than play for the Canadiens in 1964, Dryden pursued a [[Bachelor of Arts|B.A.]] degree in [[Cornell University Department of History|History]] at [[Cornell University]], where he also played hockey until his graduation in 1969. He backstopped the [[Cornell Big Red men's ice hockey|Cornell Big Red]] to the 1967 [[National Collegiate Athletic Association]] championship and to three consecutive [[ECAC Hockey League|ECAC]] tournament championships, and won 76 of his 81 varsity starts. At Cornell, he was a member of the [[Quill and Dagger]] society.&lt;ref name='sun5-68'&gt;''The Cornell Daily Sun'', 9 May 1968&lt;/ref&gt; He was also a member of the Canadian amateur national team at the [[1969 World Ice Hockey Championships]] tournament in [[Stockholm]].<br /> <br /> Dryden took a break from the NHL for the 1973–74 season to [[Articled clerk|article]] for a Toronto law firm, and to earn the [[Bachelor of Laws|LL.B.]] degree that he received from [[McGill University]] in 1973.&lt;ref&gt;{{cite news |title=Dryden Quits Hockey for Law Clerk Job |url=https://www.nytimes.com/1973/09/15/archives/dryden-quits-hockey-for-law-clerk-job.html |newspaper=[[The New York Times]] |access-date=22 July 2018 |language=en |date=15 September 1973}}&lt;/ref&gt;&lt;ref&gt;[https://www.mcgill.ca/about/alumni Notable alumni] - website of [[McGill University]]&lt;/ref&gt; During this time Dryden interned with [[Ralph Nader|Ralph Nader's]] [[Public Citizen]] organization. Inspired by Nader's call in ''[[Action for a Change]]'' for the creation of [[Public Interest Research Group]]s, Dryden tried to establish the [[Ontario Public Interest Research Group]].&lt;ref&gt;{{cite magazine |title=PIRG Power |url=http://qpirgmcgill.org/wp-content/uploads/2014/07/PIRG-Power-.pdf |author1=[[Karen Farbridge]] |author2=Peter Cameron |magazine=[[A\J: Alternatives Journal|Alternatives Journal]]|date=Summer 1998}}&lt;/ref&gt;<br /> <br /> Dryden's jersey number 1 was retired by the [[Cornell Big Red]] on February 25, 2010; along with [[Joe Nieuwendyk]], he is one of only two players to have their numbers retired by Cornell's hockey program.&lt;ref&gt;{{cite web |url=https://news.cornell.edu/stories/2010/02/dryden-nieuwendyks-hockey-numbers-be-retired |title=Big Red to retire Dryden, Nieuwendyk's hockey numbers |first=Kevin |last=Zeisse |publisher=Cornell Chronicle |date=February 25, 2010 |access-date=May 2, 2023}}&lt;/ref&gt;<br /> <br /> ==Playing career==<br /> Dryden made his NHL debut on Sunday, March 14, 1971, against the Pittsburgh Penguins in Pittsburgh. The Canadiens won the game 5–1, and the only goal scored against him came from Pittsburgh's John Stewart. Later, on March 20, 1971, he played in a home game against his brother [[Dave Dryden]], a fellow backup goaltender for the [[Buffalo Sabres]], when Canadiens starter [[Rogie Vachon]] suffered an injury;&lt;ref&gt;{{Citation|last=SwissHabs|title=Legends of hockey : Ken Dryden|date=2012-04-27|url=https://www.youtube.com/watch?v=dZFOmrHbLw4|access-date=2019-03-27}}&lt;/ref&gt; this still stands, as of 2021, as the only time a pair of brothers have faced each other as goaltenders. He was called up from the minors late in the season and played only six regular-season games, but posted a 1.65 goals-against average.
This earned him the starting goaltending job for the playoffs ahead of veteran [[Rogie Vachon]], and he helped the Canadiens to win the Stanley Cup. He also won the [[Conn Smythe Trophy]] as the most valuable player in the playoffs. He helped the Habs win five more [[Stanley Cup]]s in 1973, 1976, 1977, 1978, and 1979.<br /> <br /> The following year Dryden won the [[Calder Memorial Trophy|Calder Trophy]] as the rookie of the year; he was not eligible for it the previous year because he did not play enough regular-season games. He is the only player to win the Conn Smythe Trophy before winning the rookie of the year award, and the only goaltender to win both the Conn Smythe and the Stanley Cup before losing a regular-season game.&lt;ref name=&quot;:0&quot; /&gt; In the autumn of 1972 Dryden played for Team Canada in the [[1972 Summit Series]] against the [[Soviet national ice hockey team]].<br /> <br /> Dryden played from 1971 to 1979, with a break during the entire [[1973–74 NHL season|1973–74 season]]; he was unhappy with the contract that the Canadiens offered him, which he considered less than his market worth, given that he had won the Stanley Cup and Vezina Trophy. He announced on September 14, 1973, that he was joining the Toronto law firm of Osler, Hoskin and Harcourt as a legal clerk for the year, for $135 a week. He skipped training camp and held out that season. The Canadiens still had a good year, going 45-24-9, but lost in the first round of the playoffs to the [[New York Rangers]] in six games. The Canadiens allowed 56 more goals in the 1973–74 season than they had the year before with Dryden.&lt;ref name=&quot;athletics.mcgill.ca&quot;&gt;{{cite web |url=http://www.athletics.mcgill.ca/alumni_news_details.ch2?article_id=454 |title=McGill Athletics |access-date=2008-09-28 |url-status=dead |archive-url=https://web.archive.org/web/20080515141224/http://www.athletics.mcgill.ca/alumni_news_details.ch2?article_id=454 |archive-date=2008-05-15 }}&lt;/ref&gt; Dryden used that year to fulfill the requirements for his law degree at [[McGill University|McGill]] and article for a law firm. He retired for the last time on July 9, 1979.&lt;ref&gt;{{cite news |title=From the archives: Dryden announces retirement |url=https://www.theglobeandmail.com/sports/from-the-archives-dryden-announces-retirement/article1069554/ |access-date=4 August 2020 |work=Globe &amp; Mail |date=10 July 1979}}&lt;/ref&gt;<br /> <br /> Compared to those of most other great hockey players, Dryden's NHL career was very short: just over seven full seasons. Thus he did not amass record totals in most statistical categories. As he played all his years with a dynasty and retired before he passed his prime, his statistical percentages are unparalleled. His regular-season totals include a 74.3 winning percentage, a 2.24 goals-against average, a 92.2 save percentage, 46 shutouts, and 258 wins, only 57 losses and 74 ties in just 397 NHL games. He won the [[Vezina Trophy]] five times as the goaltender on the team who allowed the fewest goals and in the same years was selected as a First Team All-Star. In 1998, he was ranked number 25 on ''[[The Hockey News]]''' list of the 100 Greatest Hockey Players, a remarkable achievement for a player with a comparatively brief career.<br /> <br /> At 6 feet, 4 inches, Dryden was so tall that during stoppages in play he struck what became his trademark pose: leaning upon his stick. 
He was known as the &quot;four-storey goalie,&quot; and was once referred to as &quot;that thieving giraffe&quot; by Boston Bruins superstar [[Phil Esposito]], in reference to Dryden's skill and height. Unbeknownst to him, his pose was exactly the same as the one struck by fellow Canadiens goaltender, [[Georges Vézina]], 60 years prior.&lt;ref&gt;{{Citation|last=SwissHabs|title=Legends of hockey : Ken Dryden|date=2012-04-27|url=https://www.youtube.com/watch?v=dZFOmrHbLw4&amp;t=18m9s|access-date=2019-03-27}}&lt;/ref&gt;<br /> <br /> Dryden was inducted into the [[Hockey Hall of Fame]] in 1983, as soon as he was eligible. His jersey number 29 was retired by the Canadiens on January 29, 2007. He was inducted into the [[Ontario Sports Hall of Fame]] in 2011.&lt;ref&gt;{{cite web |title=Ken Dryden |url=http://oshof.ca/index.php/honoured-members/item/13-ken-dryden |website=oshof.ca |publisher=[[Ontario Sports Hall of Fame]] |date=2011 |access-date=2018-04-10 |archive-date=2018-07-23 |archive-url=https://web.archive.org/web/20180723034300/https://oshof.ca/index.php/honoured-members/item/13-ken-dryden |url-status=dead }}&lt;/ref&gt;<br /> <br /> ==Post playing==<br /> <br /> ===Writing===<br /> Dryden wrote one book during his hockey career: ''Face-Off at the Summit''. It was a diary about Team Canada in the [[Summit Series|Canada vs. Soviet Union series of 1972]]. The book has been out of print for many years.<br /> <br /> After retiring from hockey, Dryden wrote several more books. ''[[The Game (Ken Dryden)|The Game]]'' was a commercial and critical success, and was nominated for a [[1983 Governor General's Awards|Governor General's Award]] in 1983. His next book, ''Home Game: Hockey and Life in Canada'' (1990), written with [[Roy MacGregor]], was developed into an award-winning [[Canadian Broadcasting Corporation]] six-part documentary series for television. His fourth book was ''The Moved and the Shaken: The Story of One Man's Life'' (1993). His fifth book, ''In School: Our Kids, Our Teachers, Our Classrooms'' (1995), written with Roy MacGregor, was about Canada’s educational system. ''Becoming Canada'' (2010) argued for a new definition of Canada and its unique place in the world.<br /> <br /> In 2019, he published ''Scotty: A Hockey Life Like No Other'', his biography of his former Canadiens coach [[Scotty Bowman]]. Dryden says at the beginning that he ''needed'' to write the book, because &quot;Scotty had lived a truly unique life. He has experienced almost ''everything'' in hockey, up close, for the best part of a century – and his is a life that no one else will live again. It's a life that had to be captured. And it needs to be captured now, because time is moving on.&quot;&lt;ref name=&quot;A Hockey Life like no Other&quot;&gt;{{cite book |last=Dryden |first= Ken |author-link=Ken Dryden|date=2019 |title=[[The Game (Dryden book)|The Game]]|publisher=[[Penguin Random House]] |isbn=978-0-7710-2750-5}}&lt;/ref&gt;<br /> <br /> Feeling that Bowman was &quot;too practical and focused&quot; to be a natural storyteller, Dryden instead asked him to think like a coach: to select the eight greatest teams of all time (but only one per dynasty) and to explain what he thought of them, how he would coach against them, and what was happening in his own life at the time. Through that process, Bowman's story would be told.&lt;ref name=&quot;A Hockey Life like no Other&quot;/&gt;<br /> <br /> ===Commentator===<br /> Dryden worked as a television hockey commentator at the [[1980 Winter Olympics|1980]], [[1984 Winter Olympics|1984]] and [[1988 Winter Olympics]]. He served as a colour commentator with play-by-play announcer [[Al Michaels]] for [[ABC Olympic broadcasts|ABC]]'s coverage of the &quot;[[Miracle on Ice]].&quot; Immediately before [[Mike Eruzione]]'s game-winning goal for the US, Dryden expressed his concern that the team was &quot;depending a little bit too much&quot; on goaltender [[Jim Craig (ice hockey)|Jim Craig]] after Craig had just made &quot;too many good saves.&quot;<br /> <br /> ===Sports executive===<br /> In 1997, Dryden was hired as president of the [[Toronto Maple Leafs]] by minority owner [[Larry Tanenbaum]]. [[Pat Quinn (ice hockey)|Pat Quinn]] became head coach in 1998, and there were reports that the two men had a frosty relationship. A few months after joining the Leafs, Quinn also became general manager, a move some believed was made to prevent Dryden from hiring former Canadiens teammate [[Bob Gainey]].&lt;ref name=&quot;athletics.mcgill.ca&quot;/&gt;<br /> <br /> Dryden spoke at the [[Open Ice Summit]] in 1999 to discuss improvements needed in ice hockey in Canada. He wanted delegates to accept that progress made at the lower levels and off the ice was important in achieving international results.&lt;ref&gt;{{cite book|last=McKinley|first=Michael|title=It's Our Game: Celebrating 100 Years Of Hockey Canada|publisher=Viking|date=2014 |location=Toronto, Ontario|isbn=978-0-670-06817-3|pages=314–316}}&lt;/ref&gt; He cautioned that change would come slowly and be costly, but felt the summit was an important step in making progress.&lt;ref&gt;{{cite news|title=11 ideas from Open summit|last=Colbourn|first=Glen|date=August 28, 1999|newspaper=Medicine Hat News|location=Medicine Hat, Alberta|page=9 |url=https://newspaperarchive.com/sports-clipping-aug-28-1999-1717170/}}{{free access}}&lt;/ref&gt; He also urged an end to the persistent abuse of [[Official (ice hockey)|on-ice officials]], warning that otherwise Canada would lose 10,000 referees each year. As a result of the summit, Hockey Canada began education efforts on the importance of respect for game officials.&lt;ref&gt;{{cite news|title=CHA wants fans to layoff referees|last=Beacon|first=Bill|date=November 5, 1999|newspaper=Winnipeg Free Press|location=Winnipeg, Manitoba|page=50|url=https://newspaperarchive.com/sports-clipping-nov-05-1999-1717195/}}{{free access}}&lt;/ref&gt;<br /> <br /> On August 29, 2003, with the hiring of [[John Ferguson, Jr.]] as general manager, there was a major management shakeup. Majority owner [[Steve Stavro]] was bought out by the [[Ontario Teachers' Pension Plan]] and stepped down as chairman in favour of [[Larry Tanenbaum]]. Quinn continued as head coach.
Dryden's position was abolished, in favour of having both the Leafs' and [[Toronto Raptors|Raptors]]' managers reporting directly to [[Maple Leaf Sports &amp; Entertainment Ltd.|MLSE]] President and CEO [[Richard Peddie]]. Dryden was shuffled to the less important role of vice-chairman and given a spot on MLSE's board of directors. This was described by commentators as &quot;sitting outside the loop&quot;, as Dryden did not report directly to Leafs ownership.&lt;ref name=&quot;athletics.mcgill.ca&quot;/&gt;&lt;ref&gt;{{cite web |url=http://www.athletics.mcgill.ca/alumni_news_details.ch2?article_id=453 |title=McGill Athletics |access-date=2006-12-05 |url-status=dead |archive-url=https://web.archive.org/web/20060208124054/http://www.athletics.mcgill.ca/alumni_news_details.ch2?article_id=453 |archive-date=2006-02-08 }}&lt;/ref&gt; He stayed on until 2004 when he resigned to enter politics.<br /> <br /> ===Teaching===<br /> In January 2012, Dryden was appointed a &quot;Special Visitor&quot; at his alma mater [[McGill University]]'s Institute for the Study of Canada. He taught a [[Canadian Studies]] course entitled &quot;Thinking the Future to Make the Future,&quot; which focused on issues facing Canada in the future and possible solutions to them.&lt;ref&gt;{{Cite news|url=https://reporter.mcgill.ca/q-a-ken-dryden-thinks-the-future/|title=Q &amp; A: Ken Dryden thinks the future|date=January 17, 2012|work=McGill Reporter|access-date = May 2, 2023|first=Cynthia|last=Lee}}&lt;/ref&gt;<br /> <br /> ==Political career==<br /> Dryden joined the [[Liberal Party of Canada]] and ran for the [[House of Commons of Canada|House of Commons]] in the [[2004 Canadian federal election|2004 federal election]]. He was selected by party leader and [[Prime Minister of Canada|Prime Minister]] Paul Martin as a &quot;[[star candidate]]&quot; in the Toronto [[Electoral district (Canada)|riding]] of [[York Centre]], then considered a [[safe seat|safe]] Liberal riding.&lt;ref name=&quot;The Star&quot;&gt;{{cite news| url=https://www.thestar.com/news/canada/2011/05/03/dryden_goes_down_to_defeat.html |title=Dryden goes down to defeat |first=Paul |last=Moloney |newspaper=The Toronto Star}}&lt;/ref&gt;<br /> <br /> Dryden was elected by a margin of over 11,000 votes.&lt;ref name=&quot;2004 results&quot;&gt;{{cite news |title=Election results...riding by riding |newspaper=The Globe and Mail |date=June 29, 2004 |page=A14}}&lt;/ref&gt; He was named to [[Cabinet of Canada|Cabinet]] as [[Minister of Social Development (Canada)|Minister of Social Development]].&lt;ref&gt;{{cite news |title=Who does what in the new federal cabinet |newspaper=The Hamilton Spectator |date=July 21, 2004 |page=A10}}&lt;/ref&gt; He made headlines on February 16, 2005, as the target of a remark by [[Conservative Party of Canada|Conservative]] [[Member of Parliament (Canada)|Member of Parliament]] [[Rona Ambrose]] who said about Dryden, &quot;working women want to make their own choices, we don't need old white guys telling us what to do.&quot; Ambrose made the remarks after Dryden commented on a poll that analyzed child care choices by Canadian families.&lt;ref&gt;{{cite news |title=A Verbal Slapshot; MP tells child-care minister Ken Dryden: 'We don't need old white guys telling us what to do' |last=Dugas |first=Dan |newspaper=The Hamilton Spectator |date=February 16, 2005 |page=A10}}&lt;/ref&gt; Dryden won generally favourable reviews for his performance in Cabinet.<br /> <br /> Dryden was re-elected in the [[2006 Canadian federal election|2006 federal 
election]], while the Liberals were defeated and [[Paul Martin]] resigned the party leadership.&lt;ref name=&quot;2006 results&quot;&gt;{{cite news |title=Election results...riding by riding |newspaper=The Globe and Mail |date=January 24, 2006 |page=A16}}&lt;/ref&gt; Interim party and opposition leader [[Bill Graham (Canadian politician)|Bill Graham]] named Dryden to his shadow cabinet as health critic.&lt;ref&gt;{{cite news |title=Six Liberals named to shadow cabinet |last=O'Neill |first=Juliet |newspaper=The Vancouver Sun |date=February 23, 2006 |page=A6}}&lt;/ref&gt;<br /> <br /> Dryden's margin of victory in [[York Centre]] dwindled in the 2006 and 2008 elections.&lt;ref&gt;{{cite news |title=Hockey legend Ken Dryden loses bid for fourth term |url=http://toronto.ctvnews.ca/hockey-legend-ken-dryden-loses-bid-for-fourth-term-1.638553 |publisher=CTV News |date=May 2, 2011}}&lt;/ref&gt; In the 2011 federal election, he focused his efforts on his own re-election instead of campaigning for other candidates as he did in the past, and he received a visit from former Prime Minister [[Jean Chrétien]]. Still, Dryden lost his seat to Conservative candidate [[Mark Adler (politician)|Mark Adler]] by nearly 6,000 votes.&lt;ref name=&quot;The Star&quot;/&gt;&lt;ref&gt;{{cite news |title=Israel a key election issue in York Centre |url=http://www.cbc.ca/news/canada/toronto/story/2011/04/25/cv-election-york-centre.html |publisher=CBC News |date=April 25, 2011}}&lt;/ref&gt;<br /> <br /> ===Leadership bid===<br /> <br /> On April 28, 2006, Dryden announced that he would run for the leadership of the [[Liberal Party of Canada]], which would be choosing a successor to Paul Martin at a [[2006 Liberal Party of Canada leadership election|convention in Montreal on December 2, 2006]].&lt;ref&gt;{{cite news|url=http://www.cbc.ca/news/canada/and-then-there-were-10-ken-dryden-is-in-1.587809|work=CBC News|title=And then there were 10 ... Ken Dryden is in|date=April 28, 2006}}&lt;/ref&gt;<br /> <br /> A poll&lt;ref&gt;[http://www.gandalfgroup.ca/downloads/Liberal_Leadership_The_Publics_Choice.pdf September 2006 poll]&lt;/ref&gt; found that Dryden's potential pool of support exceeded that of his opponents, due mainly to his former NHL career. His fundraising fell well below that of top leadership contenders ([[Michael Ignatieff]], [[Gerard Kennedy]], [[Stéphane Dion]] and [[Bob Rae]]). A variety of media pundits criticized his ponderous speaking style and limited French. Supporters argued that few people were strongly opposed to him and that if he ran he could attract more support on later ballots as a consensus candidate.<br /> <br /> At the convention, Dryden came in fifth place on the first ballot with 238 delegates, 4.9% of the vote. On the second ballot, he came in last place with 219 votes (4.7%) and was eliminated. He initially threw his support to Bob Rae, but after Rae was eliminated in the third ballot and released all of his delegates, Dryden endorsed Stéphane Dion, who went on to win the leadership.<br /> <br /> According to Elections Canada filings, as of 2013 Dryden's campaign still owed $225,000.&lt;ref&gt;{{cite news| url=https://www.theglobeandmail.com/news/politics/liberal-leadership-candidates-with-outstanding-loans-wont-be-taken-to-court/article13496625/ | location=Toronto | work=The Globe and Mail | title=Liberal leadership candidates remain off the hook for outstanding debts | date=July 30, 2013}}&lt;/ref&gt;<br /> <br /> ==Personal life==<br /> Ken was the son of Margaret and [[Murray Dryden]]. 
<br /> Dryden and his wife Lynda have two children and four grandchildren.&lt;ref&gt;{{cite web|url=https://penguinrandomhouse.ca/authors/7480/ken-dryden|title=Ken Dryden|publisher=Penguin Random House Canada|access-date=November 30, 2017}}&lt;/ref&gt; He is a first cousin, twice removed, of [[Murray Murdoch]], another former NHL player and a longtime coach of [[Yale University]]'s hockey team. There is another distant relationship with the [[Syl Apps|Apps]] family. <br /> His older brother Dave also played in the NHL and the WHA as a goalie, from 1961 to 1980.<br /> <br /> ==Bibliography==<br /> <br /> ===Non-fiction===<br /> *''Face-Off at the Summit'' (1973)<br /> *''[[The Game (Ken Dryden)|The Game]]'' (1983)<br /> *''Home Game: Hockey and Life in Canada'' (with Roy MacGregor, 1990)<br /> *''In School: Our Kids, Our Teachers, Our Classrooms'' (with Roy MacGregor, 1995)<br /> *''[[The Moved and the Shaken]]'' (1993)<br /> *''[[Becoming Canada]]'' (2010)<br /> *''[[Game Change (Ken Dryden)|Game Change]]'' (2017)&lt;ref&gt;[https://beta.theglobeandmail.com/arts/books-and-media/book-reviews/review-ken-drydens-game-change-is-a-deep-piece-of-investigative-journalism/article36674963 Review: Ken Dryden’s Game Change is a deep piece of investigative journalism] The Globe and Mail, 20 October 2017&lt;/ref&gt;<br /> *''Scotty: A Hockey Life Like No Other'' (2019)<br /> <br /> ==Awards and honors==<br /> Dryden's hockey awards and honours are numerous and include:<br /> {| class=&quot;wikitable&quot;<br /> ! Award<br /> ! Year<br /> ! Remark<br /> |-<br /> | All-[[ECAC Hockey|ECAC]] [[List of All-ECAC Hockey Teams#First Team|First Team]]<br /> | [[1966–67 NCAA Division I men's ice hockey season|1966–67]], [[1967–68 NCAA Division I men's ice hockey season|1967–68]], [[1968–69 NCAA Division I men's ice hockey season|1968–69]]<br /> | <br /> |-<br /> | [[American Hockey Coaches Association|AHCA]] [[List of Division I AHCA All-American Teams|East All-American]]<br /> | [[1966–67 NCAA Division I men's ice hockey season|1966–67]], [[1967–68 NCAA Division I men's ice hockey season|1967–68]], [[1968–69 NCAA Division I men's ice hockey season|1968–69]]<br /> | <br /> |-<br /> | [[ECAC Hockey]] [[List of ECAC Hockey All-Tournament Team|All-Tournament First Team]]<br /> | [[1967 ECAC Hockey Men's Ice Hockey Tournament|1967]], [[1968 ECAC Hockey Men's Ice Hockey Tournament|1968]], [[1969 ECAC Hockey Men's Ice Hockey Tournament|1969]]<br /> | <br /> |-<br /> | [[NCAA Men's Ice Hockey Championship|NCAA]] [[List of NCAA Division I Men's Ice Hockey All-Tournament Teams|All-Tournament First Team]]<br /> | [[1967 NCAA Division I Men's Ice Hockey Tournament|1967]]<br /> | &lt;ref name = ncaa&gt;{{cite news|title=NCAA Frozen Four Records|url=http://fs.ncaa.org/Docs/stats/frozen_4/2009/f4recs.pdf|publisher=NCAA.org|access-date=2013-06-19}}&lt;/ref&gt;<br /> |-<br /> | [[NCAA Men's Ice Hockey Championship|NCAA]] [[List of NCAA Division I Men's Ice Hockey All-Tournament Teams|All-Tournament Second Team]]<br /> | [[1968 NCAA Division I Men's Ice Hockey Tournament|1968]], [[1969 NCAA Division I Men's Ice Hockey Tournament|1969]]<br /> | &lt;ref name=&quot;ncaa&quot; /&gt;<br /> |-<br /> | [[Conn Smythe Trophy]] winner<br /> | [[1970-71 NHL season|1971]]<br /> | <br /> |-<br /> | [[Calder Memorial Trophy]] winner<br /> | [[1971-72 NHL season|1972]]<br /> | <br /> |-<br /> | [[Vezina Trophy]] winner<br /> | [[1972-73 NHL season|1973]], [[1975-76 NHL season|1976]], [[1976-77 NHL season|1977]]*, [[1977-78 NHL season|1978]]*, 
[[1978-79 NHL season|1979]]*<br /> | &lt;small&gt;&lt;nowiki&gt;* Shared with &lt;/nowiki&gt;[[Michel Larocque (ice hockey, born 1952)|Michel Larocque]].&lt;/small&gt;<br /> |-<br /> | [[Stanley Cup]] champion<br /> | [[1971 Stanley Cup Finals|1971]], [[1973 Stanley Cup Finals|1973]], [[1976 Stanley Cup Finals|1976]], [[1977 Stanley Cup Finals|1977]], [[1978 Stanley Cup Finals|1978]], [[1979 Stanley Cup Finals|1979]]<br /> | <br /> |-<br /> | Played in [[NHL All-Star Game]]s<br /> | 1972, [[28th National Hockey League All-Star Game|1975]], [[29th National Hockey League All-Star Game|1976]], [[30th National Hockey League All-Star Game|1977]], [[31st National Hockey League All-Star Game|1978]]<br /> | <br /> |-<br /> | Selected to [[NHL First All-Star Team]]<br /> | 1973, 1976, 1977, 1978, 1979<br /> |<br /> |-<br /> | Selected to [[NHL Second All-Star Team]]<br /> | 1972<br /> | <br /> |-<br /> | Inducted into the [[Hockey Hall of Fame]]<br /> | 1983<br /> | <br /> |-<br /> | Number 25 on ''The Hockey News''' list of the 100 Greatest Hockey Players<br /> | 1998<br /> | <br /> |-<br /> | Number 29 retired by the [[Montreal Canadiens]]<br /> | January 29, 2007<br /> | <br /> |-<br /> | Number 1 retired by the [[Cornell Big Red]]<br /> | February 25, 2010<br /> | &lt;small&gt;One of only two players to have his number retired by the Cornell hockey program;&lt;br&gt;the other being [[Joe Nieuwendyk]].&lt;/small&gt;<br /> |-<br /> | Recipient of the [[Order of Hockey in Canada]]<br /> | 2020<br /> | &lt;ref name=&quot;2020-recipients&quot; /&gt;<br /> |-<br /> |}<br /> <br /> Dryden does not have a substantive doctorate, but has received [[honorary doctoral degree]]s from several universities,&lt;ref&gt;[https://www.collectionscanada.gc.ca/eppp-archive/100/205/300/liberal-ef/06-01-30/www.liberal.ca/bio_e.aspx@&amp;id=35103 Biography: Ken Dryden] - website of the [[Library and Archives Canada]] of the [[Government of Canada]]&lt;/ref&gt; including:<br /> <br /> {| class=&quot;wikitable&quot;<br /> ! Honorary degree<br /> ! University<br /> ! Year<br /> !
Remark<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[University of British Columbia]]<br /> | 1992<br /> | &lt;ref&gt;[https://www.library.ubc.ca/archives/hdcites/hdcites10.html The Title and Degree of Doctor of Laws (honoris causa) conferred at congregation, May 26, 1992] - website of the [[University of British Columbia]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[York University]]<br /> | 1996<br /> | &lt;ref&gt;[https://www.yorku.ca/secretariat/senate/sub-committee-on-honorary-degrees-and-ceremonials/honorary-degree-recipients/ Honorary Degree Recipients] - website of the [[York University]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[University of Windsor]]<br /> | 1997<br /> | &lt;ref&gt;[https://www.uwindsor.ca/secretariat/sites/uwindsor.ca.secretariat/files/honorary_degree_by_convocation_1.pdf Honorary degrees conferred (Chronological)] - website of the [[University of Windsor]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Doctor of the University|D.Univ.]] degree<br /> | [[University of Ottawa]]<br /> | 2000<br /> | &lt;ref&gt;[https://www.uottawa.ca/president/bio/dryden-ken Office of the president: Honorary Doctorates - Ken Dryden] - website of the [[University of Ottawa]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[Toronto Metropolitan University|Ryerson University]]<br /> | 2013<br /> | &lt;ref&gt;[https://www.ryerson.ca/calendar/2021-2022/about-ryerson/honorary-doctorates/ Ryerson Honorary Doctorates and Fellowships] - website of the [[Toronto Metropolitan University|Ryerson University]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[McGill University]]<br /> | 2018<br /> | &lt;ref&gt;[https://www.mcgill.ca/newsroom/channels/news/naomi-azrieli-ken-dryden-receive-honorary-degrees-291005 Naomi Azrieli, Ken Dryden to receive honorary degrees News] - website of the [[McGill University]]&lt;/ref&gt;<br /> |-<br /> | Honorary [[Legum Doctor|LL.D.]] degree<br /> | [[University of Winnipeg]]<br /> | 2018<br /> | &lt;ref&gt;[https://www.uwinnipeg.ca/awards-distinctions/honorary-doctorate/dryden.html Honorary Doctorate: Ken Dryden] - website of the [[University of Winnipeg]]&lt;/ref&gt;<br /> |}<br /> <br /> ==Career statistics==<br /> ===Regular season and playoffs===<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; style=&quot;width:95%; text-align:center;&quot;<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! colspan=&quot;9&quot; bgcolor=&quot;#e0e0e0&quot; | [[Regular season]]<br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! colspan=&quot;8&quot; bgcolor=&quot;#e0e0e0&quot; | [[Playoffs]]<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! [[Season (sports)|Season]]<br /> ! Team<br /> ! League<br /> ! GP !! W !! L !! T !! MIN !! GA !! [[Shutout#Ice hockey|SO]] !! [[Goals against average|GAA]] !! [[save percentage|SV%]]<br /> ! GP !! W !! L !! MIN !! GA !! SO !! GAA !! 
SV%<br /> |-<br /> | 1963–64<br /> | Humber Valley Packers<br /> | MTHL<br /> | — || — || — || — || — || — || — || — || — <br /> | — || — || — || — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | 1964–65<br /> | Etobicoke Indians<br /> | [[Metro Junior A Hockey League|MetJHL]]<br /> | — || — || — || — || — || — || — || — || — <br /> | — || — || — || — || — || — || — || —<br /> |-<br /> | [[1966–67 NCAA University Division men's ice hockey season|1966–67]]<br /> | [[Cornell Big Red men's ice hockey|Cornell University]]<br /> | [[ECAC Hockey|ECAC]]<br /> | 27 || 26 || 0 || 1 || 1646 || 40 || 4 || 1.46 || .945<br /> | — || — || — || — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1967–68 NCAA University Division men's ice hockey season|1967–68]]<br /> | Cornell University<br /> | ECAC<br /> | 29 || 25 || 2 || 0 || 1620 || 41 || 6 || 1.52 || .938<br /> | — || — || — || — || — || — || — || —<br /> |-<br /> | [[1968–69 NCAA University Division men's ice hockey season|1968–69]]<br /> | Cornell University<br /> | ECAC<br /> | 27 || 25 || 2 || 0 || 1578 || 47 || 3 || 1.79 || .936<br /> | — || — || — || — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1970–71 AHL season|1970–71]]<br /> | [[Montreal Voyageurs]]<br /> | [[American Hockey League|AHL]]<br /> | 33 || 16 || 7 || 8 || 1899 || 84 || 3 || 2.68 || —<br /> | — || — || — || — || — || — || — || —<br /> |-<br /> | [[1970–71 NHL season|1970–71]]*<br /> | [[Montreal Canadiens]]<br /> | [[National Hockey League|NHL]]<br /> | 6 || 6 || 0 || 0 || 327 || 9 || 0 || 1.65 || .957<br /> | 20 || 12 || 8 || 1221 || 61 || 0 || 3.00 || .914<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1971–72 NHL season|1971–72]]<br /> | Montreal Canadiens<br /> | NHL<br /> | 64 || 39 || 8 || 15 || 3800 || 142 || 8 || 2.24 || .930<br /> | 6 || 2 || 4 || 360 || 17 || 0 || 2.83 || .911<br /> |-<br /> | [[1972–73 NHL season|1972–73]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 54 || 33 || 7 || 13 || 3165 || 119 || 6 || 2.26 || .926<br /> | 17 || 12 || 5 || 1039 || 50 || 1 || 2.89 || .908<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1974–75 NHL season|1974–75]]<br /> | Montreal Canadiens<br /> | NHL<br /> | 56 || 30 || 9 || 16 || 3320 || 149 || 4 || 2.69 || .906<br /> | 11 || 6 || 5 || 688 || 29 || 2 || 2.53 || .916<br /> |-<br /> | [[1975–76 NHL season|1975–76]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 62 || 42 || 10 || 8 || 3580 || 121 || 8 || 2.03 || .927<br /> | 13 || 12 || 1 || 780 || 25 || 1 || 1.92 || .929<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1976–77 NHL season|1976–77]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 56 || 41 || 6 || 8 || 3275 || 117 || 10 || 2.14 || .920<br /> | 14 || 12 || 2 || 849 || 22 || 4 || 1.55 || .932<br /> |-<br /> | [[1977–78 NHL season|1977–78]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 52 || 37 || 7 || 7 || 3071 || 105 || 5 || 2.05 || .921<br /> | 15 || 12 || 3 || 919 || 29 || 2 || 1.89 || .920<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1978–79 NHL season|1978–79]]*<br /> | Montreal Canadiens<br /> | NHL<br /> | 47 || 30 || 10 || 7 || 2814 || 108 || 5 || 2.30 || .909<br /> | 16 || 12 || 4 || 990 || 41 || 0 || 2.48 || .900<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; | NHL totals<br /> ! 397 !! 258 !! 57 !! 74 !! 23,330 !! 870 !! 46 !! 2.24 !! .922<br /> ! 112 !! 80 !! 32 !! 6,846 !! 274 !! 10 !! 2.40 !! 
.915<br /> |}<br /> &lt;nowiki&gt;*&lt;/nowiki&gt; [[Stanley Cup]] Champion.<br /> <br /> ===International===<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; ID=&quot;Table3&quot; style=&quot;text-align:center; width:40em&quot;<br /> |- ALIGN=&quot;center&quot; bgcolor=&quot;#e0e0e0&quot;<br /> ! Year<br /> ! Team<br /> ! Event<br /> ! rowspan=&quot;99&quot; bgcolor=&quot;#ffffff&quot; | <br /> ! GP<br /> ! W<br /> ! L<br /> ! T<br /> ! MIN<br /> ! GA<br /> ! SO<br /> ! GAA<br /> |-<br /> | [[1969 World Ice Hockey Championships|1969]]<br /> | [[Canada men's national ice hockey team|Canada]]<br /> | [[Ice Hockey World Championships|WC]]<br /> | 2<br /> | 1<br /> | 1<br /> | 0<br /> | 120<br /> | 4<br /> | 1<br /> | 2.00<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1972 Summit Series|1972]]<br /> | Canada<br /> | SS<br /> | 4<br /> | 2<br /> | 2<br /> | 0<br /> | 240<br /> | 19<br /> | 0<br /> | 4.75<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot; | Senior totals<br /> ! 6<br /> ! 3<br /> ! 3<br /> ! 0<br /> ! 360<br /> ! 23<br /> ! 1<br /> ! 3.83<br /> |}<br /> <br /> {{cite web |url=http://hockeygoalies.org/bio/drydenk.html|title = Dryden's stats |publisher=The Goaltender Home Page|access-date=2017-08-07}}<br /> <br /> ==References==<br /> {{reflist|2}}<br /> <br /> ==External links==<br /> *{{icehockeystats|legendsm=P198301}}<br /> * [http://hockeygoalies.org/bio/drydenk.html Ken Dryden biography] at [http://hockeygoalies.org hockeygoalies.org] - advanced statistics and game logs<br /> *{{IMDb name | id=1105047 | name=Ken Dryden}}<br /> *[https://web.archive.org/web/20060215194756/http://www.howdtheyvote.ca/member.php?id=93 How'd They Vote?: Ken Dryden's voting history and quotes]<br /> *{{Canadian Parliament links|ID=16970}}<br /> <br /> {{s-start}}<br /> {{Canadian federal ministry navigational box header |ministry=27}}<br /> {{ministry box cabinet posts<br /> | post1 = [[Minister of Social Development (Canada)|Minister of Social Development]]<br /> | post1years = 2004–2006<br /> | post1note = <br /> | post1preceded = [[Liza Frulla]]<br /> | post1followed = position abolished<br /> }}<br /> {{s-ach}}<br /> {{succession box | before = [[Doug Ferguson (ice hockey)|Doug Ferguson]]| title = [[List of ECAC Hockey Most Outstanding Player in Tournament|ECAC Hockey Most Outstanding Player in Tournament]]| years = [[1968 ECAC Hockey Men's Ice Hockey Tournament|1968]], [[1969 ECAC Hockey Men's Ice Hockey Tournament|1969]]| after = [[Bruce Bullock]]}}<br /> {{succession box | before = [[Wayne Small]]| title = [[List of ECAC Hockey Player of the Year|ECAC Hockey Player of the Year]]| years = [[1968–69 NCAA Division I men's ice hockey season|1968–69]]| after = [[Timothy Sheehy (ice hockey)|Tim Sheehy]]}}<br /> {{succession box | before = [[Gilbert Perreault]] | title = Winner of the [[Calder Memorial Trophy]] | years = 1972 | after = [[Steve Vickers (ice hockey)|Steve Vickers]] }}<br /> {{succession box | before = [[Bobby Orr]] | title = Winner of the [[Conn Smythe Trophy]] | years = 1971 | after = [[Bobby Orr]]}}<br /> {{succession box | before = [[Tony Esposito]] | title = Winner of the [[Vezina Trophy]] | years = [[1972–73 NHL season|1973]] | after = [[Tony Esposito]] and [[Bernie Parent]] ''(tied)''}}<br /> {{succession box | before = [[Bernie Parent]] | title = Winner of the [[Vezina Trophy]] &lt;br /&gt;''with [[Michel Larocque (ice hockey, born 1952)|Michel Larocque]] (1977, 1978, 1979)''| years = [[1975–76 NHL season|1976]], [[1976–77 
NHL season|1977]], [[1977–78 NHL season|1978]], [[1978–79 NHL season|1979]] | after = [[Don Edwards (ice hockey)|Don Edwards]] and [[Bob Sauvé]]}}<br /> {{s-sports}}<br /> {{succession box | before = [[Bob Pulford]] | title = [[National Hockey League Players Association|NHLPA President]] | years = 1972–74 | after = [[Pit Martin]]}}<br /> {{succession box | before = [[Cliff Fletcher]] | title = [[List of Toronto Maple Leafs general managers|General Manager of the Toronto Maple Leafs]] | years = [[1997–98 NHL season|1997]]–[[1998–99 NHL season|99]] | after = [[Pat Quinn (ice hockey)|Pat Quinn]]}}<br /> {{s-end}}<br /> <br /> {{CA-Ministers of Labour}}<br /> <br /> {{Authority control}}<br /> <br /> {{DEFAULTSORT:Dryden, Ken}}<br /> [[Category:1947 births]]<br /> [[Category:Living people]]<br /> [[Category:Boston Bruins draft picks]]<br /> [[Category:Calder Trophy winners]]<br /> [[Category:Canadian ice hockey goaltenders]]<br /> [[Category:Canadian non-fiction writers]]<br /> [[Category:Canadian people of Scottish descent]]<br /> [[Category:Canadian sportsperson-politicians]]<br /> [[Category:Conn Smythe Trophy winners]]<br /> [[Category:Cornell Big Red men's ice hockey players]]<br /> [[Category:Hockey Hall of Fame inductees]]<br /> [[Category:Ice hockey people from Toronto]]<br /> [[Category:Lawyers in Ontario]]<br /> [[Category:Liberal Party of Canada leadership candidates]]<br /> [[Category:Liberal Party of Canada MPs]]<br /> [[Category:Members of the House of Commons of Canada from Ontario]]<br /> [[Category:Members of the King's Privy Council for Canada]]<br /> [[Category:Montreal Canadiens players]]<br /> [[Category:National Hockey League All-Stars]]<br /> [[Category:National Hockey League general managers]]<br /> [[Category:National Hockey League players with retired numbers]]<br /> [[Category:National Hockey League team presidents]]<br /> [[Category:Officers of the Order of Canada]]<br /> [[Category:Olympic Games broadcasters]]<br /> [[Category:Order of Hockey in Canada recipients]]<br /> [[Category:Politicians from Hamilton, Ontario]]<br /> [[Category:Politicians from Toronto]]<br /> [[Category:Ice hockey people from Hamilton, Ontario]]<br /> [[Category:Stanley Cup champions]]<br /> [[Category:Toronto Maple Leafs executives]]<br /> [[Category:Vezina Trophy winners]]<br /> [[Category:Writers from Hamilton, Ontario]]<br /> [[Category:Writers from Toronto]]<br /> [[Category:Members of the 27th Canadian Ministry]]<br /> [[Category:McGill University Faculty of Law alumni]]<br /> [[Category:NCAA men's ice hockey national champions]]<br /> [[Category:Hockey writers]]<br /> [[Category:AHCA Division I men's ice hockey All-Americans]]</div> 205.189.94.9 https://en.wikipedia.org/w/index.php?title=Bob_Murdoch_(ice_hockey,_born_1946)&diff=1168743088 Bob Murdoch (ice hockey, born 1946) 2023-08-04T18:36:23Z <p>205.189.94.9: /* International play */ W</p> <hr /> <div>{{Short description|Canadian ice hockey player and coach}}<br /> {{for|another Canadian ice hockey player|Bob Murdoch (ice hockey, born 1954)}}<br /> {{Use mdy dates|date=September 2011}}<br /> {{Infobox ice hockey player<br /> | name = Bob Murdoch<br /> | image = <br /> | image_size = <br /> | caption = <br /> | birth_date = {{birth date|1946|11|20}}<br /> | birth_place = [[Kirkland Lake]], [[Ontario]], Canada<br /> | death_date = August 2023 (aged 76)<br /> | death place = [[Canmore, Alberta]], Canada.<br /> | height_ft = 6<br /> | height_in = 0<br /> | weight_lb = 211<br /> | position = [[Defenceman|Defence]]<br /> | shoots = 
Right<br /> | played_for = [[Montreal Canadiens]]&lt;br&gt;[[Los Angeles Kings]]&lt;br&gt;[[Atlanta Flames]]&lt;br&gt;[[Calgary Flames]]<br /> | coached_for = [[Chicago Blackhawks]]&lt;br&gt;[[Winnipeg Jets (1972–1996)|Winnipeg Jets]]&lt;br&gt;[[Maddogs München]]&lt;br&gt;[[Kölner Haie]]&lt;br&gt;[[Nürnberg Ice Tigers]]<br /> | ntl_team = CAN<br /> | draft = Undrafted<br /> | career_start = 1970<br /> | career_end = 1982<br /> | career_start_coach = 1982<br /> | career_end_coach = 2002<br /> }}<br /> '''Robert John Murdoch''' (November 20, 1946 – August 2023)&lt;ref&gt;[https://www.tsn.ca/nhl/stanley-cup-champion-and-jack-adams-award-winner-bob-murdoch-passes-away-at-76-1.1992347 Stanley Cup champion and Jack Adams Award winner Bob Murdoch passes away at 76]&lt;/ref&gt; was a Canadian professional [[ice hockey]] [[defenceman]] and [[Coach (ice hockey)|coach]]. Murdoch played 12 seasons in the [[National Hockey League]] (NHL) for the [[Montreal Canadiens]], [[Los Angeles Kings]], [[Atlanta Flames]] and [[Calgary Flames]], and coached 10 seasons in the NHL, serving as head coach of the [[Chicago Blackhawks]] and [[Winnipeg Jets (1972–1996)|Winnipeg Jets]] and as assistant coach for the [[Calgary Flames]] and [[San Jose Sharks]].&lt;ref&gt;[http://www.legendsofhockey.net/LegendsOfHockey/jsp/SearchPlayer.jsp?player=13802 Biography at Legends of Hockey]&lt;/ref&gt; He won the [[Stanley Cup]] in 1971 and 1973 while with Montreal.<br /> <br /> ==International play==<br /> Murdoch played for the [[Canada men's national ice hockey team|Canadian national team]] in [[1969 Ice Hockey World Championships|1968–69]]. In 1969–70, he was one of many players affected by the national team's withdrawal from the [[1970 Ice Hockey World Championships]].<br /> <br /> ==Coaching career==<br /> Murdoch coached 80 games with the [[Chicago Blackhawks]] during the [[1987–88 NHL season|1987–88 season]], compiling a record of 30–41–9. He was succeeded as Blackhawks head coach by [[Mike Keenan]] the following season.&lt;ref&gt;[https://www.hockey-reference.com/coaches/murdobo01c.html Bob Murdoch Coaching Record – Hockey-Reference.com]&lt;/ref&gt;<br /> <br /> During the [[1989–90 NHL season|1989–90 season]], Murdoch was named the head coach of the [[Winnipeg Jets (1972–1996)|Winnipeg Jets]]. After missing the playoffs the previous season, the Jets went 37–32–11 for 85 points and third in the [[Smythe Division]], making the [[1990 Stanley Cup playoffs]] but losing to the eventual Stanley Cup champion, the [[Edmonton Oilers]], in seven games. Murdoch was seen as an important part of the Jets' quick turnaround, winning the [[Jack Adams Award]] as the NHL's coach of the year.<br /> <br /> Despite the success of the previous season, the Jets struggled in the [[1990–91 NHL season|1990–91 season]], finishing last in the Smythe Division with a 26–43–11 record and missing the playoffs. Murdoch was fired at the end of the season and was replaced by [[John Paddock]].<br /> <br /> Murdoch became an associate coach for the [[San Jose Sharks]] during the [[1991–92 NHL season|1991–92]] and [[1992–93 NHL season|1992–93]] seasons. Afterwards, he moved to Europe, where he coached several teams in Germany's [[Deutsche Eishockey Liga]] (DEL).<br /> <br /> ==Career statistics==<br /> <br /> ===Regular season and playoffs===<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; style=&quot;text-align:center; width:60em;&quot;<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> !
colspan=&quot;3&quot; bgcolor=&quot;#ffffff&quot;|<br /> ! rowspan=&quot;100&quot; bgcolor=&quot;#ffffff&quot;|<br /> ! colspan=&quot;5&quot;|[[Regular season]]<br /> ! rowspan=&quot;100&quot; bgcolor=&quot;#ffffff&quot;|<br /> ! colspan=&quot;5&quot;|[[Playoffs]]<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! [[Season (sports)|Season]]<br /> ! Team<br /> ! League<br /> ! GP !! [[Goal (ice hockey)|G]] !! [[Assist (ice hockey)|A]] !! [[Point (ice hockey)|Pts]] !! [[Penalty (ice hockey)|PIM]]<br /> ! GP !! G !! A !! Pts !! PIM<br /> |-<br /> | 1968–69<br /> | Winnipeg Nationals<br /> | WCSHL<br /> | 6 || 0 || 1 || 1 || 2<br /> | — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1969–70 AHL season|1969–70]]<br /> | [[Montreal Voyageurs]]<br /> | [[American Hockey League|AHL]]<br /> | 6 || 0 || 2 || 2 || 6<br /> | — || — || — || — || —<br /> |-<br /> | [[1970–71 NHL season|1970–71]]<br /> | [[Montreal Canadiens]]<br /> | [[National Hockey League|NHL]]<br /> | 1 || 0 || 2 || 2 || 2<br /> | 2 || 0 || 0 || 0 || 0<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1970–71 AHL season|1970–71]]<br /> | Montreal Voyageurs<br /> | AHL<br /> | 66 || 8 || 20 || 28 || 69<br /> | 3 || 1 || 2 || 3 || 4<br /> |-<br /> | [[1971–72 NHL season|1971–72]]<br /> | Montreal Canadiens<br /> | NHL<br /> | 11 || 1 || 1 || 2 || 8<br /> | 1 || 0 || 0 || 0 || 0<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1971–72 AHL season|1971–72]]<br /> | Nova Scotia Voyageurs<br /> | AHL<br /> | 53 || 7 || 32 || 39 || 53<br /> | — || — || — || — || —<br /> |-<br /> | [[1972–73 NHL season|1972–73]]<br /> | Montreal Canadiens<br /> | NHL<br /> | 69 || 2 || 22 || 24 || 55<br /> | 13 || 0 || 3 || 3 || 10<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1973–74 NHL season|1973–74]]<br /> | [[Los Angeles Kings]]<br /> | NHL<br /> | 76 || 8 || 20 || 28 || 85<br /> | 5 || 0 || 0 || 0 || 2<br /> |-<br /> | [[1974–75 NHL season|1974–75]]<br /> | Los Angeles Kings<br /> | NHL<br /> | 80 || 13 || 29 || 42 || 116<br /> | 3 || 0 || 1 || 1 || 4<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1975–76 NHL season|1975–76]]<br /> | Los Angeles Kings<br /> | NHL<br /> | 80 || 6 || 29 || 35 || 103<br /> | 9 || 0 || 5 || 5 || 15<br /> |-<br /> | [[1976–77 NHL season|1976–77]]<br /> | Los Angeles Kings<br /> | NHL<br /> | 70 || 9 || 23 || 32 || 79<br /> | 9 || 2 || 3 || 5 || 14<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1977–78 NHL season|1977–78]]<br /> | Los Angeles Kings<br /> | NHL<br /> | 76 || 2 || 17 || 19 || 68<br /> | 2 || 0 || 1 || 1 || 5<br /> |-<br /> | [[1978–79 NHL season|1978–79]]<br /> | Los Angeles Kings<br /> | NHL<br /> | 32 || 3 || 12 || 15 || 46<br /> | — || — || — || — || —<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | 1978–79<br /> | [[Atlanta Flames]]<br /> | NHL<br /> | 35 || 5 || 11 || 16 || 24<br /> | 2 || 0 || 0 || 0 || 4<br /> |-<br /> | [[1979–80 NHL season|1979–80]]<br /> | Atlanta Flames<br /> | NHL<br /> | 80 || 5 || 16 || 21 || 48<br /> | 4 || 1 || 1 || 2 || 2<br /> |- bgcolor=&quot;#f0f0f0&quot;<br /> | [[1980–81 NHL season|1980–81]]<br /> | [[Calgary Flames]]<br /> | NHL<br /> | 74 || 3 || 19 || 22 || 54<br /> | 16 || 1 || 4 || 5 || 36<br /> |-<br /> | [[1981–82 NHL season|1981–82]]<br /> | Calgary Flames<br /> | NHL<br /> | 73 || 3 || 17 || 20 || 76<br /> | 3 || 0 || 0 || 0 || 0<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;3&quot;|NHL totals<br /> ! 757 !! 60 !! 218 !! 278 !! 764<br /> ! 69 !! 4 !! 18 !! 22 !! 
92<br /> |}<br /> <br /> ===International===<br /> {| border=&quot;0&quot; cellpadding=&quot;1&quot; cellspacing=&quot;0&quot; ID=&quot;Table3&quot; style=&quot;text-align:center; width:40em;&quot;<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! Year<br /> ! Team<br /> ! Event<br /> ! rowspan=&quot;102&quot; bgcolor=&quot;#ffffff&quot;|<br /> ! GP !! G !! A !! Pts !! PIM<br /> |-<br /> | [[1969 World Ice Hockey Championships|1969]]<br /> | [[Canada men's national ice hockey team|Canada]]<br /> | [[World Ice Hockey Championships|WC]]<br /> | 5 || 0 || 0 || 0 || 2<br /> |- bgcolor=&quot;#e0e0e0&quot;<br /> ! colspan=&quot;4&quot;|Senior totals<br /> ! 5 !! 0 !! 0 !! 0 !! 2<br /> |}<br /> <br /> ==Head coaching record==<br /> {| class=&quot;wikitable&quot; style=&quot;font-size: 95%; text-align:center;&quot;<br /> |-<br /> ! rowspan=&quot;2&quot;|Team !! rowspan=&quot;2&quot;|Year !! colspan=&quot;6&quot;|[[Regular season]] !! colspan=&quot;4&quot;|[[Playoffs|Postseason]]<br /> |-<br /> ! G !! W !! L !! T !! Pts !! Finish !! W !! L !! Win % !! Result<br /> |- style=&quot;background:#fdd;&quot;<br /> ! [[Chicago Blackhawks|CHI]] !! [[1987–88 NHL season|1987–88]]<br /> | 80 || 30 || 41 || 9 || 69 || 3rd in [[Norris Division|Norris]] || 1 || 4 || {{winpct|1|4}} || Lost in Division Semifinals ([[St. Louis Blues|STL]])<br /> |- style=&quot;background:#fdd;&quot;<br /> ! [[Winnipeg Jets (1972–1996)|WIN]] !! [[1989–90 NHL season|1989–90]]<br /> | 80 || 37 || 32 || 11 || 85 || 3rd in [[Smythe Division|Smythe]] || 3 || 4 || {{winpct|3|4}} || Lost in Division Semifinals ([[Edmonton Oilers|EDM]])<br /> |-<br /> ! WIN !! [[1990–91 NHL season|1990–91]]<br /> | 80 || 26 || 43 || 11 || 63 || 5th in Smythe || — || — || — || Missed playoffs<br /> |-<br /> ! colspan=&quot;2&quot;|Total !! 240 !! 93 !! 116 !! 31 !! &amp;nbsp; !! &amp;nbsp; !! 4 !! 8 !! {{Winning percentage|4|8}} !!
2 playoff appearances<br /> |}<br /> <br /> ==Awards and achievements==<br /> * [[Stanley Cup]] champion – [[1971 Stanley Cup Finals|1971]]<br /> * Played in [[NHL All-Star Game]] – [[28th National Hockey League All-Star Game|1975]]<br /> * [[Jack Adams Award]] winner – [[1989–90 NHL season|1990]]<br /> <br /> ==References==<br /> {{reflist}}<br /> <br /> ==External links==<br /> * {{Ice hockey stats}}<br /> * [http://hockeylemagazine.com/hockey/pros/lnh/playeren/2221.asp Hockey Le Magazine profile]<br /> <br /> {{s-start}}<br /> {{s-sports}}<br /> {{succession box | before = [[Bob Pulford]] | title = [[List of Chicago Blackhawks head coaches|Head coach of the Chicago Blackhawks]] | years = [[1987–88 NHL season|1987–88]] | after = [[Mike Keenan]]}}<br /> {{succession box | before = [[Rick Bowness]] | title = [[List of Winnipeg Jets (1972–1996) head coaches|Head coach of the original Winnipeg Jets]] | years = [[1989–90 NHL season|1989]]–[[1990–91 NHL season|1991]] | after = [[John Paddock]]}}<br /> {{s-ach}}<br /> {{succession box | before = [[Pat Burns]] | title = Winner of the [[Jack Adams Award]] | years = [[1989–90 NHL season|1990]] | after = [[Brian Sutter]]}}<br /> {{s-end}}<br /> <br /> {{DEFAULTSORT:Murdoch, Bob}}<br /> [[Category:1946 births]]<br /> [[Category:2023 deaths]]<br /> [[Category:Atlanta Flames players]]<br /> [[Category:Calgary Flames coaches]]<br /> [[Category:Calgary Flames players]]<br /> [[Category:Canadian ice hockey coaches]]<br /> [[Category:Canadian ice hockey defencemen]]<br /> [[Category:Chicago Blackhawks coaches]]<br /> [[Category:Jack Adams Award winners]]<br /> [[Category:Los Angeles Kings players]]<br /> [[Category:Montreal Canadiens players]]<br /> [[Category:National Hockey League All-Stars]]<br /> [[Category:San Jose Sharks coaches]]<br /> [[Category:Ice hockey people from Kirkland Lake]]<br /> [[Category:Stanley Cup champions]]<br /> [[Category:Undrafted National Hockey League players]]<br /> [[Category:Winnipeg Jets (1972–1996) coaches]]</div> 205.189.94.9