The history of Artificial Intelligence began when Warren McCulloch and Walter Pitts proposed a model of artificial neurons in 1943. The significance of this work is that each neuron is characterised as being “on” or “off”, and a neuron switches to “on” when a sufficient number of neighbouring neurons stimulate it. McCulloch and Pitts showed that any computable function could be computed by a network of connected neurons. In 1949, Donald Hebb proposed modifying the connection strengths between neurons using a simple updating rule, a scheme still known today as Hebbian learning. Marvin Minsky and Dean Edmonds built the first neural network computer, SNARC, in 1951; it used 3000 vacuum tubes to simulate a network of 40 neurons. Alan Turing introduced the famous Turing test and anticipated machine learning, genetic algorithms, and reinforcement learning.
Artificial Intelligence was formally born at a workshop held at Dartmouth College in 1956, where John McCarthy coined the term Artificial Intelligence; the workshop remains one of the greatest milestones in the history of the field. Newell, Shaw, and Simon developed a reasoning program called the Logic Theorist. It was intended for automatic theorem proving, and its development led to Information Processing Language (IPL), the first list-processing language. Chomsky’s theory of generative grammar influenced Natural Language Processing. Rosenblatt invented the perceptron in 1958, and John McCarthy developed LISP, an AI programming language.
Newell and Simon wrote the General Problem Solver (GPS) in IPL; it imitated the way humans solve problems. In 1976, they formulated the physical symbol system hypothesis and claimed that a physical symbol system is sufficient for general intelligent action. Herbert Gelernter developed the Geometry Theorem Prover. A. L. Samuel developed a checkers program between 1961 and 1965. J. A. Robinson introduced an inference method, resolution, in 1965. In the same period DENDRAL, the first knowledge-based expert system, was developed at Stanford University by Joshua Lederberg, Edward Feigenbaum, and Carl Djerassi. DENDRAL was designed to infer molecular structure from the information provided by a mass spectrometer. Feigenbaum, Buchanan, and Edward Shortliffe developed an expert system called MYCIN to diagnose blood infections. MYCIN used about 450 rules acquired from interviews with experts, and it incorporated certainty factors, a calculus of uncertainty.