History of Thinking Machines

What is the history of Thinking Machines?

The terms artificial intelligence, big data intelligence, and even automaton are all associated with modern technologies describing machines that move on their own or that resemble human or animal behavior.

The term artificial intelligence was coined by John McCarthy in 1956 at the first academic conference on the subject, the Dartmouth conference. A decade earlier, in his seminal 1945 essay “As We May Think,” Vannevar Bush had proposed a system for amplifying people’s knowledge and understanding.

Five years later, in 1950, Alan Turing wrote a paper on the notion of machines simulating human beings and doing intelligent things, such as playing chess.

Philosophers and inventors have been exploring the concept of thinking machines for centuries. Thinkers in ancient times described human thought processes and compared them to mechanical operations: the manipulation of symbols according to rules of logic.

Before computers, various mechanical devices were designed to perform calculations, from the abacus to the later mechanical calculators.

In the 19th century, Charles Babbage laid the groundwork for modern computers with his designs for mechanical calculating engines. At the core is the idea that human thinking can be broken down into the manipulation of symbols, a concept that underpins both logic and computation.

One of the earliest accounts dates to the 15th century BC in Egypt, under Amenhotep: the “Statue of Memnon, King of Ethiopia, uttered a melodious sound when struck by the sun’s rays in the morning and during sunset.”

In 1493, Leonardo da Vinci sketched a device for calculation, an early forerunner of today’s far more sophisticated calculators.

In 1792, Johann Reichold built an arithmetic machine.

In 1900, Alexander Rechnitzer introduced the world’s first motor-driven calculating machine, which was also the first to embody fully automatic multiplication and division.

In 1905, Christel Hamann built a mechanical calculator based on the stepped drum of Leibniz, mounted in the center of the machine.

Three pioneers stand out as laying the philosophical and technical groundwork for the field of artificial intelligence: John von Neumann, Alan Turing, and Claude Shannon.

Von Neumann was born in 1903 into a Jewish banking family in Budapest, Hungary, and became world-renowned for his intellect. According to legend, he could divide eight-digit numbers in his head at the age of six.

Von Neumann’s major contribution to computing was helping to establish the idea of the stored program: a program kept in the computer’s own memory. He was also the first person to apply the human term “memory” to a computer.

Von Neumann never believed that a computer would be able to think the way humans can, but he did draw parallels between computers and human physiology.

The second contributor is Alan Turing, a British mathematician and cryptanalyst. He is famous for leading a team during the Second World War at the Government Code and Cypher School, Britain’s secret code-breaking center at Bletchley Park.

His techniques for cracking German codes are well known; most famously, he designed the Bombe, an electromechanical device capable of working out the settings of the Enigma machine.

A movie about Alan Turing, “The Imitation Game,” was released in 2014, with Benedict Cumberbatch as Turing and Keira Knightley as Joan Clarke.

Turing’s most significant concept was the Universal Turing Machine, introduced in his 1936 paper “On Computable Numbers.” The idea is that a single machine can perform a variety of tasks by reading step-by-step instructions from a tape. Turing later wrote about the importance of such a computer, saying that it “could in fact be made to work as a model of any other machine.”

As Turing noted, “The engineering problem of producing various machines for various jobs is replaced by the office work of ‘programming’ the universal machine to do these jobs.”
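
To see the idea concretely, here is a minimal sketch in Python (the function name, rule-table format, and example program are my own illustrative choices, not anything from Turing) of one generic interpreter that carries out whatever job its instruction table describes:

```python
# A minimal Turing machine interpreter: a single generic loop that executes
# whatever transition table it is handed, so changing the "job" means
# changing the table rather than building a new machine.

def run(tape, rules, state="start", blank="_", max_steps=10_000):
    """Execute a transition table until the machine enters the 'halt' state.

    rules maps (state, symbol) -> (next_state, symbol_to_write, head_move),
    where head_move is -1 (left), 0 (stay), or +1 (right).
    """
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, cells[head], move = rules[(state, symbol)]
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A hypothetical rule table: invert every bit on the tape, then halt.
flip_bits = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run("1011", flip_bits))  # prints 0100
```

The run loop never changes; to make the machine do a different job, you hand it a different rule table. That is exactly the “office work of ‘programming’” Turing described.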

Turing also hypothesized about reproducing human intelligence. In a paper entitled “Intelligent Machinery,” he considered what it would take to reproduce intelligence inside a machine, a particular challenge given the limitations of computers at the time.

“The memory capacity of the human brain is probably of the order of ten thousand million binary digits,” Turing wrote. “But most of this is probably used in remembering visual impressions and other comparatively wasteful ways. One might reasonably hope to be able to make some real progress with a few million digits.” (Ten thousand million binary digits is 10 billion bits, roughly 1.25 gigabytes in modern terms.)

The third pioneer is Claude Shannon, known today as the father of “information theory.” Born in 1916, he was the youngest of the three. One of Shannon’s big contributions to computing related to the work done by transistors, the on-off switches inside a computer.

A computer follows the instructions in its programs by switching these transistors on and off. Shannon argued that when certain transistors switch on or off in response to other transistors, the computer is performing basic reasoning: if transistor 1 switches on only when transistors 2 and 3 are both on, that is a logical operation.

Shannon’s work elaborated on the logical operations that such transistors could perform.
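
Shannon’s observation is easy to demonstrate. The sketch below (in Python, with gate names of my own choosing as illustrative stand-ins for switching circuits) models each transistor as an on/off boolean value and shows how switches composed this way amount to logic:

```python
# Shannon's insight in miniature: on/off switches compose into Boolean logic.
# Each "transistor" is modeled as a Python boolean; the gate functions stand
# in for circuits built from such switches.

def AND(a, b):
    return a and b   # on only when both inputs are on

def OR(a, b):
    return a or b    # on when at least one input is on

def NOT(a):
    return not a     # inverts the input

# The example from the text: transistor 1 switches on only when
# transistors 2 and 3 are both on, i.e. a logical AND.
t2, t3 = True, True
t1 = AND(t2, t3)
print(t1)  # True

# Such gates compose into more elaborate reasoning, e.g. exclusive-or:
def XOR(a, b):
    return OR(AND(a, NOT(b)), AND(NOT(a), b))

print(XOR(True, False))  # True
print(XOR(True, True))   # False
```

From a handful of operations like these, entire arithmetic and decision-making circuits can be assembled, which is why mapping switching onto logic mattered so much.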

Due to the untimely deaths of Alan Turing in 1954 and John von Neumann in 1957, Claude Shannon was the only one of the three to play an active role in the official formation of artificial intelligence as a field.

It did not take long for artificial intelligence to grow into different specialties, as the UK’s “Mechanization of Thought Processes” conference, organized at the National Physical Laboratory in Teddington, Middlesex, in 1958, demonstrates.

Just two years after the Dartmouth conference, the field had already split into “artificial thinking, character and pattern recognition, learning, mechanical language translation, biology, automatic programming, industrial planning, and clerical mechanization.”
