The Man Who Shaped Artificial Intelligence

“I last sat down in 2005,” Geoffrey Hinton often says, “and it was a mistake.” In the seventeen years since then, Hinton has never sat down; his severe back problems prevent him from doing so. He travels only by train or car, where he can sprawl across the seats. He cannot fly commercial, since airlines insist that passengers sit during take-off and landing. He eats “like a monk on the altar”, kneeling on a foam cushion at the table. With his trademark wry British humour, he calls his back “a long-standing problem”. In those seventeen years, Geoffrey Hinton, working from the University of Toronto, has also transformed Artificial Intelligence. He brought neural networks back from the cold of the AI Winter, ‘invented’ deep learning, tutored a bevy of geniuses who are at the bleeding edge of AI today, and won the fabled Turing Award while he was at it.

The Artificial Intelligence Genius Makers

I first came across the legend of Geoffrey Hinton in a fabulous book by Cade Metz, a New York Times journalist, ‘Genius Makers’, where he details the lives of the personalities who shaped AI, foremost among them Hinton. After studying psychology at Cambridge University and AI at the University of Edinburgh, Hinton returned to something that had fascinated him since childhood – how the human brain stores memories, and how it works. He was one of the first researchers to work on ‘mimicking’ the human brain in computer hardware and software, thus constructing a newer, purer form of AI, which we now call ‘deep learning’. He started doing this in the 1980s, joined over the years by an intrepid set of collaborators and students, among them Yann LeCun, Yoshua Bengio, and Ilya Sutskever. A landmark 2012 paper he co-authored, ‘Deep Neural Networks for Acoustic Modelling in Speech Recognition’, demonstrated how deep neural networks outclassed older machine learning models like hidden Markov models and Gaussian mixture models at identifying speech patterns. He also co-authored the 1986 paper that popularised ‘backpropagation’, which reportedly was one of the concepts that inspired the Google BackRub search algorithm, the core of its astonishing search engine. Interestingly, Page and Brin toyed with actually naming their company BackRub before they settled on Google.
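Backpropagation is, at heart, the chain rule of calculus used to push an error signal backwards through a network so that each weight can be nudged in the direction that reduces the error. A toy sketch of the idea on a single sigmoid neuron (illustrative only – this is not Hinton’s code, and all names here are invented for the example):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=200, lr=0.5):
    """Fit one sigmoid neuron y = sigmoid(w*x + b) by gradient descent."""
    random.seed(0)
    w, b = random.random(), 0.0
    for _ in range(epochs):
        for x, target in data:
            y = sigmoid(w * x + b)            # forward pass
            # backward pass: chain rule gives dL/dz for squared error
            err = (y - target) * y * (1.0 - y)
            w -= lr * err * x                 # dz/dw = x
            b -= lr * err                     # dz/db = 1
    return w, b

def loss(data, w, b):
    return sum((sigmoid(w * x + b) - t) ** 2 for x, t in data) / len(data)

# Learn the mapping "negative input -> 0, positive input -> 1"
data = [(-2, 0), (-1, 0), (1, 1), (2, 1)]
w, b = train(data)
print(loss(data, w, b))  # the loss shrinks as the weight is trained
```

In a real deep network the same error signal is propagated through many layers, each applying the chain rule to the layer before it; that is what lets a stack of simple units learn speech or image patterns without hand-labelled features for every layer.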

Replicating the Brain Using AI

“I get very excited when we discover a way of making neural networks better — and when that’s closely related to how the brain works,” says Hinton. By mimicking the brain, Hinton sought to move beyond traditional machine learning techniques, in which humans would label pictures, words and objects; instead, his work copied the brain’s self-learning techniques. He and his team built “artificial neurons from interconnected layers of software modelled after the columns of neurons in the brain’s cortex. These neural nets can gather information, react to it, build an understanding of what something looks or sounds like.” (https://bit.ly/3LRJwWo). The AI community did not trust this new approach; Hinton told Sky News that it was “an idea that almost no one on Earth believed in at that point – it was pretty much a dead idea, even among AI researchers”. Well, that sentiment has changed. Deep Learning has been harnessed by Google, Meta, Microsoft, DeepMind, Baidu and almost every other tech company to build driverless cars, predict protein folding, and beat humans at Go. Yann LeCun now leads Meta’s AI efforts, Bengio is doing seminal work at the University of Montreal, and Sutskever co-founded OpenAI, which famously released GPT-3. Hinton himself works part-time for Google, the result of a frenzied bidding war between Google, Microsoft and Baidu, in which he auctioned his company, essentially himself, to Google for $44mn; how this auction unfolded is the stuff of legend in itself. Deep Learning is mainstream now and considered one of the most exciting developments in AI. It is regarded as the surest bet AI has to achieve Artificial General Intelligence, or AGI. As Hinton puts it: “We ceased to be the lunatic fringe. We’re now the lunatic core.”

Hinton and His Influences

It is also interesting to know that Hinton comes from a formidably intellectual and academic family; his mother used to tell him to ‘Be an academic, or be a failure’. His great-great-grandfather was George Boole, who invented Boolean logic and algebra, which became the foundation of modern computers. George’s wife Mary was a well-known teacher of algebra and logic. Mary’s uncle was George Everest, the Surveyor General of India after whom the world’s highest mountain is named. Geoffrey’s great-grandfather, a renowned mathematician, developed the concept of the ‘fourth dimension’ and first drew the tesseract, and his cousin Joan, a nuclear physicist, was one of the few women to work on the Manhattan Project. His father, Howard Hinton, a formidable entomologist and a Fellow of the Royal Society, often told Geoffrey, “Work really hard and maybe when you’re twice as old as me, you’ll be half as good.” Geoffrey Hinton did work hard, became the Godfather of Deep Learning, a Turing Award winner, and a Fellow of the Royal Society. And he is not sitting on his laurels.

FAQ

Geoffrey Hinton, Yoshua Bengio and Yann LeCun are considered the ‘Godfathers of AI’ and the ‘Godfathers of deep learning’. For their pioneering work on deep learning, they received the 2018 Turing Award.

Geoffrey Hinton has written or co-edited several books, among them “Neural Network Architectures for Artificial Intelligence”, “Unsupervised Learning: Foundations of Neural Computation”, “Connectionist Symbol Processing” and “Parallel Models of Associative Memory”.
