
In partnership with Google, the Computer History Museum has released the source code to AlexNet, the neural network that in 2012 kickstarted today's prevailing approach to artificial intelligence. The source code is available as open source on CHM's GitHub page.
What Is AlexNet?
AlexNet is an artificial neural network created to recognize the contents of photographic images. It was developed in 2012 by University of Toronto graduate students Alex Krizhevsky and Ilya Sutskever and their faculty advisor, Geoffrey Hinton.
Hinton is regarded as one of the fathers of deep learning, the type of artificial intelligence that uses neural networks and is the foundation of today's mainstream AI. Simple three-layer neural networks with only one layer of adaptive weights were built in the late 1950s, most notably by Cornell researcher Frank Rosenblatt, but they turned out to have limitations. [This explainer gives more details on how neural networks work.] In particular, researchers needed networks with more than one layer of adaptive weights, but there was no good way to train them. By the early 1970s, neural networks had been largely rejected by AI researchers.
Frank Rosenblatt [left, shown with Charles W. Wightman] developed the first artificial neural network, the Perceptron, in 1957. Division of Rare and Manuscript Collections/Cornell University Library
In the 1980s, neural network research was revived outside the AI community by cognitive scientists at the University of California, San Diego, under the new name of "connectionism." After completing his Ph.D. at the University of Edinburgh, Hinton had moved to UC San Diego, where he collaborated with David Rumelhart and Ronald Williams. The three rediscovered the backpropagation algorithm for training neural networks, and in 1986 they published two papers showing that it enabled neural networks to learn multiple layers of features for language and vision tasks. Backpropagation, which is foundational to deep learning today, uses the difference between the network's current output and its desired output to adjust the weights in each layer, from the output layer backward to the input layer.
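That update rule can be made concrete with a small example. Below is a minimal sketch of backpropagation for a toy two-layer network in Python with NumPy; the layer sizes, activation function, learning rate, and data are illustrative assumptions, not details of any network discussed in this article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples with 3 features each, and binary targets.
X = rng.normal(size=(4, 3))
y = np.array([[0.0], [1.0], [1.0], [0.0]])

# Two layers of adaptive weights: input -> hidden -> output.
W1 = rng.normal(scale=0.5, size=(3, 5))
W2 = rng.normal(scale=0.5, size=(5, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5  # learning rate
for step in range(1000):
    # Forward pass: compute the network's current output.
    h = sigmoid(X @ W1)    # hidden-layer activations
    out = sigmoid(h @ W2)  # network output

    # Error: difference between current and desired output.
    err = out - y

    # Backward pass: propagate the error from the output layer
    # back toward the input layer, then adjust each weight layer.
    grad_out = err * out * (1 - out)          # output-layer delta
    grad_h = (grad_out @ W2.T) * h * (1 - h)  # hidden-layer delta
    W2 -= lr * (h.T @ grad_out)
    W1 -= lr * (X.T @ grad_h)
```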
In 1987, Hinton joined the University of Toronto. Away from the centers of traditional AI, the work of Hinton and his graduate students made Toronto a center of deep learning research over the coming decades. One postdoctoral student of Hinton's was Yann LeCun, now chief scientist at Meta. While working in Toronto, LeCun showed that when backpropagation was used in "convolutional" neural networks, they became very good at recognizing handwritten digits.
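For a sense of what "convolutional" means here, the sketch below applies one small filter across an image in the same NumPy style; the image size and filter values are made up for illustration and are not drawn from LeCun's or Krizhevsky's actual code.

```python
import numpy as np

def conv2d(image, kernel):
    """Slide a small filter over the image, computing a weighted sum
    at each position. Reusing the same weights everywhere (weight
    sharing) is what makes a layer 'convolutional'."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

image = np.random.default_rng(1).normal(size=(28, 28))  # e.g., a digit image
edge_filter = np.array([[1.0, 0.0, -1.0]] * 3)          # crude vertical-edge detector
feature_map = conv2d(image, edge_filter)                # 26 x 26 response map
```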
ImageNet and GPUs
Despite these advances, neural networks could not consistently outperform other types of machine learning algorithms. They needed two developments from outside AI to pave the way. The first was the emergence of vastly larger amounts of training data, made available through the web. The second was enough computational power to perform this training, in the form of 3D graphics chips, known as graphics processing units (GPUs). By 2012, the time was ripe for AlexNet.
Fei-Fei Li's ImageNet image dataset, completed in 2009, was pivotal in training AlexNet. Here, Li [right] talks with Tom Kalil at the Computer History Museum. Douglas Fairbairn/Computer History Museum
The data needed to train AlexNet was found in ImageNet, a project started and led by Stanford professor Fei-Fei Li. Beginning in 2006, and against conventional wisdom, Li envisioned a dataset of images covering every noun in the English language. She and her graduate students began collecting images found on the Internet and classifying them using WordNet, a database of words and their relationships to each other. Given the enormity of their task, Li and her collaborators ultimately crowdsourced the job of labeling images to gig workers, using Amazon's Mechanical Turk platform.
Completed in 2009, ImageNet was larger than any previous image dataset by several orders of magnitude. Li hoped its availability would spur new breakthroughs, and in 2010 she started a competition to encourage research teams to improve their image recognition algorithms. But over the next two years, even the best systems made only marginal improvements.
The second condition necessary for the success of neural networks was economical access to vast amounts of computation. Neural network training involves many repeated matrix multiplications, preferably performed in parallel, which is exactly what GPUs are designed to do. NVIDIA, cofounded by CEO Jensen Huang, had led the way in the 2000s in making GPUs more general-purpose and programmable for applications beyond 3D graphics, especially with the CUDA programming system released in 2007.
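To illustrate why this workload suits parallel hardware, the short sketch below computes a matrix product with explicit loops: every output entry is an independent dot product, so in principle all of them can be computed simultaneously, which is what a GPU exploits. The matrix sizes here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(64, 128))  # e.g., a batch of activations
B = rng.normal(size=(128, 32))  # e.g., a layer's weights

# Naive triple loop: each C[i, j] is an independent dot product,
# so all entries can be computed at once on parallel hardware.
C = np.zeros((64, 32))
for i in range(64):
    for j in range(32):
        C[i, j] = np.dot(A[i, :], B[:, j])

assert np.allclose(C, A @ B)  # matches the optimized routine
```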
Both ImageNet and CUDA were, like neural networks themselves, fairly niche developments waiting for the right circumstances to shine. In 2012, AlexNet brought these elements together (deep neural networks, big datasets, and GPUs) for the first time, with pathbreaking results. Each needed the other.
How AlexNet Was Created
By the late 2000s, Hinton's graduate students at the University of Toronto were beginning to use GPUs to train neural networks for both image and speech recognition. Their first successes came in speech recognition, but success in image recognition would point to deep learning as a possible general-purpose solution to AI. One student, Ilya Sutskever, believed that the performance of neural networks would scale with the amount of available data, and the arrival of ImageNet provided the opportunity.
In 2011, Sutskever persuaded his fellow graduate student Alex Krizhevsky, who had a keen ability to wring maximum performance out of GPUs, to train a convolutional neural network for ImageNet, with Hinton serving as principal investigator.
AlexNet used NVIDIA GPUs running CUDA code trained on the ImageNet dataset. NVIDIA CEO Jensen Huang was named a 2024 CHM Fellow for his contributions to computer graphics chips and AI. Douglas Fairbairn/Computer History Museum
Krizhevsky had already written CUDA code for a convolutional neural network using NVIDIA GPUs, called cuda-convnet, trained on the much smaller CIFAR-10 dataset. He extended cuda-convnet with support for multiple GPUs and other features, and retrained it on ImageNet. The training was done on a computer with two NVIDIA cards in Krizhevsky's bedroom at his parents' house. Over the next year, he continually tweaked the network's parameters and retrained it until it achieved performance superior to its competitors. The network would ultimately be named AlexNet, after Krizhevsky. Geoff Hinton summed up the AlexNet project this way: "Ilya thought we should do it, Alex made it work, and I got the Nobel Prize."
Krizhevsky, Sutskever, and Hinton wrote a paper on AlexNet that was published in the fall of 2012 and presented by Krizhevsky at a computer vision conference in Florence, Italy, in October. Veteran computer vision researchers weren't convinced, but LeCun, who was at the meeting, pronounced it a turning point for AI. He was right. Before AlexNet, almost none of the leading computer vision papers used neural networks. After it, almost all of them would.
AlexNet was just the beginning. In the next decade, neural networks would advance to synthesize believable human voices, beat champion Go players, and generate artwork, culminating in the release of ChatGPT in November 2022 by OpenAI, a company cofounded by Sutskever.
Releasing the AlexNet Source Code
In 2020, I reached out to Krizhevsky to ask about the possibility of allowing CHM to release the AlexNet source code, because of its historical significance. He connected me to Hinton, who was working at Google at the time. Google owned AlexNet, having acquired DNNresearch, the company owned by Hinton, Sutskever, and Krizhevsky. Hinton got the ball rolling by connecting CHM to the right team at Google. CHM worked with the Google team for five years to negotiate the release. The team also helped us identify the specific version of the AlexNet source code to release; there have been many versions of AlexNet over the years. There are other repositories of code called AlexNet on GitHub, but many of those are re-creations based on the famous paper, not the original code.
CHM is proud to present the source code to the 2012 version of AlexNet, which transformed the field of artificial intelligence. You can access the source code on CHM's GitHub page.
This post originally appeared on the blog of the Computer History Museum.
Acknowledgments
Special thanks to Geoffrey Hinton for providing his quote and reviewing the text, to Cade Metz and Alex Krizhevsky for additional clarifications, and to David Bieber and the rest of the team at Google for their work in securing the source code release.