Foreign sources (Berdyshev)


1) https://www.scopus.com/record/display.uri?eid=2-s2.0-0032482432&origin=resultslist&sort=cp-f&src=s&st1=neural+network&nlo=&nlr=&nls=&sid=679b26cac29a36193382ff45f66bb4ec&sot=b&sdt=cl&cluster=scosubtype%2c%22ar%22%2ct&sl=29&s=TITLE-ABS-KEY%28neural+network%29&relpos=0&citeCnt=24191&searchTerm= Collective dynamics of 'small-world' networks (Article)

   Watts, D.J., Strogatz, S.H.
   Department of Theoretical and Applied Mechanics, Kimball Hall, Cornell University, Ithaca, NY 14853, United States

Prominence percentile: 96.161

2) https://www.scopus.com/record/display.uri?eid=2-s2.0-84876231242&origin=resultslist&sort=cp-f&src=s&st1=neural+network&nlo=&nlr=&nls=&sid=e9776e07715749d9aac159285aa4bdba&sot=b&sdt=b&sl=29&s=TITLE-ABS-KEY%28neural+network%29&relpos=1&citeCnt=29206&searchTerm=

Advances in Neural Information Processing Systems, Volume 2, 2012, Pages 1097-1105. 26th Annual Conference on Neural Information Processing Systems 2012, NIPS 2012; Lake Tahoe, NV, United States; 3 December 2012 to 6 December 2012; Code 96883. ImageNet classification with deep convolutional neural networks (Conference Paper)

   Krizhevsky, A., Sutskever, I., Hinton, G.E.
   University of Toronto, Canada

Abstract

We trained a large, deep convolutional neural network to classify the 1.2 million high-resolution images in the ImageNet LSVRC-2010 contest into the 1000 different classes. On the test data, we achieved top-1 and top-5 error rates of 37.5% and 17.0% which is considerably better than the previous state-of-the-art. The neural network, which has 60 million parameters and 650,000 neurons, consists of five convolutional layers, some of which are followed by max-pooling layers, and three fully-connected layers with a final 1000-way softmax. To make training faster, we used non-saturating neurons and a very efficient GPU implementation of the convolution operation. To reduce overfitting in the fully-connected layers we employed a recently-developed regularization method called "dropout" that proved to be very effective. We also entered a variant of this model in the ILSVRC-2012 competition and achieved a winning top-5 test error rate of 15.3%, compared to 26.2% achieved by the second-best entry.

SciVal Topic Prominence. Topic: Convolution | Neural networks | Convolutional network. Prominence percentile: 99.989
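
Since the abstract enumerates the architecture precisely, it can be restated as code. Below is a minimal sketch of an AlexNet-style network in PyTorch; this is our own rendition, not the authors' original two-GPU CUDA implementation, and details such as local response normalization and the two-GPU split are omitted.

```python
import torch
import torch.nn as nn

class AlexNetSketch(nn.Module):
    """Sketch of the architecture from the abstract: five convolutional
    layers (some followed by max-pooling), three fully connected layers,
    a 1000-way softmax, non-saturating ReLU units, and dropout in the
    fully connected layers."""
    def __init__(self, num_classes: int = 1000):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=4), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(96, 256, kernel_size=5, padding=2), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(256, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(384, 384, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        self.classifier = nn.Sequential(
            nn.Dropout(p=0.5),                      # "dropout" regularization
            nn.Linear(256 * 6 * 6, 4096), nn.ReLU(inplace=True),
            nn.Dropout(p=0.5),
            nn.Linear(4096, 4096), nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),           # logits for the 1000-way softmax
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)        # convolutional feature extractor
        x = torch.flatten(x, 1)
        return self.classifier(x)   # softmax is applied by the loss function

# Usage: a 227x227 RGB input yields 6x6x256 features before the FC layers.
model = AlexNetSketch()
logits = model(torch.randn(1, 3, 227, 227))
print(logits.shape)  # torch.Size([1, 1000])
```

With these layer widths the parameter count comes out near the 60 million quoted in the abstract.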



3) https://www.scopus.com/record/display.uri?eid=2-s2.0-0032482432&origin=resultslist&sort=cp-f&src=s&st1=neural+network&nlo=&nlr=&nls=&sid=e9776e07715749d9aac159285aa4bdba&sot=b&sdt=b&sl=29&s=TITLE-ABS-KEY%28neural+network%29&relpos=2&citeCnt=24191&searchTerm= Collective dynamics of 'small-world' networks (Article)

   Watts, D.J., Strogatz, S.H.
   Department of Theoretical and Applied Mechanics, Kimball Hall, Cornell University, Ithaca, NY 14853, United States

Abstract

Networks of coupled dynamical systems have been used to model biological oscillators [1-4], Josephson junction arrays [5,6], excitable media [7], neural networks [8-10], spatial games [11], genetic control networks [12] and many other self-organizing systems. Ordinarily, the connection topology is assumed to be either completely regular or completely random. But many biological, technological and social networks lie somewhere between these two extremes. Here we explore simple models of networks that can be tuned through this middle ground: regular networks 'rewired' to introduce increasing amounts of disorder. We find that these systems can be highly clustered, like regular lattices, yet have small characteristic path lengths, like random graphs. We call them 'small-world' networks, by analogy with the small-world phenomenon [13,14] (popularly known as six degrees of separation [15]). The neural network of the worm Caenorhabditis elegans, the power grid of the western United States, and the collaboration graph of film actors are shown to be small-world networks. Models of dynamical systems with small-world coupling display enhanced signal-propagation speed, computational power, and synchronizability. In particular, infectious diseases spread more easily in small-world networks than in regular lattices.

SciVal Topic Prominence. Topic: Models | Complex networks | Preferential attachment. Prominence percentile: 96.161
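
The construction the abstract describes, a ring lattice whose edges are rewired with probability p, is easy to reproduce numerically. The sketch below uses the networkx library (our assumption; the 1998 paper naturally predates it) to show the small-world regime: clustering stays high while the characteristic path length collapses.

```python
import networkx as nx

# Ring lattice of n nodes, each linked to its k nearest neighbours;
# every edge is then rewired with probability p (Watts-Strogatz model).
n, k = 1000, 10

for p in [0.0, 0.01, 0.1, 1.0]:
    # The connected_* variant retries until the rewired graph is
    # connected, so the average path length below is well defined.
    G = nx.connected_watts_strogatz_graph(n, k, p, seed=42)
    C = nx.average_clustering(G)
    L = nx.average_shortest_path_length(G)
    print(f"p={p:<5} clustering C={C:.3f} path length L={L:.2f}")
```

Around p = 0.01 the output should show C still close to the lattice value while L has already dropped toward the random-graph value, which is exactly the 'small-world' middle ground the paper identifies.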

4) https://www.scopus.com/record/display.uri?eid=2-s2.0-0242490780&origin=resultslist&sort=cp-f&src=s&st1=neural+network&nlo=&nlr=&nls=&sid=679b26cac29a36193382ff45f66bb4ec&sot=b&sdt=cl&cluster=scosubtype%2c%22ar%22%2ct&sl=29&s=TITLE-ABS-KEY%28neural+network%29&relpos=5&citeCnt=11768&searchTerm= Collective dynamics of 'small-world' networks (Article)

   Watts, D.J., Strogatz, S.H.
   Department of Theoretical and Applied Mechanics, Kimball Hall, Cornell University, Ithaca, NY 14853, United States


5) https://www.scopus.com/record/display.uri?eid=2-s2.0-33745805403&origin=resultslist&sort=cp-f&src=s&st1=neural+network&nlo=&nlr=&nls=&sid=679b26cac29a36193382ff45f66bb4ec&sot=b&sdt=cl&cluster=scosubtype%2c%22ar%22%2ct&sl=29&s=TITLE-ABS-KEY%28neural+network%29&relpos=12&citeCnt=7295&searchTerm= A fast learning algorithm for deep belief nets (Article)

   Hinton, G.E.a, Osindero, S.a, Teh, Y.-W.b
   aDepartment of Computer Science, University of Toronto, Toronto, M5S 3G4, Canada
   bDepartment of Computer Science, National University of Singapore, Singapore 117543, Singapore

Abstract

We show how to use "complementary priors" to eliminate the explaining-away effects that make inference difficult in densely connected belief nets that have many hidden layers. Using complementary priors, we derive a fast, greedy algorithm that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory. The fast, greedy algorithm is used to initialize a slower learning procedure that fine-tunes the weights using a contrastive version of the wake-sleep algorithm. After fine-tuning, a network with three hidden layers forms a very good generative model of the joint distribution of handwritten digit images and their labels. This generative model gives better digit classification than the best discriminative learning algorithms. The low-dimensional manifolds on which the digits lie are modeled by long ravines in the free-energy landscape of the top-level associative memory, and it is easy to explore these ravines by using the directed connections to display what the associative memory has in mind. © 2006 Massachusetts Institute of Technology.

SciVal Topic Prominence. Topic: Neural networks | Learning systems | Deep belief. Prominence percentile: 99.641
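
The greedy layer-by-layer procedure from the abstract can be sketched compactly: train a restricted Boltzmann machine (RBM) on the data, then train the next RBM on its hidden activations, and so on. The NumPy code below is a toy illustration using one-step contrastive divergence (CD-1), a common stand-in for the training rule; it omits the top-level associative memory and the wake-sleep fine-tuning stage described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """One restricted Boltzmann machine layer, trained with one step of
    contrastive divergence (CD-1). A simplified sketch, not the authors'
    exact procedure."""
    def __init__(self, n_visible, n_hidden):
        self.W = rng.normal(0.0, 0.01, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases

    def hidden_probs(self, v):
        return sigmoid(v @ self.W + self.b_h)

    def visible_probs(self, h):
        return sigmoid(h @ self.W.T + self.b_v)

    def cd1_update(self, v0, lr=0.1):
        # Positive phase: sample hidden units given the data.
        ph0 = self.hidden_probs(v0)
        h0 = (rng.random(ph0.shape) < ph0).astype(float)
        # Negative phase: one Gibbs step back to a reconstruction.
        pv1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(pv1)
        # Approximate log-likelihood gradient, averaged over the batch.
        batch = v0.shape[0]
        self.W += lr * (v0.T @ ph0 - pv1.T @ ph1) / batch
        self.b_v += lr * (v0 - pv1).mean(axis=0)
        self.b_h += lr * (ph0 - ph1).mean(axis=0)

def pretrain_deep_belief_net(data, layer_sizes, epochs=5):
    """Greedy layer-wise pretraining: train one RBM, then feed its hidden
    activations to the next RBM, one layer at a time."""
    rbms, x = [], data
    for n_hid in layer_sizes:
        rbm = RBM(x.shape[1], n_hid)
        for _ in range(epochs):
            rbm.cd1_update(x)
        rbms.append(rbm)
        x = rbm.hidden_probs(x)   # activations become the next layer's input
    return rbms

# Toy usage on random binary "images" (784 pixels, as for MNIST digits).
data = (rng.random((100, 784)) < 0.1).astype(float)
stack = pretrain_deep_belief_net(data, layer_sizes=[256, 128, 64])
print([r.W.shape for r in stack])  # [(784, 256), (256, 128), (128, 64)]
```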



CONCLUSION. The relevance of research in this area is confirmed by the wide range of ANN applications: automated pattern recognition, adaptive control, approximation of functionals, forecasting, expert systems, associative memory, and many others. ANNs can be used, for example, to predict stock-market indicators, recognize optical or acoustic signals, and build self-learning systems capable of steering a car during parking or synthesizing speech from text. While in the West neural networks are already in broad practical use, here they remain something of an exotic technology: the Russian companies that apply neural networks in practice can be counted on the fingers of one hand.
