Editing Perceptron (section)
=== Subsequent work ===

Rosenblatt continued working on perceptrons despite diminishing funding. His last attempt was Tobermory, a machine built between 1961 and 1967 for speech recognition.<ref>Rosenblatt, Frank (1962). "''[https://web.archive.org/web/20231230210135/https://apps.dtic.mil/sti/tr/pdf/AD0420696.pdf#page=163 A Description of the Tobermory Perceptron]''." Cognitive Research Program. Report No. 4. Collected Technical Papers, Vol. 2. Edited by Frank Rosenblatt. Ithaca, NY: Cornell University.</ref> It occupied an entire room.<ref name=":7">Nagy, George. 1963. ''[https://web.archive.org/web/20231230204827/https://apps.dtic.mil/sti/trecms/pdf/AD0607459.pdf System and circuit designs for the Tobermory perceptron]''. Technical report number 5, Cognitive Systems Research Program, Cornell University, Ithaca New York.</ref> It had 4 layers with 12,000 weights implemented by toroidal [[magnetic core]]s. By the time of its completion, simulation on digital computers had become faster than purpose-built perceptron machines.<ref>Nagy, George. "Neural networks-then and now." ''IEEE Transactions on Neural Networks'' 2.2 (1991): 316–318.</ref> Rosenblatt died in a boating accident in 1971.

[[File:Isometric view of Tobermory Phase I.png|thumb|Isometric view of Tobermory Phase I.<ref name=":7" />]]

The [[kernel perceptron]] algorithm had already been introduced in 1964 by Aizerman et al.<ref>{{cite journal |last1=Aizerman |first1=M. A. |last2=Braverman |first2=E. M. |last3=Rozonoer |first3=L. I. |year=1964 |title=Theoretical foundations of the potential function method in pattern recognition learning |journal=Automation and Remote Control |volume=25 |pages=821–837 }}</ref> Margin bound guarantees for the perceptron algorithm in the general non-separable case were first given by [[Yoav Freund|Freund]] and [[Robert Schapire|Schapire]] (1998),<ref name="largemargin">{{Cite journal |doi=10.1023/A:1007662407062 |year=1999 |title=Large margin classification using the perceptron algorithm |last1=Freund |first1=Y. |author-link1=Yoav Freund |journal=[[Machine Learning (journal)|Machine Learning]] |volume=37 |issue=3 |pages=277–296 |last2=Schapire |first2=R. E. |s2cid=5885617 |author-link2=Robert Schapire |url=http://cseweb.ucsd.edu/~yfreund/papers/LargeMarginsUsingPerceptron.pdf|doi-access=free }}</ref> and more recently by [[Mehryar Mohri|Mohri]] and Rostamizadeh (2013), who extended previous results and gave new, more favorable L1 bounds.<ref>{{cite arXiv |last1=Mohri |first1=Mehryar |last2=Rostamizadeh |first2=Afshin |title=Perceptron Mistake Bounds |eprint=1305.0208 |year=2013 |class=cs.LG }}</ref><ref>[https://mitpress.mit.edu/books/foundations-machine-learning-second-edition ''Foundations of Machine Learning''], MIT Press (Chapter 8).</ref>

The perceptron is a simplified model of a biological [[neuron]]. While the complexity of [[biological neuron model]]s is often required to fully understand neural behavior, research suggests that a perceptron-like linear model can produce some of the behavior seen in real neurons.<ref>{{cite journal |last1=Cash |first1=Sydney |first2=Rafael |last2=Yuste |title=Linear Summation of Excitatory Inputs by CA1 Pyramidal Neurons |journal=[[Neuron (journal)|Neuron]] |volume=22 |issue=2 |year=1999 |pages=383–394 |doi=10.1016/S0896-6273(00)81098-3 |pmid=10069343 |doi-access=free }}</ref> The solution spaces of decision boundaries for all binary functions, and the corresponding learning behaviors, are studied by Liou, Liou and Liou (2013).<ref>{{cite book |last1=Liou |first1=D.-R. |title=Learning Behaviors of Perceptron |last2=Liou |first2=J.-W. |last3=Liou |first3=C.-Y. |publisher=iConcept Press |year=2013 |isbn=978-1-477554-73-9}}</ref>
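The kernel perceptron mentioned above replaces the explicit weight vector with a sum of kernel evaluations against the training examples on which mistakes were made. A minimal illustrative sketch (the RBF kernel, `gamma` value, and function names here are assumptions for demonstration, not from Aizerman et al.'s original formulation):

```python
import numpy as np

def rbf_kernel(x, y, gamma=1.0):
    """Gaussian (RBF) kernel; an assumed choice for this sketch."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def kernel_perceptron_train(X, y, kernel, epochs=10):
    """Train a kernel perceptron: alpha[i] counts mistakes on example i."""
    n = len(X)
    alpha = np.zeros(n)
    # Precompute the Gram matrix K[i, j] = kernel(X[i], X[j]).
    K = np.array([[kernel(xi, xj) for xj in X] for xi in X])
    for _ in range(epochs):
        for i in range(n):
            # Prediction is sign of sum_j alpha_j * y_j * K(x_j, x_i).
            pred = np.sign(np.sum(alpha * y * K[:, i]))
            if pred != y[i]:
                alpha[i] += 1  # mistake: remember this example
    return alpha

def kernel_perceptron_predict(X_train, y_train, alpha, kernel, x):
    """Classify a new point using only examples with nonzero alpha."""
    s = sum(a * yi * kernel(xi, x)
            for a, yi, xi in zip(alpha, y_train, X_train) if a > 0)
    return 1 if s >= 0 else -1
```

With a nonlinear kernel such as the RBF above, this learner can fit problems the plain perceptron cannot, e.g. the XOR labeling of four points, since the decision boundary is linear only in the kernel-induced feature space.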