Age, Biography and Wiki

Sepp Hochreiter was born on 14 February 1967 in Mühldorf, West Germany. Discover Sepp Hochreiter's biography, age, height, physical stats, dating/affairs, family, and career updates. Learn how rich he is this year, how he spends money, and how he earned most of his net worth by the age of 57.

Popular As N/A
Occupation Computer scientist
Age 57 years old
Zodiac Sign Aquarius
Born 14 February, 1967
Birthday 14 February
Birthplace Mühldorf, West Germany
Nationality German

We recommend you check the complete list of famous people born on 14 February. He is a member of the group of famous people who are 57 years old.

Sepp Hochreiter Height, Weight & Measurements

At 57 years old, Sepp Hochreiter's height is not available right now. We will update Sepp Hochreiter's height, weight, body measurements, eye color, hair color, and shoe and dress size as soon as possible.

Physical Status
Height Not Available
Weight Not Available
Body Measurements Not Available
Eye Color Not Available
Hair Color Not Available

Dating & Relationship status

He is currently single and not dating anyone. We don't have much information about his past relationships or any previous engagements. According to our database, he has no children.

Family
Parents Not Available
Wife Not Available
Sibling Not Available
Children Not Available

Sepp Hochreiter Net Worth

His net worth has been growing significantly in 2022-2023. So, how much is Sepp Hochreiter worth at the age of 57? Sepp Hochreiter's income comes mostly from being a successful computer scientist. He is from Germany. We have estimated Sepp Hochreiter's net worth, money, salary, income, and assets.

Net Worth in 2023 $1 Million - $5 Million
Salary in 2023 Under Review
Net Worth in 2022 Pending
Salary in 2022 Under Review
House Not Available
Cars Not Available
Source of Income Computer scientist

Sepp Hochreiter Social Network

Wikipedia Sepp Hochreiter

Timeline

2014

Neural networks are simplified mathematical models of biological neural networks like those in human brains. In feedforward neural networks (NNs), information moves in only one direction: from the input layer, which receives information from the environment, through the hidden layers, to the output layer, which supplies information to the environment. Unlike feedforward NNs, recurrent neural networks (RNNs) can use their internal memory to process arbitrary sequences of inputs. If data mining is based on neural networks, overfitting reduces the network's ability to correctly process future data. To avoid overfitting, Sepp Hochreiter developed algorithms for finding low-complexity neural networks like "Flat Minimum Search" (FMS), which searches for a "flat" minimum: a large connected region in parameter space where the network function is constant. The network parameters can then be given with low precision, which means a low-complexity network that avoids overfitting. Low-complexity neural networks are well suited for deep learning because they control the complexity in each network layer and therefore learn hierarchical representations of the input.

Sepp Hochreiter's group introduced "exponential linear units" (ELUs), which speed up learning in deep neural networks and lead to higher classification accuracies. Like rectified linear units (ReLUs), leaky ReLUs (LReLUs), and parametrized ReLUs (PReLUs), ELUs alleviate the vanishing gradient problem via the identity for positive values. However, ELUs have improved learning characteristics compared to ReLUs due to their negative values, which push mean unit activations closer to zero. Mean shifts toward zero speed up learning by bringing the normal gradient closer to the unit natural gradient, because of a reduced bias shift effect. (A code sketch of these activations follows at the end of this entry.)

Sepp Hochreiter introduced self-normalizing neural networks (SNNs), which allow feedforward networks to learn abstract representations of the input on different levels. SNNs avoid the problems of batch normalization since the activations across samples automatically converge to mean zero and variance one. SNNs are an enabling technology to (1) train very deep networks, that is, networks with many layers, (2) use novel regularization strategies, and (3) learn very robustly across many layers.

In unsupervised deep learning, generative adversarial networks (GANs) are very popular since they create new images that are more realistic than those obtained from other generative approaches. Sepp Hochreiter proposed a two time-scale update rule (TTUR) for learning GANs with stochastic gradient descent on any differentiable loss function. Methods from stochastic approximation have been used to prove that the TTUR converges to a stationary local Nash equilibrium. This is the first proof of the convergence of GANs in a general setting. Another contribution is the introduction of the "Fréchet Inception Distance" (FID), a more appropriate quality measure for GANs than the previously used Inception Score (a sketch of the FID computation also follows below).

He developed rectified factor networks (RFNs) to efficiently construct very sparse, non-linear, high-dimensional representations of the input. RFN models identify rare and small events in the input, have low interference between code units, have a small reconstruction error, and explain the data covariance structure. RFN learning is a generalized alternating minimization algorithm derived from the posterior regularization method, which enforces non-negative and normalized posterior means. RFNs were very successfully applied in bioinformatics and genetics.
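As a concrete illustration of the activation functions above, here is a minimal NumPy sketch of the ELU and of the scaled ELU (SELU) used by self-normalizing networks. The SELU constants are the published fixed-point values; the function names and the clipping of the exponent are illustrative choices, not part of any reference implementation.

```python
import numpy as np

def elu(x, alpha=1.0):
    # Identity for x > 0; alpha * (exp(x) - 1) for x <= 0.
    # Clipping the exponent's argument at 0 avoids overflow warnings for
    # large positive x (that branch is discarded by np.where anyway).
    return np.where(x > 0, x, alpha * np.expm1(np.minimum(x, 0.0)))

def selu(x):
    # Scaled ELU: lambda and alpha are chosen so that activations are
    # driven toward mean 0 and variance 1 (the self-normalizing
    # property), given suitable weight initialization.
    lam, alpha = 1.0507009873554805, 1.6732632423543772
    return lam * elu(x, alpha)

print(elu(np.array([-2.0, 0.0, 2.0])))   # negative inputs saturate toward -alpha
print(selu(np.array([-2.0, 0.0, 2.0])))
```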
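The FID mentioned above compares two Gaussians fitted to Inception-network features of real and of generated images. A minimal sketch, assuming the feature means and covariances have already been estimated (the function name is illustrative):

```python
import numpy as np
from scipy import linalg

def frechet_distance(mu1, sigma1, mu2, sigma2):
    # Squared Frechet distance between N(mu1, sigma1) and N(mu2, sigma2):
    # ||mu1 - mu2||^2 + Tr(sigma1 + sigma2 - 2 (sigma1 sigma2)^(1/2))
    diff = mu1 - mu2
    covmean = linalg.sqrtm(sigma1 @ sigma2)
    # sqrtm may return a complex result due to numerical round-off;
    # the imaginary part is then negligible and can be dropped.
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return float(diff @ diff + np.trace(sigma1 + sigma2 - 2.0 * covmean))
```

In practice, each (mu, sigma) pair is the mean and covariance of activations from a fixed layer of a pretrained Inception network, computed once for the real data and once for the generated samples.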

2013

The pharma industry sees many chemical compounds (drug candidates) fail in late phases of the drug development pipeline. These failures are caused by insufficient efficacy on the biomolecular target (on-target effect), undesired interactions with other biomolecules (off-target or side effects), or unpredicted toxic effects. The Deep Learning and biclustering methods developed by Sepp Hochreiter identified novel on- and off-target effects in various drug design projects. In 2013 Sepp Hochreiter's group won the DREAM subchallenge of predicting the average toxicity of compounds. In 2014 this success with Deep Learning was continued by winning the "Tox21 Data Challenge" of NIH, FDA, and NCATS. The goal of the Tox21 Data Challenge was to correctly predict the off-target and toxic effects of environmental chemicals in nutrients, household products, and drugs. These impressive successes show that Deep Learning may be superior to other virtual screening methods. Furthermore, Hochreiter's group worked on identifying synergistic effects of drug combinations.

1991

Sepp Hochreiter has made numerous contributions in the fields of machine learning, deep learning, and bioinformatics. He developed the long short-term memory (LSTM), for which the first results were reported in his diploma thesis in 1991. The main LSTM paper appeared in 1997 and is considered a milestone in the timeline of machine learning. The foundations of deep learning were laid by his analysis of the vanishing or exploding gradient (see the sketch after this paragraph). He contributed to meta-learning and proposed flat minima as preferable solutions of learning artificial neural networks, since they ensure a low generalization error. He developed new activation functions for neural networks, like exponential linear units (ELUs) and scaled ELUs (SELUs), to improve learning. He contributed to reinforcement learning via actor-critic approaches and his RUDDER method. He applied biclustering methods to drug discovery and toxicology. He extended support vector machines to handle kernels that are not positive definite with the "Potential Support Vector Machine" (PSVM) model, and applied this model to feature selection, especially to gene selection for microarray data. Also in biotechnology, he developed "Factor Analysis for Robust Microarray Summarization" (FARMS).
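A toy NumPy illustration of the effect that analysis describes: a gradient backpropagated through many layers is repeatedly multiplied by weight matrices, so its norm shrinks or blows up exponentially depending on the weight scale. The width, depth, and scales below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
width, depth = 64, 50

for scale in (0.5, 1.5):
    g = np.ones(width)  # gradient arriving at the last layer
    for _ in range(depth):
        # One backward step through a linear layer: multiply by W^T.
        W = scale * rng.standard_normal((width, width)) / np.sqrt(width)
        g = W.T @ g
    # scale < 1 drives the norm toward 0 (vanishing gradient);
    # scale > 1 makes it explode.
    print(f"scale {scale}: gradient norm after {depth} layers = {np.linalg.norm(g):.3e}")
```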

LSTM overcomes the problem of recurrent neural networks (RNNs) and deep networks forgetting information over time or, equivalently, through layers (the vanishing or exploding gradient). LSTM learns from training sequences to process new sequences in order to produce an output (sequence classification) or generate an output sequence (sequence-to-sequence mapping). Neural networks with LSTM cells solved numerous tasks in biological sequence analysis, drug design, automatic music composition, machine translation, speech recognition, reinforcement learning, and robotics. LSTM with an optimized architecture was successfully applied to very fast protein homology detection without requiring a sequence alignment. LSTM has also been used to learn a learning algorithm: LSTM serves as a Turing machine, i.e., as a computer on which a learning algorithm is executed. Since the LSTM Turing machine is a neural network, it can develop novel learning algorithms by learning on learning problems. It turns out that the learned new learning techniques are superior to those designed by humans. LSTM networks are used in Google Voice transcription, Google voice search, and Google's Allo as core technology for voice searches and commands in the Google app (on Android and iOS), and for dictation on Android devices. Apple has also used LSTM in its "QuickType" function since iOS 10.
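For concreteness, a minimal NumPy sketch of a single LSTM step. This is the widely used gated variant with a forget gate (the forget gate was added after the original 1997 formulation); all names, shapes, and the toy usage at the bottom are illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, b):
    # One LSTM step. x: input, h: previous hidden state, c: previous
    # cell state. W has shape (4*d, len(x) + d) and b has shape (4*d,).
    d = h.shape[0]
    z = W @ np.concatenate([x, h]) + b
    i = sigmoid(z[:d])          # input gate
    f = sigmoid(z[d:2*d])       # forget gate
    o = sigmoid(z[2*d:3*d])     # output gate
    g = np.tanh(z[3*d:])        # candidate cell update
    c_new = f * c + i * g       # cell state carries information across time
    h_new = o * np.tanh(c_new)  # hidden state / output of this step
    return h_new, c_new

# Toy usage: run a random sequence through the cell.
d, n = 3, 5
rng = np.random.default_rng(0)
W = 0.1 * rng.standard_normal((4 * d, n + d))
b = np.zeros(4 * d)
h, c = np.zeros(d), np.zeros(d)
for x in rng.standard_normal((10, n)):
    h, c = lstm_step(x, h, c, W, b)
print(h)
```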

1967

Sepp Hochreiter (born Josef Hochreiter in 1967) is a German computer scientist. Since 2018 he has led the Institute for Machine Learning at the Johannes Kepler University Linz, after having led the Institute of Bioinformatics there from 2006 to 2018. In 2017 he became head of the Linz Institute of Technology (LIT) AI Lab, which focuses on advancing research on artificial intelligence. Previously, he was at the Technical University of Berlin, at the University of Colorado at Boulder, and at the Technical University of Munich.