
A response and thoughts

In June of 2020, I wrote a piece entitled “You Don’t Understand Neural Networks Until You Understand the Universal Approximation Theorem”. Its thesis was that the Universal Approximation Theorem (UAT) explains the apparently intelligent behavior neural networks exhibit on certain problems (e.g. image recognition, image…


Creative techniques to make complex models smaller

With the advent of convolutional neural networks and transformers to handle complex image recognition and natural language processing tasks, deep learning models have skyrocketed in size.

Although the increase in size is usually associated with an increase in predictive power, this supersizing comes with undesirable costs.

  • Longer training time. Models…


Rethinking the GAN without competition

Generative Adversarial Networks (GANs) are an important development in deep learning; their formulation inspired a new generation of model ensembles that interact with one another to produce incredible results.

Competition is in the blood of GAN design — “adversarial” is in its name. Two models, the discriminator and…


Human domain knowledge ceases to be useful

Machine learning has often been described as the study of “algorithms that create algorithms”. To a certain extent, this is true — machine learning finds the best model to fit the data. It is the process by which we attain the algorithm that can give predictions on the data.
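The “algorithms that create algorithms” idea can be made concrete with a minimal sketch: a fitting procedure takes data in and hands back a predictive rule. The data and the use of a simple least-squares line fit here are illustrative assumptions, not anything from a specific article.

```python
import numpy as np

# An "algorithm that creates an algorithm": least squares takes data
# and returns a predictive rule. The data below is made up and
# noiseless, so the fitted line recovers y = 2x + 1 exactly.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 2.0 * x + 1.0

# np.polyfit returns coefficients highest-degree first.
slope, intercept = np.polyfit(x, y, deg=1)

def predict(new_x):
    """The learned model: a function we did not write by hand."""
    return slope * new_x + intercept

print(predict(4.0))  # ~9.0, up to floating-point error
```

The point is only that the human supplies the fitting procedure and the data; the predictive function itself is produced by the algorithm.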

Yet…


Deep learning can’t beat human solutions — yet

The Travelling Salesman Problem was formulated in 1930 and is a classic optimization problem in computer science. It’s simple to state:

Given a list of cities and the distances between each pair of cities, what is the shortest possible route that visits each city exactly once and returns to the…
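The problem statement above can be sketched as an exhaustive search over tour orderings; this brute-force approach is only a toy (it scales factorially), and the 4-city distance matrix below is a made-up example.

```python
from itertools import permutations

# Hypothetical symmetric distance matrix for 4 cities (arbitrary units).
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 8],
    [10, 4, 8, 0],
]

def shortest_tour(dist):
    """Brute-force TSP: try every ordering of the remaining cities,
    starting and ending at city 0, and keep the shortest tour."""
    n = len(dist)
    best_len, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

print(shortest_tour(dist))  # → (23, (0, 1, 3, 2, 0))
```

With n cities this loop examines (n−1)! tours, which is exactly why exact solutions are intractable at scale and heuristics (human-designed or learned) matter.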


Revolutionizing traditional Transfer Learning, k-Means, and the CNN

Quantum computing is a buzz-word that’s been thrown around quite a bit. Unfortunately, despite its virality in pop culture and quasi-scientific Internet communities, its capabilities are still quite limited.

As a very new field, quantum computing presents a complete paradigm shift to the traditional model of classical computing. Classical bits…

Andre Ye
