Up to Speed on AI & Deep Learning: October Update, Part 3

by Requests for Startups, October 26th, 2017

Sharing some of the latest research, announcements, and resources on deep learning.

By Isaac Madan (email)

Continuing our series of deep learning updates, we pulled together some of the awesome resources that have emerged since our last post. In case you missed it, you can find all past updates here. As always, this list is not comprehensive, so let us know if there’s something we should add, or if you’re interested in discussing this area further. If you’re a machine learning practitioner or student, join our Talent Network here to get exposed to awesome ML opportunities.

Research & Announcements

Colaboratory by Google Research. Colaboratory is a data analysis tool that combines text, code, and code outputs into a single collaborative document. Google releases its own cloud notebook platform — try it.

OpenFermion: The Electronic Structure Package For Quantum Computers by Google. OpenFermion is an open source effort for compiling and analyzing quantum algorithms to simulate fermionic systems, including quantum chemistry. Among other functionalities, the current version features data structures and tools for obtaining and manipulating representations of fermionic and qubit Hamiltonians. Original paper here.
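
For a quick taste of the library, here is a minimal sketch (assuming the top-level imports of a recent OpenFermion release) that builds a small fermionic hopping term and maps it to qubit operators with the Jordan-Wigner transform:

```python
# Illustrative sketch: build a small fermionic operator and map it to qubits.
# Import paths assume a recent OpenFermion release.
from openfermion import FermionOperator, jordan_wigner

# a_2^dagger a_0 plus its Hermitian conjugate: a simple hopping term
hopping = FermionOperator('2^ 0', 1.0) + FermionOperator('0^ 2', 1.0)

# Map the fermionic operator to a sum of Pauli strings acting on qubits
qubit_operator = jordan_wigner(hopping)
print(qubit_operator)
```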

Deep learning and the Schrödinger equation by Mills et al of the University of Ontario Institute of Technology. We have trained a deep (convolutional) neural network to predict the ground-state energy of an electron in four classes of confining two-dimensional electrostatic potentials.
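
The basic setup is to treat the potential as an image and regress a scalar energy from it. The sketch below shows that general idea in Keras; the grid size, layer widths, and toy data are illustrative assumptions, not the architecture or dataset from the paper.

```python
# Illustrative only: a small CNN mapping a 2D potential grid to a scalar energy,
# in the spirit of the paper. Layer sizes and the 64x64 grid are assumptions.
import numpy as np
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Conv2D(16, 3, activation='relu', input_shape=(64, 64, 1)),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation='relu'),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation='relu'),
    layers.Dense(1),                      # predicted ground-state energy
])
model.compile(optimizer='adam', loss='mse')

# Toy data standing in for (potential grid, ground-state energy) pairs
V = np.random.rand(8, 64, 64, 1).astype('float32')
E = np.random.rand(8, 1).astype('float32')
model.fit(V, E, epochs=1, verbose=0)
```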

NVIDIA Deep Learning Accelerator. NVIDIA open sources its deep learning chip architecture to broaden its adoption as an IoT standard. The NVIDIA Deep Learning Accelerator (NVDLA) is a free and open architecture that promotes a standard way to design deep learning inference accelerators.

Micromouse contest first place video. Micromouse is an event where small robot mice solve a 16x16 maze (Wikipedia). Watch the first-place winner, Ning6A1 by Beng Kiat Ng, solve the maze (YouTube video). Read more about the contest in this blog post.

Neural Networks API by Google. Google announces the Neural Networks API for Android, which runs machine learning models on-device, bringing more AI to the edge. The Android Neural Networks API (NNAPI) is an Android C API designed for running computationally intensive machine learning operations on mobile devices.

Generative Adversarial Networks: An Overview by Creswell et al of Imperial College London. The aim of this review paper is to provide an overview of GANs for the signal processing community, drawing on familiar analogies and concepts where possible. In addition to identifying different methods for training and constructing GANs, we also point to remaining challenges in their theory and application.
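
If you want to see the generator/discriminator alternation the review describes in code, here is a bare-bones Keras sketch on a toy 1D Gaussian. The network sizes and hyperparameters are arbitrary assumptions chosen for brevity, not anything from the paper.

```python
# Bare-bones GAN sketch on a toy 1D Gaussian; sizes and hyperparameters are arbitrary.
import numpy as np
from tensorflow.keras import layers, models

noise_dim = 8

generator = models.Sequential([
    layers.Dense(16, activation='relu', input_shape=(noise_dim,)),
    layers.Dense(1),                        # fake sample
])

discriminator = models.Sequential([
    layers.Dense(16, activation='relu', input_shape=(1,)),
    layers.Dense(1, activation='sigmoid'),  # estimated probability the sample is real
])
discriminator.compile(optimizer='adam', loss='binary_crossentropy')

# Combined model trains the generator while the discriminator's weights stay frozen
discriminator.trainable = False
gan = models.Sequential([generator, discriminator])
gan.compile(optimizer='adam', loss='binary_crossentropy')

batch = 64
for step in range(200):
    # 1) Train the discriminator on real samples (label 1) and generated samples (label 0)
    real = np.random.normal(loc=4.0, scale=0.5, size=(batch, 1))
    fake = generator.predict(np.random.normal(size=(batch, noise_dim)), verbose=0)
    discriminator.train_on_batch(real, np.ones((batch, 1)))
    discriminator.train_on_batch(fake, np.zeros((batch, 1)))

    # 2) Train the generator to fool the discriminator (labels flipped to 1)
    noise = np.random.normal(size=(batch, noise_dim))
    gan.train_on_batch(noise, np.ones((batch, 1)))
```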

Announcing PlaidML: Open Source Deep Learning for Every Platform by Vertex AI. Open source portable deep learning engine. Our mission is to make deep learning accessible to every person on every device.
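
As far as we can tell from the project's README, PlaidML drops in as a Keras backend; usage looks roughly like the sketch below, though the exact calls may vary by version.

```python
# Sketch based on the PlaidML README (exact calls may differ by version):
# install the PlaidML backend before importing Keras, then build models as usual.
import plaidml.keras
plaidml.keras.install_backend()

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(32, activation='relu', input_shape=(10,)))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='adam', loss='binary_crossentropy')
```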

Resources, Tutorials & Data

Arxiv Vanity by Andreas Jansson and Ben Firshman. A handy tool that renders arXiv papers as easy-to-read web pages, so you don’t have to read the PDF versions typical of most ML papers.

Video lectures accompanying Deep Learning book by Alena Kruchkova. Great series of lecture videos that follow the Deep Learning book by Goodfellow et al. Original book here.

Raspberry Pi: Deep learning object detection with OpenCV by Adrian Rosebrock. Tutorial demonstrating near real-time object detection on a Raspberry Pi.
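
The tutorial builds on OpenCV’s dnn module; the core detection step looks roughly like the sketch below, using a pre-trained MobileNet-SSD Caffe model. The file names are placeholders, and the scaling constants are the ones commonly used with that model.

```python
# Rough sketch of the core detection step with OpenCV's dnn module.
# The Caffe prototxt/model file names are placeholders.
import cv2
import numpy as np

net = cv2.dnn.readNetFromCaffe('MobileNetSSD_deploy.prototxt', 'MobileNetSSD_deploy.caffemodel')

image = cv2.imread('frame.jpg')
(h, w) = image.shape[:2]

# Resize to the network's 300x300 input, scale pixel values, and subtract the mean
blob = cv2.dnn.blobFromImage(cv2.resize(image, (300, 300)), 0.007843, (300, 300), 127.5)
net.setInput(blob)
detections = net.forward()

for i in range(detections.shape[2]):
    confidence = detections[0, 0, i, 2]
    if confidence > 0.5:
        class_id = int(detections[0, 0, i, 1])
        box = detections[0, 0, i, 3:7] * np.array([w, h, w, h])
        print(class_id, confidence, box.astype(int))
```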

Dimensionality Reduction: Principal Components Analysis, Part 1 by Data4Bio. Thorough and understandable explanation of Principal Component Analysis (YouTube video).
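
As a quick companion to the video, PCA fits in a few lines of NumPy: center the data, take an SVD, and project onto the leading components.

```python
# PCA in a few lines: center the data, take the SVD, project onto top components.
import numpy as np

X = np.random.rand(100, 5)            # 100 samples, 5 features (toy data)
X_centered = X - X.mean(axis=0)

# Rows of Vt are the principal directions; singular values give the variance explained
U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
explained_variance = (S ** 2) / (len(X) - 1)

k = 2
X_reduced = X_centered @ Vt[:k].T     # project onto the first k principal components
print(X_reduced.shape)                # (100, 2)
```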

Explaining Your Machine Learning Model (or 5 Ways to Assess Feature Importance) by ClearBrain. Knowing which features, inputs, or variables are influencing a model’s effectiveness is valuable for making it more actionable. Assessing feature importance, though, is not straightforward. The post outlines five ways of doing so, with a focus on logistic regression models for simplicity.
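
One of the simpler approaches in that family: standardize the features, fit a logistic regression, and compare coefficient magnitudes. A small sketch with synthetic data (not from the post):

```python
# Standardize features, fit a logistic regression, compare coefficient magnitudes.
# The data here is synthetic, purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.RandomState(0)
X = rng.rand(200, 4)
y = (X[:, 0] + 0.2 * X[:, 1] + 0.05 * rng.rand(200) > 0.6).astype(int)

X_std = StandardScaler().fit_transform(X)   # put features on a common scale first
model = LogisticRegression().fit(X_std, y)

for name, coef in zip(['f0', 'f1', 'f2', 'f3'], model.coef_[0]):
    print(f'{name}: {coef:+.3f}')           # larger |coef| -> more influence on the log-odds
```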

Word embeddings in 2017: Trends and future directions by Sebastian Ruder. Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers (Wikipedia). This post will focus on the deficiencies of word embeddings and how recent approaches have tried to resolve them.
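
To make the “words as vectors” idea concrete, here is a toy illustration with made-up 3-dimensional vectors and cosine similarity; real embeddings such as word2vec or GloVe are learned from large corpora.

```python
# Toy illustration of the "words as vectors" idea with made-up 3-d embeddings.
# Real embeddings (word2vec, GloVe, fastText) are learned from large corpora.
import numpy as np

embeddings = {
    'king':  np.array([0.8, 0.6, 0.1]),
    'queen': np.array([0.8, 0.6, 0.9]),
    'man':   np.array([0.2, 0.1, 0.1]),
    'woman': np.array([0.2, 0.1, 0.9]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Vector arithmetic: king - man + woman lands near queen in this toy space
target = embeddings['king'] - embeddings['man'] + embeddings['woman']
best = max(embeddings, key=lambda w: cosine(embeddings[w], target))
print(best, cosine(embeddings[best], target))
```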

How to Win a Data Science Competition: Learn from Top Kagglers by Coursera. In this course, you will learn how to analyse and solve such predictive modelling tasks competitively. The course started October 23.

By Isaac Madan. Isaac is an investor at Venrock (email). If you’re interested in deep learning or there are resources I should share in a future newsletter, I’d love to hear from you. If you’re a machine learning practitioner or student, join our Talent Network here to get exposed to awesome ML opportunities.

Requests for Startups is a newsletter of entrepreneurial ideas & perspectives by investors, operators, and influencers.

Please tap or click “❤” to help promote this piece to others.