Here we will focus on how to use custom data types inside Numba-optimized functions, as well as on parallelization.
[Read More]
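As a small taste of the parallelization topic, here is a minimal sketch using Numba's `@njit(parallel=True)` and `prange`; the function is illustrative, not code from the post:

```python
import numpy as np
from numba import njit, prange

# Illustrative sketch (not code from the post): with parallel=True,
# Numba may run the prange iterations on multiple threads, and the
# accumulation into `total` is handled as a parallel reduction.
@njit(parallel=True)
def parallel_sum(arr):
    total = 0.0
    for i in prange(arr.shape[0]):
        total += arr[i]
    return total

print(parallel_sum(np.arange(1_000_000, dtype=np.float64)))
```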
Numba series part 1: The @jit decorator and some more Numba basics
In this part we'll take a closer look at the Numba library's @jit decorator and discuss some pitfalls, as well as a few more basics.
[Read More]
Introduction to the Numba library
The Numba library allows you to achieve near C/C++/Fortran performance with your Python code without many code changes. This post will introduce the concept behind Numba and compare the actual performance gain against plain Python.
[Read More]
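To illustrate the "without many code changes" claim, here is a minimal sketch; the function is a made-up example, not taken from the post:

```python
import numpy as np
from numba import jit

# Illustrative sketch (not code from the post): the only change to
# plain Python is the @jit decorator, which compiles the function to
# machine code on its first call.
@jit(nopython=True)
def mean_abs_diff(x, y):
    total = 0.0
    for i in range(x.shape[0]):
        total += abs(x[i] - y[i])
    return total / x.shape[0]

x = np.random.rand(1_000_000)
y = np.random.rand(1_000_000)
print(mean_abs_diff(x, y))
```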
Speeding up TensorFlow's Input Pipeline
Doubling the training speed by adding two arguments to the new input pipeline - or why you should always carefully read the docs.
[Read More]
Example of TensorFlow's new Input Pipeline
With version 1.2rc0, TensorFlow got a new input pipeline. In this blog post I will explain its usage and give an example of a complete input pipeline.
[Read More]
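As a rough sketch of what such a pipeline looks like (API names as of the tf.contrib.data module in TF 1.2; the file list and parse function are placeholders, not the post's code):

```python
import tensorflow as tf

# Placeholder file list and labels, purely for illustration.
filenames = ["train_0.png", "train_1.png"]
labels = [0, 1]

def parse(filename, label):
    # Load and decode one image; decode_png is one possible parser.
    image = tf.image.decode_png(tf.read_file(filename), channels=3)
    return tf.cast(image, tf.float32), label

dataset = tf.contrib.data.Dataset.from_tensor_slices((filenames, labels))
dataset = dataset.map(parse).shuffle(buffer_size=100).batch(2).repeat()

# One-shot iterator yields (images, labels) tensors for the training loop.
images, batch_labels = dataset.make_one_shot_iterator().get_next()
```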
Finetuning AlexNet with TensorFlow
This blog post will guide you through finetuning AlexNet with pure TensorFlow.
[Read More]
Understanding the backward pass through Batch Normalization Layer
An explanation of the gradient flow through the BatchNorm layer, following the computational-graph (circuit) representation taught in Stanford's class CS231n.
[Read More]
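For readers skimming the index, the staged backward pass that the post derives can be sketched in NumPy roughly as follows (variable names are generic conventions, not necessarily the post's):

```python
import numpy as np

def batchnorm_backward(dout, cache):
    # cache holds values from the forward pass:
    # x (inputs), x_hat = (x - mu) / sqrt(var + eps), mu, var, gamma, eps
    x, x_hat, mu, var, gamma, eps = cache
    N = x.shape[0]

    # Gradients of the learnable scale and shift parameters.
    dgamma = np.sum(dout * x_hat, axis=0)
    dbeta = np.sum(dout, axis=0)

    # Backpropagate through the normalization step by step.
    dx_hat = dout * gamma
    dvar = np.sum(dx_hat * (x - mu) * -0.5 * (var + eps) ** -1.5, axis=0)
    dmu = (np.sum(-dx_hat / np.sqrt(var + eps), axis=0)
           + dvar * np.mean(-2.0 * (x - mu), axis=0))
    dx = (dx_hat / np.sqrt(var + eps)
          + dvar * 2.0 * (x - mu) / N
          + dmu / N)
    return dx, dgamma, dbeta
```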