
Getting started with TFLearn【1】

Posted on 2019-1-12 16:34:22
This post was last edited by Happy清子 on 2019-1-12 16:35.

Here is a basic guide that introduces TFLearn and its functionalities. It first highlights TFLearn's high-level API for fast neural network building and training, and then shows how TFLearn layers, built-in ops, and helpers can directly benefit any model implementation with Tensorflow.

High-Level API usage

TFLearn introduces a High-Level API that makes neural network building and training fast and easy. This API is intuitive and fully compatible with Tensorflow.
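For instance, a small fully connected network can be defined, compiled, and trained in a few lines. The sketch below is illustrative only: the 784-feature input shape and the dummy trainX/trainY arrays are assumptions made for this example, not values from this guide.

import numpy as np
import tflearn

# Dummy data standing in for a real dataset (shapes are assumptions for illustration).
trainX = np.random.rand(256, 784).astype('float32')
trainY = np.eye(10)[np.random.randint(0, 10, 256)].astype('float32')

# Build the graph with high-level layers.
net = tflearn.input_data(shape=[None, 784])
net = tflearn.fully_connected(net, 64, activation='relu')
net = tflearn.fully_connected(net, 10, activation='softmax')
net = tflearn.regression(net, optimizer='adam', loss='categorical_crossentropy')

# DNN wraps the graph with a trainer and exposes fit/predict.
model = tflearn.DNN(net)
model.fit(trainX, trainY, n_epoch=2, batch_size=64, show_metric=True)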

Layers

Layers are a core feature of TFLearn. While completely defining a model using Tensorflow ops can be time-consuming and repetitive, TFLearn provides "layers" that represent an abstract set of operations, making it more convenient to build neural networks. For example, a convolutional layer will:

  • Create and initialize weights and biases variables
  • Apply convolution over incoming tensor
  • Add an activation function after the convolution
  • Etc...

In Tensorflow, writing these kinds of operations can be quite tedious:

with tf.name_scope('conv1'):
    W = tf.Variable(tf.random_normal([5, 5, 1, 32]), dtype=tf.float32, name='Weights')
    b = tf.Variable(tf.random_normal([32]), dtype=tf.float32, name='biases')
    x = tf.nn.conv2d(x, W, strides=[1, 1, 1, 1], padding='SAME')
    x = tf.nn.bias_add(x, b)
    x = tf.nn.relu(x)

While in TFLearn, it only takes a line:

tflearn.conv_2d(x, 32, 5, activation='relu', name='conv1')
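Each layer takes an incoming tensor and returns its output tensor, so layers can simply be chained. Below is a minimal sketch of a small convolutional stack; the 28x28 grayscale input shape is an assumption made for illustration.

import tflearn

# Assumed input shape: batches of 28x28 grayscale images.
x = tflearn.input_data(shape=[None, 28, 28, 1])
x = tflearn.conv_2d(x, 32, 5, activation='relu', name='conv1')
x = tflearn.max_pool_2d(x, 2)
x = tflearn.conv_2d(x, 64, 5, activation='relu', name='conv2')
x = tflearn.max_pool_2d(x, 2)
x = tflearn.fully_connected(x, 10, activation='softmax')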

Here is a list of all currently available layers:

tflearn.layers.core: input_data, fully_connected, dropout, custom_layer, reshape, flatten, activation, single_unit, highway, one_hot_encoding, time_distributed
tflearn.layers.conv: conv_2d, conv_2d_transpose, max_pool_2d, avg_pool_2d, upsample_2d, conv_1d, max_pool_1d, avg_pool_1d, residual_block, residual_bottleneck, conv_3d, max_pool_3d, avg_pool_3d, highway_conv_1d, highway_conv_2d, global_avg_pool, global_max_pool
tflearn.layers.recurrent: simple_rnn, lstm, gru, bidirectional_rnn, dynamic_rnn
tflearn.layers.embedding_ops: embedding
tflearn.layers.normalization: batch_normalization, local_response_normalization, l2_normalize
tflearn.layers.merge_ops: merge, merge_outputs
tflearn.layers.estimator: regression
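To illustrate the recurrent and embedding layers listed above, here is a hedged sketch of a small sequence classifier; the vocabulary size (10000), sequence length (100), and two output classes are assumptions for this example, not values from this guide.

import tflearn

# Assumed input: integer-encoded, padded sequences of length 100 over a 10000-word vocabulary.
net = tflearn.input_data(shape=[None, 100])
net = tflearn.embedding(net, input_dim=10000, output_dim=128)
net = tflearn.lstm(net, 128, dropout=0.8)
net = tflearn.fully_connected(net, 2, activation='softmax')
net = tflearn.regression(net, optimizer='adam', loss='categorical_crossentropy')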

Built-in Operations

Besides the layers concept, TFLearn also provides many different ops to be used when building a neural network. These ops are primarily meant to be passed as arguments to the 'layers' above, but they can also be used independently in any other Tensorflow graph for convenience. In practice, providing just the op name as an argument is enough (such as activation='relu' or regularizer='L2' for conv_2d), but a function can also be provided for further customization.

Here is a list of all currently available ops:

tflearn.activations: linear, tanh, sigmoid, softmax, softplus, softsign, relu, relu6, leaky_relu, prelu, elu
tflearn.objectives: softmax_categorical_crossentropy, categorical_crossentropy, binary_crossentropy, mean_square, hinge_loss, roc_auc_score, weak_cross_entropy_2d
tflearn.optimizers: SGD, RMSProp, Adam, Momentum, AdaGrad, Ftrl, AdaDelta
tflearn.metrics: Accuracy, Top_k, R2
tflearn.initializations: zeros, uniform, uniform_scaling, normal, truncated_normal, xavier, variance_scaling
tflearn.losses: l1, l2


Below are some quick examples:

# Activation and Regularization inside a layer:
fc2 = tflearn.fully_connected(fc1, 32, activation='tanh', regularizer='L2')

# Equivalent to:
fc2 = tflearn.fully_connected(fc1, 32)
tflearn.add_weights_regularization(fc2, loss='L2')
fc2 = tflearn.tanh(fc2)

# Optimizer, Objective and Metric:
reg = tflearn.regression(fc4, optimizer='rmsprop', metric='accuracy', loss='categorical_crossentropy')

# Ops can also be defined outside, for deeper customization:
momentum = tflearn.optimizers.Momentum(learning_rate=0.1, lr_decay=0.96, decay_step=200)
top5 = tflearn.metrics.Top_k(k=5)
reg = tflearn.regression(fc4, optimizer=momentum, metric=top5, loss='categorical_crossentropy')
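Since layers accept callables as well as name strings, an op can also be passed directly as a function. A small sketch, assuming a hypothetical 28x28 grayscale input defined only for illustration:

import tensorflow as tf
import tflearn

# Hypothetical input; the shape is an assumption for this example.
x = tflearn.input_data(shape=[None, 28, 28, 1])

# The activation argument accepts a callable (here tf.nn.relu) instead of the name string 'relu'.
net = tflearn.conv_2d(x, 32, 5, activation=tf.nn.relu, name='conv1')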


Reply posted on 2019-1-14 11:26:18
tflearn: after using this tool for a while, it seems to have gone quiet, with not much activity around it anymore.