Releases: tensorlayer/TensorLayer
TensorLayer 3.0.0-alpha Release
Dear all,
It is our great honour to pre-release TensorLayer 3.0.0-alpha.
It supports the TensorFlow and MindSpore backends, as well as some PaddlePaddle operator backends, allowing users to run the same code on different hardware such as Nvidia GPU and Huawei Ascend.
As the next step, we plan to support the TensorFlow, MindSpore, PaddlePaddle, and PyTorch backends. Feel free to use it and make suggestions.
TensorLayer 2.2.4 Release
TensorLayer 2.2.4 is a maintenance release.
TensorLayer 2.2.3 Release
TensorLayer 2.2.3 is a maintenance release.
It contains numerous bug fixes.
Fixed
- Fix VGG. (#1078, #1079, #1089)
- Fix norm layer. (#1080)
- Fix DeConv2d layer. (#1081)
- Fix ModelLayer and LayerList docs. (#1083)
- Fix bug in SAC. (#1085)
- Refactoring: deduplication. (#1086)
- Fix maxpool and batchnorm data format, and VGG forward. (#1089)
- Fix package info. (#1090)
TensorLayer 2.2.2 Release
TensorLayer 2.2.2 is a maintenance release.
Fixed
- Fix README.
- Fix package info.
TensorLayer 2.2.1 Release
TensorLayer 2.2.0
TensorLayer 2.2.0 is a maintenance release.
It contains numerous API improvements and bug fixes.
This release is compatible with TensorFlow 2 RC1.
Added
- Support nested layer customization (PR #1015)
- Support string dtype in InputLayer (PR #1017)
- Support Dynamic RNN in RNN (PR #1023)
- Add ResNet50 static model (PR #1030)
- Add performance test code for static models (PR #1041)
Changed
- Change `SpatialTransform2dAffine` auto `in_channels`
- Support TensorFlow 2.0.0-rc1
- Update model `weights` property, which now returns its copy (PR #1010)
Fixed
- RNN updates: remove warnings, fix the seq_len=0 case, add unit tests (PR #1033)
- BN updates: fix BatchNorm1d for 2D data, refactored (PR #1040)
- Fix `tf.models.Model._construct_graph` for list of outputs, e.g. STN case (PR #1010)
- Enable better `in_channels` exception raise (PR #1015)
- Set `allow_pickle=True` in `np.load()` (PR #1021)
- Remove `private_method` decorator (PR #1025)
- Copy original model's `trainable_weights` and `nontrainable_weights` when initializing `ModelLayer` (PR #1026)
- Copy original model's `trainable_weights` and `nontrainable_weights` when initializing `LayerList` (PR #1029)
- Remove redundant parts in `model.all_layers` (PR #1029)
- Replace `tf.image.resize_image_with_crop_or_pad` with `tf.image.resize_with_crop_or_pad` (PR #1032)
- Fix a bug in `ResNet50` static model (PR #1041)
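The `allow_pickle` fix above traces back to NumPy 1.16.3, which made `np.load` refuse pickled object arrays by default, so loading a saved weights dictionary fails unless the flag is passed explicitly. A minimal sketch (the weight names here are illustrative, not TensorLayer's):

```python
import os
import tempfile

import numpy as np

# An object array, e.g. a dict of model weights saved with np.save.
weights = np.array({"dense/W": np.ones((2, 2)), "dense/b": np.zeros(2)}, dtype=object)
path = os.path.join(tempfile.mkdtemp(), "weights.npy")
np.save(path, weights)

# Since NumPy 1.16.3, np.load rejects pickled object arrays by default.
rejected = False
try:
    np.load(path)
except ValueError:
    rejected = True  # "Object arrays cannot be loaded when allow_pickle=False"

# Passing allow_pickle=True restores the previous behaviour.
restored = np.load(path, allow_pickle=True).item()
```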
TensorLayer 2.1.0
Dear All,
Three things need to be mentioned for this release.
- Deep Reinforcement Learning Model Zoo Release!!!
- We are going to support more Attention models for NLP officially.
- The `model.config` is almost stable; the AIoT team from Sipeed is now working hard to support TL models on AI chips.
Enjoy!
TensorLayer Team
Changed
- Add version_info in model.config. (PR #992)
- Replace `tf.nn.func` with `tf.nn.func.__name__` in model config.
- Add reinforcement learning tutorials. (PR #995)
- Add RNN layers with simple RNN cell, GRU cell, LSTM cell. (PR #998)
- Update Seq2seq. (PR #998)
- Add Seq2seqLuongAttention model. (PR #998)
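The `tf.nn.func.__name__` change above is an instance of a common serialization pattern: a function object cannot be stored in a JSON-like config, so only its name is saved and the callable is looked up again on restore. A hedged sketch of the idea; the `ACTIVATIONS` registry is illustrative, not TensorLayer's actual code:

```python
import json
import math


def relu(x):
    return max(x, 0.0)


# Stand-in namespace of activation functions (in TensorLayer the lookup targets tf.nn).
ACTIVATIONS = {"relu": relu, "tanh": math.tanh}


def act_to_config(act):
    # Store the function's name rather than the function object,
    # keeping the model config JSON-serializable.
    return {"act": act.__name__}


def act_from_config(cfg):
    # Restore the callable by looking its name up in the namespace.
    return ACTIVATIONS[cfg["act"]]


cfg = act_to_config(relu)
print(json.dumps(cfg))      # {"act": "relu"}
restored = act_from_config(cfg)
print(restored(-3.0))       # 0.0
```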
TensorLayer 2.0.2
Hello, we want to tell you some GOOD NEWS.
Today, AI chips are everywhere, from our phones to our cars; however, it is still hard for us to have our own AI chip.
To this end, the TensorLayer team has started working on AIoT and will soon support running TensorLayer models on low-cost AI chips (e.g., K210) and microcontrollers (e.g., STM32). Details in the following:
- NNoM is a higher-level, layer-based neural network library designed for microcontrollers (MCUs). Our team and the author of NNoM are working hard to make TensorLayer models run on different MCUs. Yes! Something like BinaryNet.
- K210 is a low-cost AI chip; we are collaborating with the designers of K210 and the Sipeed team to make TensorLayer models run on the K210 AI chip.
If you are interested in AIoT, feel free to discuss in Slack.
TensorLayer, Sipeed, NNoM teams
Maintenance release; updating is recommended.
Changed
- Change the format of network config, with related code and file changes; change layer act (PR #980)
- Update Seq2seq (#989)
Fixed
- Fix problem where dynamic models could not track PRelu weight gradients (PR #982)
- Raise .weights warning (commit)
TensorLayer 2.0.1
Maintenance release; updating is recommended.
Added
- Layers `InstanceNorm`, `InstanceNorm1d`, `InstanceNorm2d`, `InstanceNorm3d` (PR #963)
Changed
- Remove `tl.layers.initialize_global_variables(sess)` (PR #931)
- Change `tl.layers.core`, `tl.models.core` (PR #966)
- Change `weights` into `all_weights`, `trainable_weights`, `nontrainable_weights`
Dependencies Update
- nltk>=3.3,<3.4 => nltk>=3.3,<3.5 (PR #892)
- pytest>=3.6,<3.11 => pytest>=3.6,<4.1 (PR #889)
- yapf>=0.22,<0.25 => yapf==0.25.0 (PR #896)
- imageio==2.5.0, progressbar2==3.39.3, scikit-learn==0.21.0, scikit-image==0.15.0, scipy==1.2.1, wrapt==1.11.1, pymongo==3.8.0, sphinx==2.0.1, opencv-python==4.1.0.25, requests==2.21.0, tqdm==4.31.1, lxml==4.3.3, pycodestyle==2.5.0, yapf==0.27.0 (PR #967)
Fixed
- Fix docs of models @zsdonghao (#957)
- In `BatchNorm`, keep dimensions of mean and variance to suit channels-first data (PR #963)
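The `BatchNorm` item above works because keeping the reduced dimensions lets the per-channel statistics broadcast directly against channels-first data. A NumPy sketch of the idea (illustrative only, not TensorLayer's implementation):

```python
import numpy as np

# Channels-first batch: (N, C, H, W)
x = np.random.default_rng(0).random((4, 3, 8, 8), dtype=np.float32)

# Per-channel statistics over the batch and spatial axes. With keepdims=True
# the result has shape (1, 3, 1, 1) and broadcasts cleanly against (N, C, H, W);
# without it the shape would be (3,), which would try to align with the last
# axis (W) instead of the channel axis.
mean = x.mean(axis=(0, 2, 3), keepdims=True)
var = x.var(axis=(0, 2, 3), keepdims=True)

# Normalize; each channel ends up with (approximately) zero mean, unit variance.
y = (x - mean) / np.sqrt(var + 1e-5)
print(mean.shape)  # (1, 3, 1, 1)
```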
TensorLayer 2.0.0
Dear all,
It is our great honour to release TensorLayer 2.0.0.
In the past few months, we have refactored all layers to support TensorFlow 2.0.0-alpha0 and the dynamic mode! The new API designs allow you to customize layers easily, compared with other libraries.
We would like to thank all contributors, especially our core members from Peking University and Imperial College London: @zsdonghao @JingqingZ @ChrisWu1997 @warshallrho. All contributions are listed in the following.
In the next step, we are interested in supporting more advanced features for 3D Vision, such as PointCNN and GraphCNN. Also, we still have some remaining examples that need to be updated, such as A3C and distributed training. If you are interested in joining the development team, feel free to contact us: [email protected]
Enjoy coding!
TensorLayer Team
Contribution List
All contributions can be found as follows:
Layers
- core.py:
- Layer:
- refactored @JingqingZ 2019/01/28
- tested @JingqingZ 2019/01/31 2019/03/06
- documentation @JingqingZ 2019/03/06
- ModelLayer:
- created @JingqingZ 2019/01/28
- tested @JingqingZ 2019/03/06
- documentation @JingqingZ 2019/03/06
- LayerList:
- created @JingqingZ 2019/01/28 @ChrisWu1997
- tested @JingqingZ 2019/03/06
- documentation @JingqingZ 2019/03/06
- LayerNode:
- created @ChrisWu1997
- tested @ChrisWu1997 2019/03/22
- documentation @ChrisWu1997 2019/03/22
- activation.py:
- PRelu:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/03/20
- tested @JingqingZ 2019/03/20
- documentation @JingqingZ 2019/03/20
- PRelu6:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/03/20
- tested @JingqingZ 2019/03/20
- documentation @JingqingZ 2019/03/20
- PTRelu6:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/03/20
- tested @JingqingZ 2019/03/20
- documentation @JingqingZ 2019/03/20
- convolution/
- AtrousConv1dLayer, AtrousConv2dLayer and AtrousDeConv2d are removed; use Conv1d/2d and DeConv2d with `dilation_rate` instead. (🀄️remember to change CN docs)
- BinaryConv2d:
- refactored @zsdonghao 2018/12/05
- tested @warshallrho 2019/03/16
- documentation @warshallrho 2019/03/20
- Conv1d:
- refactored @zsdonghao 2019/01/16
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- Conv2d:
- refactored @zsdonghao 2019/01/16
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- Conv3d:
- add @zsdonghao 2019/01/16 : (🀄️remember to change CN docs)
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- Conv1dLayer:
- refactored @zsdonghao 2018/12/05
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- Conv2dLayer:
- refactored @zsdonghao 2018/12/05
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- Conv3dLayer:
- refactored @zsdonghao 2018/12/05
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- DeConv1dLayer:
- refactored @warshallrho 2019/03/16
- tested @warshallrho 2019/03/16
- documentation @warshallrho 2019/03/17
- DeConv2dLayer:
- refactored @zsdonghao 2018/12/06
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- DeConv3dLayer:
- refactored @zsdonghao 2018/12/06
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- DeConv2d:
- refactored @zsdonghao 2019/01/16
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- DeConv3d:
- refactored @zsdonghao 2019/01/16
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/17
- DeformableConv2d:
- refactored @warshallrho 2019/03/18
- tested @warshallrho 2019/03/18
- documentation @warshallrho 2019/03/18
- DepthwiseConv2d:
- refactored @zsdonghao 2018/12/05
- tested @warshallrho 2019/03/15
- documentation @warshallrho 2019/03/18
- DorefaConv2d:
- refactored @zsdonghao 2018/12/06
- tested @warshallrho 2019/03/17
- documentation @warshallrho 2019/03/20
- GroupConv2d:
- refactored @zsdonghao 2018/12/06
- tested @warshallrho 2019/03/17
- documentation @warshallrho 2019/03/20
- QuanConv2d:
- refactored @zsdonghao 2018/12/06
- tested @warshallrho 2019/03/17
- documentation @warshallrho 2019/03/20
- QuanConv2dWithBN:
- refactored
- tested
- documentation
- SeparableConv1d:
- refactored @zsdonghao 2019/01/16
- tested @warshallrho 2019/03/17
- documentation @warshallrho 2019/03/18
- SeparableConv2d:
- refactored @zsdonghao 2019/01/16
- tested @warshallrho 2019/03/17
- documentation @warshallrho 2019/03/18
- SubpixelConv1d:
- refactored @zsdonghao 2018/12/05 @warshallrho 2019/03/18
- tested @warshallrho 2019/03/18
- documentation @warshallrho 2019/03/18
- SubpixelConv2d:
- refactored @zsdonghao 2018/12/05 @warshallrho 2019/03/18
- tested @warshallrho 2019/03/18
- documentation @warshallrho 2019/03/18
- TernaryConv2d:
- refactored @zsdonghao 2018/12/06
- tested @warshallrho 2019/03/17
- documentation @warshallrho 2019/03/20
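The atrous-layer removal noted above works because an atrous (dilated) convolution is just an ordinary convolution whose kernel taps are spaced `dilation_rate` samples apart. A NumPy sketch of the 1-D "valid" case, illustrating the semantics only (not the TensorFlow kernel):

```python
import numpy as np


def dilated_conv1d(x, w, dilation_rate=1):
    """'Valid' 1-D cross-correlation with holes:
    output[n] = sum_k x[n + k * dilation_rate] * w[k]."""
    d = dilation_rate
    span = (len(w) - 1) * d + 1          # effective receptive field size
    out_len = len(x) - span + 1
    return np.array([
        sum(x[n + k * d] * w[k] for k in range(len(w)))
        for n in range(out_len)
    ])


x = np.arange(8, dtype=float)            # [0, 1, ..., 7]
w = np.array([1.0, 1.0])
print(dilated_conv1d(x, w, dilation_rate=1))  # sums of adjacent samples
print(dilated_conv1d(x, w, dilation_rate=2))  # sums of samples 2 apart
```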
- dense/ [WIP] @ChrisWu1997
- BinaryDense:
- refactored @zsdonghao 2018/12/06
- tested @ChrisWu1997 2019/04/23 need further test by example
- documentation @ChrisWu1997 2019/04/23
- Dense:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/01/28
- tested @JingqingZ 2019/01/31 2019/03/06 2019/03/15
- documentation @JingqingZ 2019/03/15
- DorefaDense:
- refactored @zsdonghao 2018/12/04
- tested @ChrisWu1997 2019/04/23 need further test by example
- documentation @ChrisWu1997 2019/04/23
- DropconnectDense:
- refactored @zsdonghao 2018/12/05
- tested @ChrisWu1997 2019/04/23 need further test by example
- documentation @ChrisWu1997 2019/04/23
- QuanDense:
- refactored @zsdonghao 2018/12/06
- tested @ChrisWu1997 2019/04/23 need further test by example
- documentation @ChrisWu1997 2019/04/23
- QuanDenseWithBN:
- refactored
- tested
- documentation
- TernaryDense:
- refactored @zsdonghao 2018/12/06
- tested @ChrisWu1997 2019/04/23 need further test by example
- documentation @ChrisWu1997 2019/04/23
- dropout.py
- Dropout:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/01/28
- tested @JingqingZ 2019/01/31 2019/03/06 2019/03/15
- documentation @JingqingZ 2019/03/15
- extend.py
- ExpandDims:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/03/22
- tested @JingqingZ 2019/03/22
- documentation @JingqingZ 2019/03/22
- Tile:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/03/22
- tested @JingqingZ 2019/03/22
- documentation @JingqingZ 2019/03/22
- image_resampling.py
- UpSampling2d:
- refactored @zsdonghao 2018/12/04 @ChrisWu1997 2019/04/03
- tested @ChrisWu1997 2019/04/03
- documentation @ChrisWu1997 2019/04/03
- DownSampling2d:
- refactored @zsdonghao 2018/12/04 @ChrisWu1997 2019/04/03
- tested @ChrisWu1997 2019/04/03
- documentation @ChrisWu1997 2019/04/03
- importer.py
- SlimNets:
- refactored
- tested
- documentation
- Keras:
- refactored
- tested
- documentation
- inputs.py
- Input:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/01/28
- tested @JingqingZ 2019/03/06
- documentation @JingqingZ 2019/03/06
- embedding.py
- OneHotInput: --> OneHot (🀄️remember to change CN docs)
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/02/23
- tested @JingqingZ 2019/03/19
- documentation @JingqingZ 2019/03/19
- Word2vecEmbeddingInput: --> Word2vecEmbedding (🀄️remember to change CN docs)
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/02/21
- tested @JingqingZ 2019/03/19
- documentation @JingqingZ 2019/03/19
- EmbeddingInput: --> Embedding
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/02/22
- tested @JingqingZ 2019/03/19
- documentation @JingqingZ 2019/03/19
- AverageEmbeddingInput: --> AverageEmbedding (🀄️remember to change CN docs)
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/02/20
- tested @JingqingZ 2019/03/19
- documentation @JingqingZ 2019/03/19
- lambda_layers.py
- ElementwiseLambda:
- refactored @JingqingZ 2019/03/24
- tested @JingqingZ 2019/03/24
- documentation @JingqingZ 2019/03/24
- Lambda:
- refactored @JingqingZ 2019/03/24
- tested @JingqingZ 2019/03/24
- documentation @JingqingZ 2019/03/24
- merge.py
- Concat:
- refactored @zsdonghao 2018/12/04
- tested @JingqingZ 2019/03/15
- documentation @JingqingZ 2019/03/15
- Elementwise:
- refactored @zsdonghao 2018/12/04 @JingqingZ 2019/03/15
- tested...