diff --git a/README.md b/README.md
index d7c8c20b5..6637c2b9a 100644
--- a/README.md
+++ b/README.md
@@ -69,24 +69,101 @@ See LiBai's [documentation](https://libai.readthedocs.io/en/latest/index.html) f
## ChangeLog
-**Beta 0.2.0** was released in 07/07/2022, the general changes in **0.2.0** version are as follows:
+**Beta 0.3.0** was released on 03/11/2024. The general changes in version **0.3.0** are as follows:
**Features:**
-- Support evaluation enabled and set `eval_iter`
-- Support customized sampler in `config.py`
-- Support rdma for pipeline-model-parallel
-- Support multi fused kernel
- - fused_scale_mask_softmax_dropout
- - fused_scale_tril_softmax_mask_scale
- - fused_self_attention in branch `libai_bench`
+- Support mock transformers, see [Mock transformers](https://github.com/Oneflow-Inc/libai/tree/main/projects/mock_transformers#readme)
+- Support lm-evaluation-harness for model evaluation
- User Experience Optimization
-- Optimization for training throughput, see [benchmark](https://libai.readthedocs.io/en/latest/tutorials/get_started/Benchmark.html) for more details
-**Supported Models:**
-- Support 3D parallel [Roberta](https://arxiv.org/abs/1907.11692) model
-- Support 2D parallel (data parallel + tensor model parallel) [SimCSE](https://arxiv.org/abs/2104.08821) model
-- Support Data parallel [MAE](https://arxiv.org/abs/2111.06377) model
-- Support Data parallel [MOCOV3](https://arxiv.org/abs/2104.02057) model
+**New Supported Models:**
+- These models are natively supported by LiBai.
+
+
+**New Mock Models:**
+- These models are supported by LiBai through mocking the `transformers` library.
+
+| Models   | Tensor Parallel | Pipeline Parallel |
+|----------|:---------------:|:-----------------:|
+| BLOOM    | ✔ | - |
+| GPT2     | ✔ | - |
+| LLAMA    | ✔ | - |
+| LLAMA2   | ✔ | - |
+| Baichuan | ✔ | - |
+| OPT      | ✔ | - |
+
See [changelog](./changelog.md) for details and release history.
diff --git a/changelog.md b/changelog.md
index 7eba8fc0d..df91fb42a 100644
--- a/changelog.md
+++ b/changelog.md
@@ -1,26 +1,94 @@
-## Changelog
-
-### Beta 0.1.0 (22/03/2022)
-
+### v0.3.0 (03/11/2024)
**New Features:**
-- Support Data Parallelism
-- Support 1D Tensor Parallelism
-- Support Pipeline Parallelism
-- Unified distributed Layers for both single-GPU and multi-GPU training
-- `LazyConfig` system for more flexible syntax and no predefined structures
-- Easy-to-use trainer and engine
-- Support both CV and NLP data processing
-- Mixed Precision Training
-- Activation Checkpointing
-- Gradient Accumulation
-- Gradient Clipping
-- Zero Redundancy Optimizer (ZeRO)
+- Support mock transformers, see [Mock transformers](https://github.com/Oneflow-Inc/libai/tree/main/projects/mock_transformers#readme)
+- Support lm-evaluation-harness for model evaluation
+- User Experience Optimization
+
+**New Supported Models:**
+- These models are natively supported by LiBai.
+
-**Supported Models:**
-- Support 3D parallel [BERT](https://arxiv.org/abs/1810.04805) model
-- Support 3D parallel [GPT-2](https://cdn.openai.com/better-language-models/language_models_are_unsupervised_multitask_learners.pdf) model
-- Support 3D parallel [T5](https://arxiv.org/abs/1910.10683) model
-- Support 3D parallel [Vision Transformer](https://arxiv.org/abs/2010.11929)
-- Support Data parallel [Swin Transformer](https://arxiv.org/abs/2103.14030) model
-- Support finetune task in [QQP project](/projects/QQP/)
-- Support text classification task in [text classification project](/projects/text_classification/)
+**New Mock Models:**
+- These models are supported by LiBai through mocking the `transformers` library.
+
+| Models   | Tensor Parallel | Pipeline Parallel |
+|----------|:---------------:|:-----------------:|
+| BLOOM    | ✔ | - |
+| GPT2     | ✔ | - |
+| LLAMA    | ✔ | - |
+| LLAMA2   | ✔ | - |
+| Baichuan | ✔ | - |
+| OPT      | ✔ | - |
+
\ No newline at end of file
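
The "mock transformers" feature above works by redirecting the `torch` import so that Hugging Face `transformers` code runs on OneFlow instead of PyTorch. A minimal, self-contained sketch of the underlying technique follows, assuming only pure-Python import redirection via `sys.modules`; the toy `fake_torch` module and its `tensor` function are illustrative stand-ins, not LiBai or OneFlow APIs:

```python
import sys
import types

# Build a tiny stand-in module. In LiBai's mock-transformers setup this role
# is played by oneflow, whose API mirrors PyTorch; here a toy module just
# illustrates the import-redirection idea.
fake_torch = types.ModuleType("torch")
fake_torch.tensor = lambda data: list(data)  # hypothetical stand-in for torch.tensor

# Register the stand-in under the name "torch" BEFORE anything imports torch,
# so a later `import torch` (e.g. inside transformers) resolves to it.
sys.modules["torch"] = fake_torch

import torch  # actually binds fake_torch

print(torch.tensor([1, 2, 3]))  # behavior of the stand-in, not real PyTorch
```

Any library imported afterwards that does `import torch` transparently receives the replacement module, which is why downstream model code needs no changes.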