This article compares the proposed IIL method against SOTA incremental learning techniques on Cifar-100 and ImageNet-100.



Abstract and 1 Introduction

2. Related works

3. Problem setting

4. Methodology

  4.1. Decision boundary-aware distillation

  4.2. Knowledge consolidation

5. Experimental results and 5.1. Experiment Setup

  5.2. Comparison with SOTA methods

  5.3. Ablation study

6. Conclusion and future work, and References

Supplementary Material

  1. Details of the theoretical analysis on KCEMA mechanism in IIL
  2. Algorithm overview
  3. Dataset details
  4. Implementation details
  5. Visualization of dusted input images
  6. More experimental results

5.2. Comparison with SOTA methods

Tab. 1 shows the test performance of different methods on Cifar-100 and ImageNet-100. The proposed method achieves the largest performance gain after ten consecutive IIL tasks, by a large margin, while maintaining a low forgetting rate. Although ISL [13], which was proposed for the similar setting of learning from new sub-categories, also has a low forgetting rate, it fails to meet the new requirement of model enhancement. Attaining better performance on the test data matters more than avoiding forgetting on any particular subset of the data.
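Since the comparison in Tab. 1 hinges on the forgetting rate, it may help to make that metric concrete. The sketch below implements one common definition of average forgetting (the drop from each phase's best-so-far accuracy to its final accuracy); the paper's exact measurement may differ, and the accuracy matrix here is purely illustrative.

```python
import numpy as np

def forgetting_rate(acc_matrix):
    """Average forgetting after T incremental phases.

    acc_matrix[t][j] is the test accuracy on task/phase j measured after
    finishing learning phase t. For each earlier phase, forgetting is the
    drop from its best accuracy seen so far to its accuracy after the
    final phase; the metric averages these drops.
    """
    acc = np.asarray(acc_matrix, dtype=float)
    T = acc.shape[0]
    drops = [acc[:T - 1, j].max() - acc[T - 1, j] for j in range(T - 1)]
    return float(np.mean(drops))
```

A method that keeps old accuracy intact while still improving on new data would show a forgetting rate near zero under this measure.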

In the new IIL setting, none of the rehearsal-based methods, including iCaRL [22], PODNet [4], DER [31], and OnPro [29], performs well. Old exemplars can cause memory overfitting and model bias [35]; thus, a limited set of old exemplars does not always benefit stability and plasticity [26], especially in the IIL task. The forgetting rate of rehearsal-based methods is also high compared with other methods, which further explains their degraded performance on the test data. Detailed performance at each learning phase is shown in Fig. 4. Whereas the other methods struggle to resist forgetting, our method is the only one that stably improves the existing model on both datasets.
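To illustrate what "limited old exemplars" means in practice, here is a minimal class-balanced exemplar memory in the spirit of rehearsal methods such as iCaRL. All names and the per-class capacity are illustrative, not the cited methods' actual implementations; iCaRL, for instance, selects exemplars by herding rather than at random.

```python
import random

class ExemplarBuffer:
    """Minimal class-balanced exemplar memory (illustrative sketch)."""

    def __init__(self, capacity_per_class=20, seed=0):
        self.capacity = capacity_per_class
        self.store = {}                  # label -> list of stored samples
        self.rng = random.Random(seed)

    def add(self, sample, label):
        bucket = self.store.setdefault(label, [])
        bucket.append(sample)
        if len(bucket) > self.capacity:  # evict a random old exemplar
            bucket.pop(self.rng.randrange(len(bucket)))

    def replay_batch(self, k):
        """Sample up to k old exemplars to mix into the current batch."""
        pool = [(s, y) for y, b in self.store.items() for s in b]
        return self.rng.sample(pool, min(k, len(pool)))
```

Because the buffer holds only a few samples per class, repeated replay of the same exemplars is exactly what can induce the memory overfitting and model bias noted above.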

Following ISL [13], we further apply our method to incremental sub-population learning, as shown in Tab. 2. Sub-population incremental learning is a special case of IIL in which new knowledge comes from new subclasses. Compared with the SOTA ISL [13], our method is notably superior at learning new subclasses over long incremental steps, with a comparably small forgetting rate. Notably, ISL [13] uses the Continual Hyperparameter Framework (CHF) [3] to search for the best learning rate (as low as 0.005 in the 15-step task) for each setting, whereas our method trains from the ISL-pretrained base model with a fixed learning rate (0.05). The low learning rate in ISL reduces forgetting but hinders the learning of new knowledge. The proposed method better balances learning from unseen subclasses and resisting forgetting on seen classes.
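The balance between plasticity and stability described above is what the knowledge-consolidation step (the KCEMA mechanism analyzed in the supplementary material) is meant to provide. The sketch below shows a generic exponential-moving-average parameter merge, which is only an assumption about the general shape of such a mechanism, not the paper's exact rule: a slow "consolidated" copy of the weights tracks the fast weights learned at a fixed learning rate.

```python
def ema_consolidate(slow_params, fast_params, momentum=0.999):
    """Generic EMA-style knowledge consolidation (illustrative sketch).

    slow_params: dict of consolidated weights (resists forgetting).
    fast_params: dict of freshly trained weights (absorbs new knowledge).
    Returns the updated consolidated weights: a convex combination that
    moves slowly toward the fast weights.
    """
    return {name: momentum * slow_params[name]
                  + (1.0 - momentum) * fast_params[name]
            for name in slow_params}
```

With this kind of update, the fast model can use a fixed, relatively high learning rate (e.g. 0.05) to learn new subclasses, while the slow copy changes gradually and thus limits forgetting.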


:::info Authors:

(1) Qiang Nie, Hong Kong University of Science and Technology (Guangzhou);

(2) Weifu Fu, Tencent Youtu Lab;

(3) Yuhuan Lin, Tencent Youtu Lab;

(4) Jialin Li, Tencent Youtu Lab;

(5) Yifeng Zhou, Tencent Youtu Lab;

(6) Yong Liu, Tencent Youtu Lab;

(7) Chengjie Wang, Tencent Youtu Lab.

:::


:::info This paper is available on arxiv under CC BY-NC-ND 4.0 Deed (Attribution-Noncommercial-Noderivs 4.0 International) license.

:::
