#CIFAR10

Negotiated Representations Reduce Overfitting in Machine Learning

Researchers introduced a negotiation‑based method that boosts classification accuracy and cuts overfitting on CIFAR‑10, CIFAR‑100 and MNIST. Published Sep 2025. getnews.me/negotiated-representatio... #negotiatedlearning #overfitting #cifar10

New Tight PAC-Bayesian Certificates Boost Contrastive Learning Theory

Researchers introduced tighter PAC‑Bayesian risk certificates for contrastive learning; CIFAR‑10 tests show the bounds closely match observed empirical errors. Read more: getnews.me/new-tight-pac-bayesian-c... #pacbayesian #cifar10

Lipschitz‑Guided Stochastic Depth Improves Adversarial Robustness

A Lipschitz‑guided stochastic depth schedule applied to Vision Transformer‑Tiny on CIFAR‑10 preserves clean accuracy while improving robustness against FGSM, PGD‑20 and AutoAttack and reducing FLOPs. getnews.me/lipschitz-guided-stochas... #adversarial #cifar10
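The post does not spell out the schedule itself, but the general idea of Lipschitz-guided stochastic depth can be sketched: each residual block gets a drop probability that grows with its estimated Lipschitz constant, so the "sharpest" layers are skipped more often during training. The proportional mapping and the `p_max` cap below are assumptions for illustration, not the paper's actual rule.

```python
def drop_schedule(lipschitz_estimates, p_max=0.5):
    """Map per-layer Lipschitz estimates to stochastic-depth drop
    probabilities (hypothetical scheme: linear in the estimate,
    capped at p_max for the largest layer)."""
    largest = max(lipschitz_estimates)
    return [p_max * l / largest for l in lipschitz_estimates]

def expected_depth_fraction(probs):
    """Fraction of blocks executed in expectation — a rough proxy
    for the FLOPs saved by skipping blocks during training."""
    return sum(1.0 - p for p in probs) / len(probs)

# Example: four transformer blocks with rising Lipschitz estimates.
probs = drop_schedule([1.0, 2.0, 3.0, 4.0])   # → [0.125, 0.25, 0.375, 0.5]
frac = expected_depth_fraction(probs)         # → 0.6875
```

In a real ViT training loop these probabilities would parameterize a per-block skip (e.g. a DropPath-style residual bypass); the sketch only shows how a Lipschitz signal could drive the schedule.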


∇QDARTS: Quantization as an Elastic Dimension to Differentiable NAS

Payman Behnam, Uday Kamal, Sanjana Vijay Ganesh et al.

Action editor: Naigang Wang

https://openreview.net/forum?id=ubrOSWyTS8

#imagenet #cifar10 #optimized


Ended up using this paper as a launchpad: ceur-ws.org/Vol-3741/pap.... They had a nice Jupyter notebook for us to work from. Results were disappointing, but I believe that's because we were working with very small CIFAR10 images. #ViT #transformers #CIFAR10 #DRL
