---
description: A Brief Introduction to Avalanche
---

# Introduction

**Avalanche** was born within [ContinualAI](https://www.continualai.org/) with a clear goal in mind:

> ### _Pushing Continual Learning to the next level, providing a shared and collaborative library for fast prototyping, training and reproducible evaluation of continual learning algorithms._

Like a powerful _avalanche_, a _Continual Learning_ agent _incrementally improves_ its knowledge and skills over time, building upon previously acquired ones and learning how to interact with the external world.

We hope _Avalanche_ may trigger the same _**positive reinforcement loop**_ within our community, moving us towards a more _**collaborative**_ **and inclusive** way of doing research and helping us tackle bigger problems, faster and better, but together! πŸ‘ͺ

{% embed url="https://www.youtube.com/watch?v=EyO1eM0-Hi8" caption="A complete Introduction to Avalanche: an End-to-End Library for Continual Learning." %}

## πŸ’ͺ The Avalanche Advantage

Avalanche offers several advantages:

* **Shared & Coherent Codebase**: Aren't you tired of reinventing the wheel in continual learning? We are. Reproducing paper results has always been daunting in machine learning, and it is even more so in continual learning. _Avalanche_ spares you from rewriting your \(and other people's\) code over and over again, with a coherent and shared codebase that already provides all the utilities, benchmarks, metrics and baselines you may need for your next great continual learning research project!
* **Error Reduction**: The more code we write, the more bugs we introduce. This is the rule, not the exception. _Avalanche_ lets you focus on what really matters: defining your CL solution. Everything from _benchmark_ preparation to _training_, _evaluation_ and _comparison_ with other methods is already there for you. This, in turn, massively reduces the number of errors introduced and the time needed to debug your code.
* **Faster Prototyping**: As researchers or data scientists, we have dozens of ideas every day, and time is always too short to execute them all. However, if we think about it, most of the time spent bringing our ideas to life is consumed by installing software, preparing and cleaning our data, setting up the experiment code infrastructure, and so on. _Avalanche_ lets you focus just on the original algorithmic proposal, taking care of most of the rest!
* **Improved Reproducibility & Portability**: One of the great features of _Avalanche_ is the possibility of easily reproducing experimental results on any OS. Researchers can simply plug their algorithm into the codebase and see how it fares against other researchers' methods. Their algorithm, in turn, serves as a baseline for other methods, creating a virtuous circle. This is only possible thanks to the simple yet powerful idea of providing shared _benchmarks_, _training_ and _evaluation_ in a single place.
* **Improved Modularity**: _Avalanche_ has been designed with modularity in mind. As you learn more about Avalanche, you will realize that we have sometimes forgone simplicity in favor of modularity and reusability \(we hate code replication as much as you do πŸ€ͺ\). However, we believe this will help us scale in the near future as we collaboratively bring this codebase to maturity.
* **Increased Efficiency & Scalability**: Full-stack researchers & data scientists know this: making your algorithm memory- and computationally efficient is tough. _Avalanche_ is already optimized for you, so that you can run your ImageNet continual learning experiment on your 8GB laptop \(buy a cooling fan πŸ’¨\) or even try it on the embedded devices of your latest product!

But most of all, _Avalanche_ can help us standardize our field and work better together, more collaboratively, towards our shared goal of making machines learn over time like humans do.
_Avalanche_ is the first experiment of an **End-to-End Library** for reproducible _continual learning_ research, where you can find _benchmarks_, _algorithms_, _evaluation utilities_ and much more in the same place.

Let's make it together πŸ‘« a wonderful ride! 🎈