
Big-O Notation

2022.10.01

Definition

Big-O is mathematical notation that describes the asymptotic upper bound of a function's growth rate.
In computer science, Big-O notation is used to classify algorithms by how their running time or space requirements grow as the input size grows. Two algorithms that solve the same problem can therefore differ enormously in execution time, depending on their efficiency.
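
Formally, using the standard textbook definition (not specific to this post): for functions $f$ and $g$ defined on the positive integers,

$$
f(n) = O(g(n)) \iff \exists\, c > 0,\ \exists\, n_0 > 0 \ \text{such that}\ f(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0
$$

For example, $3n^2 + 5n + 7 = O(n^2)$, since choosing $c = 15$ and $n_0 = 1$ gives $3n^2 + 5n + 7 \le 15n^2$ for all $n \ge 1$.

As a minimal sketch of how growth rate matters in practice, the hypothetical functions below (invented here for illustration) solve the same problem, detecting a duplicate in a list, with quadratic versus linear time:

```python
def contains_duplicate_quadratic(items):
    """O(n^2): compares every pair of elements."""
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):
            if items[i] == items[j]:
                return True
    return False


def contains_duplicate_linear(items):
    """O(n): a single pass, remembering seen values in a set."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False


# Both return the same answers, but as len(items) grows,
# the first version's work grows with n * n while the
# second's grows with n.
print(contains_duplicate_quadratic([1, 2, 3, 2]))  # True
print(contains_duplicate_linear([1, 2, 3, 4]))     # False
```

Doubling the input size roughly doubles the work for the linear version but roughly quadruples it for the quadratic one, which is exactly the distinction Big-O classes capture.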