F(n), S(n), and T(n) are common notations in computer science for describing and analysing the complexity of algorithms.
F(n) typically represents the function that describes an algorithm's resource usage as a function of the input size n. Most often it models runtime — for example, we might say an algorithm has a runtime of F(n) = 2n^2 + 3n + 5 — but it can stand for any measurable resource, such as time, space, or something else.
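As an illustration, here is a sketch (the algorithm itself is hypothetical) of how a cost function like F(n) = 2n^2 + 3n + 5 can arise: the constant 5 as setup cost, 3 steps per element in one pass, and 2 steps per pair of elements in a nested pass.

```python
def count_steps(n):
    """Count abstract 'steps' for a hypothetical algorithm whose
    cost works out to F(n) = 2n^2 + 3n + 5 (illustrative only)."""
    steps = 5                 # constant setup cost
    for _ in range(n):
        steps += 3            # 3 steps of work per element
    for _ in range(n):
        for _ in range(n):
            steps += 2        # 2 steps of work per pair of elements
    return steps
```

Evaluating count_steps(10) yields 2*100 + 3*10 + 5 = 235, matching the formula.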
S(n) denotes the space complexity of an algorithm: the amount of memory or storage it requires as a function of the input size n. For example, S(n) = O(n) means the space grows linearly with the input size.
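A minimal example of S(n) = O(n): this function allocates a list whose length equals the input size, so its auxiliary space grows linearly with n.

```python
def squares(n):
    # Builds a list of length n, so auxiliary space is S(n) = O(n).
    result = []
    for i in range(n):
        result.append(i * i)
    return result
```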
T(n) refers to the time complexity: the runtime as a function of n. For example, we may state that an algorithm has time complexity T(n) = O(n log n). F(n) and T(n) are very similar and are often used interchangeably when analyzing algorithms.
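A standard example of T(n) = O(n log n) is merge sort, whose divide-and-conquer recurrence T(n) = 2T(n/2) + O(n) solves to O(n log n). A minimal sketch:

```python
def merge_sort(xs):
    # Divide-and-conquer sort: T(n) = 2*T(n/2) + O(n) = O(n log n).
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # Merge the two sorted halves in O(n) time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```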
These notations are used to analyze and compare algorithms in terms of their efficiency and scalability.
Asymptotic notations such as O(n) and Ω(n) are used in conjunction with F(n), S(n), and T(n) to describe an algorithm's growth rate and resource usage as the input size grows to infinity.
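To see asymptotic behaviour numerically, take the earlier example F(n) = 2n^2 + 3n + 5: as n grows, the n^2 term dominates, so F(n) = O(n^2) and the ratio F(n)/n^2 approaches the constant 2.

```python
def f(n):
    # The example cost function F(n) = 2n^2 + 3n + 5.
    return 2 * n**2 + 3 * n + 5

# The ratio f(n) / n^2 tends to 2 as n grows, showing the
# lower-order terms 3n + 5 become negligible: F(n) = O(n^2).
for n in (10, 1_000, 100_000):
    print(n, f(n) / n**2)
```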