Sometimes yes, they are related, and sometimes no, they are not. In fact, we sometimes use more space to get faster algorithms, as in dynamic programming (https:///wiki/tutorial-dynamic-programming). Dynamic programming uses either memoization or a bottom-up approach. The first technique uses memory to remember the solutions to repeated subproblems, so the algorithm need not recompute them; it just looks them up in a table of solutions. The bottom-up approach starts with the small subproblems and builds on them to reach the final solution. Here are two simple examples, one showing a relation between time and space and the other showing no relation. Suppose we want to find the sum of all integers from 1 to a given integer n: code1:
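The code for the two examples seems to be missing from the answer, so here is a minimal sketch of what they could look like (the function names are my own). In the recursive version, time and space grow together; in the iterative version, time grows with n while space stays constant:

```python
def sum_recursive(n):
    # Time O(n) AND space O(n): each recursive call adds a stack frame,
    # so time and space grow together -- a relation between the two.
    if n == 0:
        return 0
    return n + sum_recursive(n - 1)

def sum_iterative(n):
    # Time O(n) but space O(1): a single accumulator variable,
    # no matter how large n is -- no relation between the two.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total
```

(For completeness, the closed form n*(n+1)//2 solves the same problem in O(1) time and O(1) space.)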
Cost functions map the input size to the number of steps needed to solve the problem. When you see the dynamic-programming version of Fibonacci (n steps to compute the table) or the simplest algorithm for testing whether a number is prime (sqrt(n) steps to check the candidate divisors), you may think these algorithms are O(n) or O(sqrt(n)), but this is simply not true, for the following reason: the input to your algorithm is a number n, and in binary notation the input size for an integer n is log2(n). Doing a variable change of s = log2(n), so that n = 2^s, those running times become O(2^s) and O(2^(s/2)) respectively: exponential in the actual input size. Such algorithms are called pseudo-polynomial.
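To make this concrete, here is a small sketch (function name mine) of the bottom-up Fibonacci mentioned above: it performs n iterations, while the binary size of the input is only n.bit_length(), roughly log2(n) bits, so the work is exponential in the input size:

```python
def fib_bottom_up(n):
    # Bottom-up dynamic programming: builds from the smallest
    # values upward, performing n iterations in total.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

n = 1000
steps = n              # iterations performed: linear in the VALUE n
bits = n.bit_length()  # actual input size in binary: about log2(n)
# steps is about 2**bits, i.e. exponential in the input size.
```

So "n steps" sounds linear, but measured against the log2(n)-bit input it is exponential.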