Hello friends. Today we will study how a function grows when we analyze an algorithm. What is the goal of analyzing an algorithm, and why do we need to analyze one at all? The first reason is to compare algorithms, mainly in terms of their running time, but also in terms of other factors, such as memory requirements, programmer effort, and tester effort, when we design an algorithm.

Now, what do we mean by running-time analysis? We have seen that the space and time complexity of an algorithm play a very important role in its analysis. The question that comes to mind is: how does the running time increase as the size of the problem increases? That is, when we take a large real-world problem, how does the running time grow along with it?

The various types of analysis we perform on an algorithm are worst-case analysis, best-case analysis, and average-case analysis.

What do we mean by worst-case analysis? The worst case provides an upper bound on the running time: it is the input for which the algorithm takes the largest amount of time to produce the required output. It is an absolute guarantee that the algorithm will not run longer than this, no matter what the input is.

The second type is the best case. The best case provides a lower bound on the running time: it is the minimum time required to give the desired output, and the input is the one for which the algorithm runs the fastest. Putting the two together, we can write: lower bound <= running time <= upper bound.
The next case is the average-case analysis of an algorithm. What do we mean by the average case? It provides a prediction of the running time: it lies somewhere between the worst-case scenario and the best-case scenario. It assumes a random input, and for that random input we measure the running time, so the average case lies between the upper bound and the lower bound of the analysis.

Now, what are the analysis functions, and how do we define one? The running time is expressed as a function of the input size n, written f(n), and the various functions are compared with respect to their running time. This analysis is independent of machine speed and programming style.

For the cost analysis, the code structure is given to us, we label its steps, and finally we do a simple cost count. The assignment statement sum = 0 is a constant step: we give it the label c1, and since it executes once its total cost is c1. The next step, the loop header for (i = 0; i < n; i++), is given the label c2, and its cost is c2 * (n + 1), because the loop test runs n + 1 times. The third step, the inner loop header for (j = 0; j < n; j++), is again c2, and its cost is c2 * n * (n + 1). The last step is again an assignment, the calculation sum += a[i][j]; it is given the label c3, and its cost is c3 * n^2, because it executes n * n times. So the total cost is c1 + c2 * (n + 1) + c2 * n * (n + 1) + c3 * n^2. This is the cost analysis for the simple code we have taken as an example.
How do we calculate an asymptotic analysis once we have done this cost analysis for a particular piece of code? To compare two algorithms with running times f(n) and g(n), we look at the growth of each function. As we have seen, the running time of an algorithm increases with the input size, so when we compare two algorithms we compare how fast their running-time functions grow. The measure that characterizes how fast each function grows is called asymptotic notation.

Now we will see what the big Omega notation is. Big Omega defines an asymptotic "greater than or equal to": we write f(n) = Omega(g(n)), which implies that f(n) is asymptotically greater than or equal to g(n). In other words, f(n) is Omega(g(n)) if f(n) is asymptotically greater than or equal to g(n), and g(n) is an asymptotic lower bound for f(n). This is represented graphically: as you can see in the graph, the function f(n) stays at or above the growth of g(n).

The third case is the asymptotic Theta notation, which is an asymptotic equality: the function and its bound grow equally. The function f(n) is Theta(g(n)) if f(n) is asymptotically equal to g(n); g(n) is then an asymptotic tight bound for f(n). As shown in the graph, the function and its bound are tightly bound together.

Now, what are Big O and the growth rate? The Big O notation gives an upper bound on the growth rate of a function. Saying f(n) is O(g(n)) means that the growth rate of f(n) is no more than the growth rate of g(n). The Big O notation is used to rank functions according to their growth rate. Big O and the growth rate can also be seen in a table with the headers "f(n) is O(g(n))" and "g(n) is O(f(n))": if g(n) grows faster, then f(n) is O(g(n)) but g(n) is not O(f(n)); if f(n) grows faster, then f(n) is not O(g(n)) but g(n) is O(f(n)); and if both grow at the same rate, then each one is Big O of the other.
Now, the relationship between the various asymptotic notations can be represented graphically. Big O, big Omega, and big Theta form three different regions of the layout: Big O falls on the right-hand side as the upper bound, big Omega falls on the left-hand side as the lower bound, and Theta, in the center, is the tight bound for the function, where the two meet. This gives a very clear picture of how the asymptotic notations are bounded by, and relate to, one another. Thank you.
