Order of growth for g(x) - function

int f (int x)
{
    if (x < 1) return 1;
    else return (f(x-1) + g(x));
}

int g (int x)
{
    if (x < 2) return 2;
    else return (f(x-1) + g(x/2));
}
How to calculate the order of growth for g(x) here?

To find the orders of f and g we have to analyse both; because the recursive calls add to each other in both functions, the dominant term determines the order. For example, between n and n/2, n is the worse one, so the order is n.
Whenever f is called it recurses as f(x-1), so the order is n. In g we have g(x/2), which means the order of g(x/2) is n/2 because x is divided by 2 every time.
So in g we have order n + n/2, and the dominant term is n, so the order of g(x) is n.

Related

Find (num * (pow(b, p) - 1) / den) % mod where p is very large (10^18)

I want to find (num * (pow(b, p) - 1) / den) % mod. I know about binary exponentiation, but we can't apply it directly here. It is guaranteed that the numerator is divisible by the denominator. That means
[num * (pow(b, p) - 1)] % den == 0
constraints on mod: 1 <= mod <= 10^9, and mod might be prime or composite
constraints on b: 1 <= b <= 10
constraints on p: 1 <= p <= 10^18
constraints on num: 1 <= num <= 10^9
constraints on den: 1 <= den <= 10^9
Here pow(b, p) means b raised to the power p (b^p). How can I do this with binary exponentiation?
Your expression should be rewritten to simplify it. First let k = num/den, with k an integer according to your question.
So you have to compute
(k × (b^p - 1)) mod m = ((k mod m) × ((b^p - 1) mod m)) mod m
                      = ((k mod m) × (((b^p mod m) - 1) mod m)) mod m
                      = ((k mod m) × (((b^p mod m) + m - 1) mod m)) mod m   (1)
So the real problem is to compute b^p mod m.
Many languages (Python, Java, etc.) already have modular exponentiation in their standard libraries. Consult the documentation and use it. Otherwise, here is a C implementation.
unsigned long long modexp(unsigned long long b, unsigned long long e, unsigned long long m) {
    if (m == 1) return 0;
    unsigned long long res = 1;
    unsigned long long bb = b % m;      /* bb holds b^(2^t) mod m */
    while (e) {
        if (e & 1)
            res = (res * bb) % m;       /* multiply in when the t-th bit of e is set */
        e >>= 1;
        bb = (bb * bb) % m;
    }
    return res;
}
The implementation uses unsigned long long to fit your constraints. It relies on the classical trick of binary exponentiation: every value b^l with l a power of two (l = 2^t) is computed and stored in the variable bb, and if the corresponding t-th bit of e is set, that value is multiplied into the result. Bit testing is done by checking the successive parities of e while shifting e rightward at each step.
Last, the fact that (a×b) mod m = ((a mod m) × (b mod m)) mod m is used to avoid computations on very large numbers. We always have res < m and bb < m, so res and bb are representable in standard integers.
Then you just have to apply (1) to get the final result.
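For illustration, here is a minimal sketch of how (1) could be wired up with modexp, assuming (as above) that k = num/den is an integer; the helper name solve is arbitrary:
/* Sketch only: combines k = num/den with formula (1).
   Since m <= 1e9, all intermediate products fit in unsigned long long. */
unsigned long long solve(unsigned long long num, unsigned long long den,
                         unsigned long long b, unsigned long long p,
                         unsigned long long m) {
    unsigned long long k = num / den;                       /* integer by assumption */
    unsigned long long t = (modexp(b, p, m) + m - 1) % m;   /* (b^p - 1) mod m */
    return ((k % m) * t) % m;
}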
EDIT, following the clarifications given in the comments.
To compute n = ((3^p - 1)/2) mod m, one can remark that
(3^p - 1)/2 = x*m + n   (as 3^p - 1 is even, x is an integer and 0 <= n < m)
3^p - 1 = x*2*m + 2n    (0 <= 2n < 2m)
so 2n = (3^p - 1) mod 2m
We can just apply the previous method with a modulus of 2*m and divide the result (which will be even) by 2.
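A sketch of that trick, assuming b = 3 and den = 2 as in the comments (the function name is only for illustration):
/* Sketch: computes ((3^p - 1) / 2) mod m using a modulus of 2*m.
   2*m <= 2e9, so squaring inside modexp still fits in unsigned long long. */
unsigned long long half_pow3_minus1_mod(unsigned long long p, unsigned long long m) {
    unsigned long long two_n = (modexp(3, p, 2*m) + 2*m - 1) % (2*m);  /* (3^p - 1) mod 2m */
    return two_n / 2;                                                  /* 2n is even, so exact */
}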

What's the time complexity of the following code?

What's the time complexity of the following code?
a = 2;
while (a <= n)
{
    for (k = 1; k <= n; k++)
    {
        b = n;
        while (b > 1)
            b = b / 2;
    }
    a = a * a * a;
}
I'm struggling with the outer while loop, which is loglogn; I can't understand why. How would the time complexity change if the last line were a = a * a * a * a;?
The for loop is O(n), and the inner one is O(logn).
So in total, O(n * logn * loglogn).
The values of a would be:
a = 2, 2^3, 2^9, 2^27, 2^81, ...
and so on.
Now let's assume that the last value of a is 2^(3^k)
Where k is the number of iterations of the outer while loop.
For simplicity let's assume that a = n^3, so 2^(3^k) = n^3
So 3^k = 3*log_2(n) => k = log_3(3log_2(n)) = 𝛩(loglogn)
If the last line was a = a * a * a * a the time-complexity would remain 𝛩(loglogn) because k = log_4(4log_2(n)) = 𝛩(loglogn).
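As a quick sanity check (a rough sketch, not part of the original answer), one can count the outer-loop iterations directly and watch them grow like loglogn. Tracking only the exponent e with a = 2^e avoids integer overflow for large n:
#include <stdio.h>
#include <math.h>

/* Counts iterations of the outer while loop for a given n. */
long long outer_iterations(double n) {
    long long count = 0;
    double e = 1.0;                  /* a = 2^e, starting from a = 2 */
    while (e <= log2(n)) {           /* a <= n  is equivalent to  e <= log2(n) */
        count++;
        e *= 3.0;                    /* a = a*a*a triples the exponent */
    }
    return count;
}

int main(void) {
    double ns[] = {10, 1e3, 1e6, 1e9, 1e18};
    for (int i = 0; i < 5; i++)
        printf("n = %.0f -> %lld outer iterations\n", ns[i], outer_iterations(ns[i]));
    return 0;
}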
The for loop runs n times and the inner loop has time complexity log n, so the total time complexity is O(n log n).

What is the time complexity of the following pseudocode?

XYZ(a, b, c, m, n) {
    For p = 1 to m do
        For q = p to n do
            c[p,q] = a[p,q] + b[p,q];
}
I think it is n + (n-1) + (n-2) + ... + (n-m+1), but I am not sure. Is it this, or m*n?
Let's simplify your code :
For p from 1 to m
    For q from p to n
        Do something
Assuming the Do something part is done in constant time, what determines the time complexity of the code are the two loops. The outer loop runs m times, while the inner loop runs n-p+1 times, with p going from 1 to m.
If m >= n, the Do something part is repeated n+(n-1)+...+1 = n*(n+1)/2 = n²/2 + n/2 = O(n²) times.
Otherwise, if n > m, it's repeated n+(n-1)+...+(n-m+1) = (n*(n+1) - (n-m)*(n-m+1))/2 = 1/2 * (n² + n - n² + 2*n*m - n - m² + m) = O(2*n*m - m²) = O(n²) times.
In any case, O(n²) is a right answer, but if n >> m, a more precise answer is O(n*m).
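A small sketch (not from the answer above) comparing the brute-force iteration count with the closed form for the n > m case, using hypothetical sizes:
#include <stdio.h>

/* Counts how many times the "Do something" line runs. */
long long count_iterations(long long m, long long n) {
    long long count = 0;
    for (long long p = 1; p <= m; p++)
        for (long long q = p; q <= n; q++)
            count++;
    return count;
}

int main(void) {
    long long m = 100, n = 1000;   /* hypothetical sizes with n > m */
    long long closed = (n * (n + 1) - (n - m) * (n - m + 1)) / 2;
    printf("loop count = %lld, closed form = %lld\n", count_iterations(m, n), closed);
    return 0;
}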

How to find the time complexity of nested for-loop

What is the time complexity for the nested loops shown below:
1)
for (int i = 1; i <= n; i += 2) {
    for (int j = 1; j <= n; j += 2) {
        // some O(1) expressions
    }
}
2)
for (int i = 1; i <= n; i += 3) {
    for (int j = 1; j <= n; j += 3) {
        // some O(1) expressions
    }
}
In general:
for (int i = 1; i <= n; i += c) {
    for (int j = 1; j <= n; j += c) {
        // some O(1) expressions
    }
}
Is it really the following: O(nc)?
Your algorithm will execute (n / c) * (n / c) iterations. We're dividing because we skip c values of the counter on each iteration. See that:
for (var i = 0; i <= n; i = i + 1)
Will have n / 1 iterations
for (var i = 0; i <= n; i = i + 2)
Will have n / 2 iterations
*Note that the result will be floored. That is, if n = 3 and c = 2, it will execute only one time (floor(3 / 2) == 1)
So, we can generalize it to be
(n/c)² = n²/c² = (1/c²) * n²
Remember that Big O is only interested in the rate of growth. Since c is a constant, it is ignored from the calculation.
So, the result is:
O((1/c²) * n²) = O(n²)
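As an illustrative sketch (not part of the answer above), counting the iterations directly and comparing them to (n/c)²:
#include <stdio.h>

/* Counts iterations of the doubly nested loop with step c. */
long long count_nested(int n, int c) {
    long long count = 0;
    for (int i = 1; i <= n; i += c)
        for (int j = 1; j <= n; j += c)
            count++;
    return count;
}

int main(void) {
    int n = 1000;                          /* hypothetical size */
    for (int c = 1; c <= 4; c++)
        printf("c = %d: %lld iterations (roughly (n/c)^2 = %lld)\n",
               c, count_nested(n, c), (long long)(n / c) * (n / c));
    return 0;
}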
For the general case, the inner loop is O(n) and the outer loop is O(n) (c does not matter for the order of complexity and can be treated as if it were 1). Therefore, for each iteration of the outer loop, the inner loop iterates n times. If the outer loop iterates n times, the total number of iterations of the inner loop is n*n, or O(n^2).
Imagine there are 10 chairs (n here)
In one for loop you iterate over all the chairs; say you sit on each of them, so in total you need to sit 10 times for a single loop.
Now imagine you sit on the first chair and ask your friend to sit on the chairs one by one, including yours, so in total your friend has to sit on 10 chairs.
Then you choose the second chair and again ask your friend to sit on each chair, so he again has to sit on 10 chairs.
Similarly you can choose the 3rd, 4th, ... chair, and so on; in total your friend has to sit on 10 chairs for each chair you choose.
10 + 10 + ... = 100 times
which is equivalent to 10^2 = 100
So the complexity is O(n^2), where n is the number of chairs.

Blending Function/Bezier

Am I calculating the Bezier blend wrong? Any help would be appreciated.
Thank you very much.
double bezierBlend(int i, double u, int m) {
    double blend = 1;
    blend = factorial(m) * pow(u, i) * pow(1 - u, (m - i)) / (factorial(i) * factorial(m - i));
    return blend;
}
Here's a sample to compute the Bezier blend function, following directly from the formulation:
double choose(long n, long k)
{
    long j;
    double a;

    a = 1;
    for (j = k + 1; j <= n; j++)
        a *= j;
    for (j = 1; j <= n - k; j++)
        a /= j;
    return a;
}

double bezierBlend(int i, double t, int n)
{
    return choose(n, i) * pow(1 - t, n - i) * pow(t, i);
}
For most applications though, computing the powers and the binomial coefficients each time is absurdly inefficient. In typical applications, the degree of the curve is constant (e.g., 2 for quadratic or 3 for cubic), and you can compute the function much more efficiently by pre-expanding the formula. Here's an example for cubic curves:
double BezCoef(int i, double t)
{
    double tmp = 1 - t;
    switch (i)
    {
        case 0: return tmp*tmp*tmp;
        case 1: return 3*tmp*tmp*t;
        case 2: return 3*tmp*t*t;
        case 3: return t*t*t;
    }
    return 0; // not reached
}
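As a usage sketch (assuming 2D control points; the Point type and the evalCubicBezier name are just for illustration), a point on the cubic curve is the blend-weighted sum of the four control points:
typedef struct { double x, y; } Point;

/* Evaluates a cubic Bezier curve at parameter t in [0,1],
   given its 4 control points, using the BezCoef blend weights above. */
Point evalCubicBezier(const Point ctrl[4], double t)
{
    Point p = {0.0, 0.0};
    for (int i = 0; i <= 3; i++) {
        double w = BezCoef(i, t);
        p.x += w * ctrl[i].x;
        p.y += w * ctrl[i].y;
    }
    return p;
}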