Problems about asymptotic-complexity - Collection of common programming errors
templatetypedef
danben
runtime np-complete np-hard asymptotic-complexity
Is there some language that is NP-complete but for which we know a "quick" algorithm? I don't mean ones like knapsack, where we can do well on average; I mean that even in the worst case the runtime is something like 2^(n^epsilon), where the result holds for any epsilon > 0, so we can let epsilon get arbitrarily close to 0.
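For reference, the runtime class being asked about has a standard formalization (my framing, not part of the original question):

\mathrm{SUBEXP} \;=\; \bigcap_{\varepsilon > 0} \mathrm{TIME}\!\left(2^{n^{\varepsilon}}\right)

so the question is whether any NP-complete language lies in SUBEXP.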
Jan Hoffmann
runtime time-complexity asymptotic-complexity
Is somebody aware of a natural program or algorithm that has non-monotone worst-case behavior? By non-monotone worst-case behavior I mean that there is a natural number n such that the worst-case runtime for inputs of size n+1 is less than the worst-case runtime for inputs of size n. Of course, it is easy to construct a program with this behavior. It might even be the case that this happens for small n (like n = 1) in natural programs. But I'm interested in a useful algorithm whose worst case is non-monotone.
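For contrast, here is the kind of artificial construction the question explicitly sets aside: a contrived Python function (my own illustration, with made-up names) whose worst case is Theta(n^2) on even input sizes but only Theta(n) on odd ones, so the worst case for size n+1 drops below that for size n whenever n is even:

def contrived(xs):
    # Worst-case runtime is deliberately non-monotone in len(xs).
    n = len(xs)
    if n % 2 == 0:
        total = 0
        for a in xs:          # even sizes: quadratic pass over all pairs
            for b in xs:
                total += a * b
        return total
    return sum(xs)            # odd sizes: a single linear pass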
devjeetroy
algorithm asymptotic-complexity
I was looking for a good resource on asymptotic analysis. I am not looking for a book that merely tells me "the runtime of this algorithm is O(N)"; I want one that teaches me how to actually analyze algorithms and derive their runtimes. The book should preferably include practice problems, with questions like "Prove that O(log N) = O(log(N*N))" or similar. I have been trying to find such a book for a while now, and all I am left with are books that just skim over the topic.
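As a taste of the kind of exercise mentioned, the quoted log identity can be settled in one line (the standard argument):

\log(N \cdot N) = \log(N^2) = 2\log N, \qquad \text{so } O(\log(N^2)) = O(2\log N) = O(\log N).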
Bart
asymptotic-complexity
I just started a course on asymptotic analysis, and in one of our assignments I am supposed to add functionality to a function without changing its complexity. The complexity is log(N). The homework guideline asks me specifically to change the runtime by a "constant". Would making it 3*log(N) be considered changing it by a constant?
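One relevant definition-level fact: Big-O absorbs multiplicative constants, which is exactly what the assignment is probing:

3\log N \le c\,\log N \;\text{ for } c = 3 \text{ and all } N \ge 1, \qquad \text{hence } 3\log N \in O(\log N).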
Bill the Lizard
c algorithm asymptotic-complexity
I have a question that I have thought over again and again without getting anywhere, so I am posting it here in the hope of other viewpoints. The question: we are given a SORTED array consisting of values that each occur an EVEN number of times, except for one value, which occurs an ODD number of times. We need to find that value in O(log n) time. It is easy to find the solution in O(n) time, but it looks pretty tricky to do in O(log n) time.
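A minimal sketch of the usual binary-search idea, under the simplifying assumption (the common textbook variant) that every value appears exactly twice except one that appears once; the general even-count case additionally needs a parity check on run boundaries:

def find_odd_occurrence(a):
    # a is sorted; every value appears twice except one lone value.
    lo, hi = 0, len(a) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        mid -= mid % 2           # align to an even index so pairs start here
        if a[mid] == a[mid + 1]:
            lo = mid + 2         # pair intact: lone value lies to the right
        else:
            hi = mid             # pair broken: lone value is at mid or left
    return a[lo]

Each step halves the window, so this runs in O(log n).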
ThiefMaster
java big-o code-analysis asymptotic-complexity
The method hasTwoTrueValues returns true if at least two values in an array of booleans are true. Provide the Big-O running time for all three implementations proposed.

// Version 1
public boolean hasTwoTrueValues(boolean[] arr) {
    int count = 0;
    for (int i = 0; i < arr.length; i++)
        if (arr[i])
            count++;
    return count >= 2;
}

// Version 2
public boolean hasTwoTrueValues(boolean[] arr) {
    for (int i = 0; i < arr.length; i++)
        for (int j = i + 1; j < arr.length; j++)
            if (arr[i] && arr[j])
                return true;
    return false;
}
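The third version is lost to the truncation above; purely for comparison (my own illustration in Python, not the missing Version 3), a linear scan that stops as soon as two true values have been seen:

def has_two_true_values(arr):
    count = 0
    for value in arr:
        if value:
            count += 1
            if count >= 2:
                return True      # early exit; still O(N) in the worst case
    return False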
Ian R
big-o quicksort asymptotic-complexity big-theta
If so, can you provide explicit examples? I understand that an algorithm like Quicksort can have O(n log n) expected running time but O(n^2) in the worst case. I presume that if the same expected/worst-case principle applies to Theta, then the question above could be false. Understanding how Theta works will help me understand the relationship between Theta and Big-O.
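To make the Quicksort example concrete: on an already-sorted input with a last-element pivot, each partition peels off one element, giving the standard worst-case recurrence

T(n) = T(n-1) + \Theta(n) \;\Longrightarrow\; T(n) = \Theta(n^2),

while the expected running time over random pivots is \Theta(n \log n); the two cases get separate Theta bounds.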
rambo coder
python algorithm computer-science asymptotic-complexity
I want to estimate the running time O(n, x) = Theta(n, x) of a given algorithm that depends on both n and x, from a large number (> 100) of samples (how long the algorithm takes for given n and x). Is there actually a way to do this? I know that the running time increases as n and x (!) increase, but I think the dependencies are too complex to figure out O(n, x) by hand, since n or x may interact like n^x, or even worse. By the way, my favored languages for solving this problem are Python or PHP.
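One pragmatic approach, sketched under the assumption that the runtime is roughly a power law t ~ C * n^a * x^b (the model choice and all names here are mine; if the true dependence is something like n^x, this fit will betray itself through large residuals): take logs and solve an ordinary least-squares problem.

import numpy as np

def fit_power_law(samples):
    # samples: iterable of (n, x, t) measurements; fits t ~ C * n**a * x**b.
    n, x, t = (np.array(col, dtype=float) for col in zip(*samples))
    # log t = log C + a*log n + b*log x  ->  linear least squares
    A = np.column_stack([np.ones_like(n), np.log(n), np.log(x)])
    coef, residuals, _, _ = np.linalg.lstsq(A, np.log(t), rcond=None)
    log_C, a, b = coef
    return np.exp(log_C), a, b, residuals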
user1798362
performance algorithm runtime time-complexity asymptotic-complexity
Let me ask it briefly: I have an algorithm, written as a function, let's call it f:

void f(int[1..N]) {
    // algorithm goes here
}

Now, I have the real runtime for an input of size N. Please assume that the function time() returns the current system time in milliseconds.

int[1..N] n;
unsigned long start = time(), end;
f(n);
end = time();
printf("Real runtime: %lu", end - start);

So in other words, I know how many milliseconds f will run for an input of size N. Given this information, how can I calculate f's runtime complexity?
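A single timing at one N cannot pin down a complexity class; a common empirical trick (a sketch with a made-up harness, not a rigorous method) is the doubling approach: time f at N, 2N, 4N, ... and inspect consecutive ratios. A ratio near 2 suggests Theta(N), near 4 suggests Theta(N^2), near 8 suggests Theta(N^3).

import time

def doubling_ratios(f, make_input, sizes):
    # Time f on inputs of each size and return t(next)/t(prev) ratios.
    times = []
    for n in sizes:                  # e.g. sizes = [1000, 2000, 4000, 8000]
        data = make_input(n)
        start = time.perf_counter()
        f(data)
        times.append(time.perf_counter() - start)
    return [t2 / t1 for t1, t2 in zip(times, times[1:])]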
templatetypedef
algorithm runtime big-o time-complexity asymptotic-complexity
I've written a function append() in Java and I need to analyze its runtime complexity in terms of O(N) and Omega(N). This is the original question: Suppose append()'s runtime complexity is t = O(N); this means that t can also be represented as t = C*N (where C is a constant). Therefore (t / N) = C. If, for example, t = O(N^2), then (t / N^2) = C, and so on. Use this method to find append()'s runtime complexity. So I ran append() N times for 3 different Ns: 1,000, 5,000, 10,000:

long start = System.currentTimeMillis();
for (int i = 0; i < N; i++)
    append(...);
long end = System.currentTimeMillis();
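The quotient method the assignment describes can be automated: divide each measured time by candidate growth functions and see which column of quotients stays roughly constant. A sketch (the candidate list and all names are mine):

import math

def quotient_table(measurements):
    # measurements: list of (N, t) pairs; prints t / g(N) per candidate g.
    candidates = [
        ("N",      lambda n: n),
        ("N*logN", lambda n: n * math.log(n)),
        ("N^2",    lambda n: n * n),
    ]
    for name, g in candidates:
        quotients = [t / g(n) for n, t in measurements]
        print(name, ["%.3g" % q for q in quotients])  # flattest row wins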
user1408872
runtime asymptotic-complexity insertion-sort finger-tree
I've searched high and low in my book as well as on several sites on the internet, but I'm just not entirely sure about my answers. I need to give the asymptotic runtimes of InsertionSort and FingerTreeSort (based on red-black trees) with respect to the number of inversions INV present. InsertionSort runs in O(n + INV) time and FingerTreeSort runs in O(n + n*lg(INV/n + 1)) time. I need to give asymptotic runtimes for INV = 0, n, n^1.5 and n^2/4. What I've come up with myself is that InsertionSort runs in O(n), O(n), O(n^2) and ...
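Plugging the four INV values into the two stated bounds gives (my arithmetic, worth double-checking):

\begin{aligned}
\text{InsertionSort, } O(n + \mathrm{INV}):\;& O(n),\; O(n),\; O(n^{1.5}),\; O(n^2)\\
\text{FingerTreeSort, } O\bigl(n + n\lg(\mathrm{INV}/n + 1)\bigr):\;& O(n),\; O(n),\; O(n\lg n),\; O(n\lg n)
\end{aligned}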
templatetypedef
Andrew
runtime asymptotic-complexity
Can anyone help me analyze the running time of the following pseudocode?

for (i = 0; i < n*n*n; i++)
    for (j = i; j < n; j++)
        x++

The way I see it, it's Omega(n^3) as a lower bound, since that is what it would be if the body of the outer for-loop were just Theta(1). I'm getting confused by the inner loop, which only runs during the first n iterations of the outer loop. Do I just average the runtime of the inner loop, n^3 * ((1/n^2)*n + (1/n)*1), in which case it's O(n^3)?
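Exact counting settles it: only the outer iterations with i < n do any inner work, so the inner loop contributes

\sum_{i=0}^{n-1} (n - i) \;=\; \frac{n(n+1)}{2} \;=\; \Theta(n^2),

while the outer loop's own control costs \Theta(n^3); the total is therefore \Theta(n^3), matching both the \Omega(n^3) lower bound and the O(n^3) guess.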
perimosocordiae
python complexity time-complexity asymptotic-complexity sympy
I'm interested in programming languages that can reason about their own time complexity. To this end, it would be quite useful to have some way of representing time complexity programmatically, which would allow me to do things like:

f_time = O(n)
g_time = O(n^2)
h_time = O(sqrt(n))

fastest_asymptotically = min(f_time, g_time, h_time)  # = h_time
total_time = f_time.inside(g_time).followed_by(h_time)  # = O(n^3)

I'm using Python at the moment, but I'm not particularly tied to a language. I've experimented a bit with SymPy.
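Since the sympy tag is already on the question: SymPy's Order class covers part of this wish list out of the box (a sketch assuming a recent SymPy; the min() and inside()/followed_by() combinators imagined above would still need custom wrapping):

from sympy import Order, oo, sqrt, symbols

n = symbols('n', positive=True)

f_time = Order(n, (n, oo))
g_time = Order(n**2, (n, oo))
h_time = Order(sqrt(n), (n, oo))

print(f_time + g_time)           # O(n**2, (n, oo)): addition absorbs lower orders
print(f_time * g_time)           # O(n**3, (n, oo)): multiplication models nesting
print(g_time.contains(f_time))   # True: O(n) lies within O(n**2)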
sasha.sochka
c++ c printf big-o asymptotic-complexity
Assuming that I'm printing a string as follows:

printf("%s", s);

what can we assume the asymptotic complexity of this call to be? Is it O(n), where n is strlen(s), the string's length? Or is it somehow O(1), constant time? Or something different? I suppose you'd need to know how printf tends to be implemented, however. Any insight is appreciated! (I should clarify that I'm talking about C rather than C++, but I doubt they're implemented differently.) Edit: added the format string to printf()