Forums before death by AOL, social media and spammers... "We can't have nice things"
|    soc.culture.quebec    |    More than just pale imitations of France    |    108,435 messages    |
[   << oldest   |   < older   |   list   |   newer >   |   newest >>   ]
|    Message 106,693 of 108,435    |
|    Wisdom90 to All    |
|    Read again, i correct.. (1/2)    |
|    11 Jan 20 17:58:52    |
From: d@d.d

Hello,

Read this:

Read again, i correct..

I will add more logical rigor to my post about computational complexity:

As you have just noticed in my previous post (read below), i said the following: that time complexities such as n*log(n) and n are fuzzy.

But we have to be more logically rigorous: what can we take as enough precision?

I think that when we say 2+2=4, there is no missing part of precision, so this fact has enough precision. But if we give a time complexity of n*log(n), there is a missing part of precision, because n*log(n) depends on reality: in this case it needs more precision about the exact resistance of the algorithm (read below my analogy with material resistance). This is why we can affirm that time complexities such as n*log(n) and n are fuzzy: there is a missing part of precision. But even though part of the precision is missing, there is enough precision to predict that their resistance in reality is an average resistance.
Read the rest of my previous thoughts to understand:

I correct one last typo; here is my final post about computational complexity:

I continue about computational complexity by being more and more rigorous, read again:

I said previously (read below) that time complexities such as n*log(n) and n are fuzzy, because we can say that n*log(n) is an average resistance (read below to understand the analogy with material resistance), or we can say that n*log(n) is faster than a quadratic or exponential complexity, but given a time complexity of n or n*log(n) we cannot say exactly how fast the algorithm is for a given input size n. Since it is not an exact prediction, it is fuzzy. But this level of fuzziness, like in the example below of the obese person, permits us to predict important things about reality, and this level of fuzziness of computational complexity is also science, because it is like probability calculations that permit us to predict: computational complexity can predict whether the resistance of the algorithm is high, low or average (by analogy with material resistance, read below to understand).

Read my previous thoughts to understand:

What is science? And is computational complexity science?

You have just seen me talking about computational complexity, but we need to answer the questions: What is science? And is computational complexity science?
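The "missing part of precision" above can be sketched in code. This is my own illustration, not from the original post: two routines that both belong to the same O(n) class can do very different amounts of work per element, so the complexity class alone cannot predict the exact running time for a given n; it only predicts the shape of the growth.

```python
# Illustrative sketch (names are my own): two O(n) routines whose
# hidden constant factors differ by 100x. Big-O hides this constant,
# which is exactly the "missing part of precision" discussed above.

def count_ops_cheap(n):
    """O(n) routine: 1 unit of work per element."""
    ops = 0
    for _ in range(n):
        ops += 1
    return ops

def count_ops_costly(n):
    """Also O(n), but 100 units of work per element."""
    ops = 0
    for _ in range(n):
        ops += 100
    return ops

n = 10_000
print(count_ops_cheap(n), count_ops_costly(n))  # same class, 100x apart
```

Both totals grow linearly in n, so the fuzzy prediction "average resistance when n grows large" holds for both, even though their exact costs differ.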
I think that we have to be smarter, because there are higher-level abstractions in science. At those levels we can have the exact precision of science, but we can also have fuzzier precision that is useful and that is also science. To understand me better, let me give you an example:

If i say that a person is obese, then he has a high risk of getting a disease because he is obese.

Now you understand that with this abstraction we do not have exact precision, we are fuzzier, but this fuzziness is useful and its level of precision is also useful. Is it science? I think these probabilistic calculations are also science, since they permit us to predict that the obese person has a high risk of getting a disease. These probabilistic calculations are like higher-level abstractions that lack exact precision but are still useful precision. This is what computational complexity and its higher-level abstractions look like: you immediately understand that a time complexity of O(n*log(n)) or O(n) is like an average level of resistance (read below to know why i call it resistance, by analogy with material resistance) when n grows large; an exponential time complexity is a low level of resistance when n grows large; and a log(n) time complexity is a high level of resistance when n grows large. So those time complexities are like higher-level abstractions that are fuzzy, but their fuzziness, like in the example above of the obese person, permits us to predict important things about reality, and this level of fuzziness of computational complexity is also science, because it is like probability calculations that permit us to predict.
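The resistance levels described above can be made concrete with a small sketch (my own illustration, not from the post): tabulate the abstract step counts that each complexity class predicts as n grows. Slower growth in steps corresponds to higher resistance in the post's analogy.

```python
import math

# Sketch: predicted step counts for the complexity classes discussed
# above. In the post's analogy, log(n) has high resistance, n*log(n)
# average resistance, and 2^n low resistance when n grows large.

def predicted_steps(n):
    """Return (log n, n*log n, n^2, 2^n) step counts for input size n."""
    return (math.log2(n), n * math.log2(n), float(n) ** 2, 2.0 ** n)

for n in (10, 20, 40):
    log_n, n_log_n, quad, expo = predicted_steps(n)
    print(f"n={n:>2}: log(n)={log_n:6.1f}  n*log(n)={n_log_n:8.1f}  "
          f"n^2={quad:8.0f}  2^n={expo:.3g}")
```

Doubling n from 20 to 40 barely moves log(n), quadruples n^2, and multiplies 2^n by about a million, which is the fuzzy but useful prediction the post is describing.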
Read the rest of my previous thoughts to understand better:

The why of computational complexity..

Here is my previous answer about computational complexity, and the rest of my current answer is below:

=====================================================================
Horand gassmann wrote:

"Where your argument becomes impractical is in the statement "n becomes large". This is simply not precise enough for practical use. There is a break-even point, call it n_0, but it cannot be computed from the Big-O alone. And even if you can compute n_0, what if it turns out that the breakeven point is larger than a googolplex? That would be interesting theoretically, but practically --- not so much."

I don't agree. Take a look below at how i computed the binary search time complexity: it is a divide-and-conquer algorithm, and it is log(n). We can notice that log(n) is good when n becomes large, so this information is practical, because a log(n) time complexity is excellent in practice when n becomes large. And when you look at insertion sort, you will notice that it has a quadratic time complexity of n^2; here again you can feel that this is practical information, because a quadratic time complexity is not so good when n becomes large, so you can say that n^2 is not so good in practice when n becomes large. So, as you are noticing, time complexities of log(n) and n^2 are useful in practice, and for the rest of the time complexities you can also benchmark the algorithm in the real world to get an idea of how it performs.
=================================================================

I think i am understanding Lemire and Horand gassmann better: they say that if it is not the exact needed practical precision, then it is not science or engineering. But i don't agree with this, because science and engineering can work with higher-level abstractions that are not exact practical-precision calculations but can still be useful precision in practice; it is like a fuzzy precision that is useful. This is why i think that probabilistic

[continued in next message]

--- SoupGate-Win32 v1.05
 * Origin: you cannot sedate... all the things you hate (1:229/2)    |
(c) 1994, bbs@darkrealms.ca