
My code:

s = '101000101'
for i in range(len(s) - 1):               # check up to the second-to-last element
    if s[i] == '0' and s[i+1] == '1':
        print("Exists")
        break

Built-in function:

s = '101000101'
if '01' in s:
    print("Exists")

I have run both versions on hundreds of large strings, and the outcomes differ hugely.

My code time: 1.26

Built-in code time: 0.02

To my knowledge, I have used the simplest algorithm the task requires, and I don't think a better way to search exists. So why am I seeing such a large difference between the two?
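For reference, one way to reproduce a comparison like this is with the standard `timeit` module. The test string and repeat count below are illustrative, not the original benchmark; `'01'` is placed only at the very end so neither version can exit early:

```python
import timeit

# '01' appears only at the very end, so both versions must scan the whole string
s = '1' * 100_000 + '01'

def manual_scan(text):
    # High-level loop: every index, compare, and jump is a separate interpreted step
    for i in range(len(text) - 1):
        if text[i] == '0' and text[i + 1] == '1':
            return True
    return False

def builtin_scan(text):
    # 'in' delegates the whole search to CPython's C substring routine
    return '01' in text

print('manual  :', timeit.timeit(lambda: manual_scan(s), number=20))
print('builtin :', timeit.timeit(lambda: builtin_scan(s), number=20))
```

Both functions return the same answer; only the time taken differs, which is the point of the question.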

  • 6
    Python built-in operators are often implemented more efficiently, potentially in C. – Lcj Apr 18 '21 at 14:18
  • 4
    Hint: builtin functions are not written in python – Stefan Apr 18 '21 at 14:18
  • The answers here explain it https://stackoverflow.com/questions/30081275/why-is-1000000000000000-in-range1000000000000001-so-fast-in-python-3 – Ron Serruya Apr 18 '21 at 14:20
  • @RonSerruya - i don't think that answers this question. The link you posted is about generators, but generators aren't really relevant here. – jakub Apr 18 '21 at 14:24
  • @Lcj Your reply totally solved the question and I am laughing so hard at python's poor performance :) – stack aayush Apr 18 '21 at 14:26
  • You will probably see closer results if you run the test in PyPy. But the builtin operator should still be faster. – interjay Apr 18 '21 at 14:27
  • @jakub Yeah that isn't my question but still, I have read it and is useful for me :) – stack aayush Apr 18 '21 at 14:28
  • 2
    *"laughing so hard at python's poor performance"* — Depends on what you mean by that. Python performs fine if you use the builtin `in` operator. If you reimplement that operation in high level Python code, you're not really comparing apples to apples. Python frees you from writing a _ton_ of boilerplate code and allows you to implement _an_ algorithm in little more than is required for an English sentence. The underlying C implementation is probably a lot more complex. When writing in a high level language, you're always trading speed for convenience. – deceze Apr 18 '21 at 14:36
  • 2
    This. You don't normally do complex computations in python. The general idea is more that you *control* those complex computations in python. That's why libs such as numpy, pandas or tensorflow synergize so well, you get the performance of optimized, compiled algorithms and the flexibility of writing the high-level workflow in python. – spectras Apr 18 '21 at 14:40
  • I got a complete guide and this answer is solved by your comments. How can I accept and close this question for further comments? – stack aayush Apr 19 '21 at 03:42
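The interpreter overhead the comments describe can be made visible with the standard `dis` module (a sketch, not part of the original question): disassembling the hand-written loop shows that each subscript, comparison, and branch is a separate bytecode instruction the interpreter must dispatch, whereas `'01' in s` compiles to a single containment operation handled in C.

```python
import dis

def manual_scan(text):
    for i in range(len(text) - 1):
        if text[i] == '0' and text[i + 1] == '1':
            return True
    return False

# Each subscript, comparison, and jump appears as its own interpreted instruction
dis.dis(manual_scan)
```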
