TransWikia.com

Finding a Good Estimate for Amount of Time Computers Spend Sorting Lists of What Lengths?

Computer Science Asked on December 19, 2021

I have an assignment to envision and calculate the possible effects that the implementation of a general-purpose sorting algorithm running in O(n) time and O(1) space (in the general case) would have on computation and society.

In order to do that, I am trying to estimate how much computation time computers in server farms spend on sorting, and what orders of magnitude the lengths of the lists being sorted generally fall into. Server farms seem to be the most impactful area to analyze, but if there are other areas where people would expect an outsize effect, I would appreciate having them called out.

I can then present sets of assumptions about the list length at which the "new" algorithm overtakes the current state of the art, and by how much. E.g.: "If the new algorithm breaks even with the state of the art for lists of 500 billion items, its running time grows with n while the state of the art grows with n log n, and sorts of half a trillion items or more take X% of server-farm computation, then you could expect computation time for the same tasks to be reduced by Y%." Once I get some firm numbers (not the question here) on how much electricity server farms use, I can also make electricity-cost comparisons.
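The best-case factor in that comparison can be sketched numerically: if the new algorithm breaks even with an n log n sort at some list length n, then (ignoring constant factors) the most it can save at that length is the log n multiplier itself. A minimal sketch of that arithmetic; the 500-billion-item break-even point is taken from the example above, everything else is illustrative:

```python
import math

def best_case_speedup(n: int) -> float:
    """Ratio of an n*log2(n) cost model to a linear n cost model.

    Constant factors are ignored, so this is only an upper bound on
    the speedup an O(n) sort could offer over an O(n log n) sort.
    """
    return math.log2(n)

# For the 500-billion-item break-even point in the question:
n = 500_000_000_000
print(f"log2({n:.0e}) = {best_case_speedup(n):.1f}")  # about 38.9, so under 40x at best
```

So even at half a trillion items, the hypothetical algorithm could cut sorting time by at most a factor of about 39 before constant factors are considered.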

When I researched this on the web, I found that an O(n)-time, O(1)-space sorting algorithm is considered theoretically possible, but has not been discovered to this point. I suspect there might not be much impact on computation, because there seem to be no incentives put out for someone to come up with such an algorithm (or a proof that it cannot exist).

What is the fastest sorting algorithm for an array of integers?
https://en.wikipedia.org/wiki/Sorting_algorithm
http://staff.ustc.edu.cn/~csli/graduate/algorithms/book6/chap09.htm#:~:text=We%20have%20now%20introduced%20several,quicksort%20achieves%20it%20on%20average

I have presumed that the sorting algorithms in general use, at least for large lists like web search results, are O(n log n) time and O(1) space. If this is in error, please point it out!

My apologies if this forum is not appropriate for this kind of question. In that case, I would ask for pointers on where I should go with this question.

Thank you.

One Answer

There is no comparison-based sorting algorithm faster than O(n log n); that lower bound is proven. So this is all theory.
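The O(n log n) bound applies only to algorithms that learn the order purely through comparisons. When the keys are small integers, non-comparison sorts already run in linear time; counting sort is the standard example. A minimal sketch, assuming non-negative integer keys strictly less than some bound k:

```python
def counting_sort(arr, k):
    """Sort non-negative integers < k in O(n + k) time, with no comparisons."""
    counts = [0] * k
    for x in arr:                       # tally each key
        counts[x] += 1
    out = []
    for value, c in enumerate(counts):  # emit each key in order, c times
        out.extend([value] * c)
    return out

print(counting_sort([4, 1, 3, 1, 0], 5))  # [0, 1, 1, 3, 4]
```

This is why "O(n) sorting" is not new for bounded integer keys; the open question in the post is about sorting arbitrary keys with O(1) extra space.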

There are plenty of situations where sorting an array takes O(n) time. These are special cases, but they happen a lot: an already-sorted array, for example, or a sorted array with only a few elements changed. And there are cases where sorting the whole array isn't needed at all. If I view my address book on screen, it appears sorted, but only the parts I'm actually looking at need to be sorted.
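The already-sorted case can be made concrete: insertion sort performs only n - 1 comparisons on a sorted array, and stays near-linear when only a few elements are out of place. A minimal sketch with a comparison counter added purely for illustration:

```python
def insertion_sort(arr):
    """Sort arr in place; return the number of comparisons performed."""
    comparisons = 0
    for i in range(1, len(arr)):
        j = i
        while j > 0:
            comparisons += 1
            if arr[j - 1] <= arr[j]:
                break                       # already in order: stop early
            arr[j - 1], arr[j] = arr[j], arr[j - 1]
            j -= 1
    return comparisons

print(insertion_sort(list(range(1000))))    # 999 comparisons on sorted input
```

On sorted input every pass stops after one comparison, so the cost is linear rather than n log n; real-world sorts such as Timsort exploit the same kind of pre-existing order.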

And log n isn't very large, so an O(n) algorithm isn't necessarily faster than, say, Quicksort for anything small enough to sort in practice; constant factors can dominate.
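To put "log n isn't very large" in numbers, the log factor grows so slowly that even enormous inputs pay only a modest multiplier. A quick illustration (the input sizes are arbitrary examples):

```python
import math

for n in (1_000, 1_000_000, 1_000_000_000, 500_000_000_000):
    print(f"n = {n:>15,}   log2(n) = {math.log2(n):5.1f}")
# Even at half a trillion items the log factor stays under 40, so a
# linear-time sort with worse constants can still lose in practice.
```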

Server farms rarely sort. Databases are indexed, and indexes are updated, no sorting needed.

The effect on society would be close to zero. Some people would be able to handle their computational needs for slightly less money; that's all the difference.

Answered by gnasher729 on December 19, 2021

