Technology

Computer Science: How Fast Do Algorithms Improve?

MIT scientists show how quickly algorithms are improving across a broad range of examples, demonstrating their critical importance to advances in computing.

Algorithms are sort of like a parent to a computer: they tell the computer how to make sense of information so it can turn that information into something useful.

The more efficient the algorithm, the less work the computer has to do. For all the technological progress in computing hardware, and the much-debated lifespan of Moore’s Law, hardware performance is only one side of the picture.

A second trend is happening behind the scenes: algorithms are being improved, so less computing power is needed. Although algorithmic efficiency gets less of the spotlight, you would certainly notice if your search engine suddenly slowed to a crawl, or if working through a large dataset felt like wading through sludge.
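
To make that idea concrete, here is a minimal sketch (not from the MIT study) of two algorithms solving the same problem, checking a list for duplicates, with very different amounts of work. The function names, inputs, and step counters are purely illustrative.

```python
# Illustrative only: two ways to answer "does this list contain a duplicate?"

def has_duplicate_naive(items):
    """Compare every pair of items: roughly n*(n-1)/2 comparisons."""
    comparisons = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            comparisons += 1
            if items[i] == items[j]:
                return True, comparisons
    return False, comparisons

def has_duplicate_with_set(items):
    """Remember seen items in a set: roughly n lookups."""
    lookups = 0
    seen = set()
    for item in items:
        lookups += 1
        if item in seen:
            return True, lookups
        seen.add(item)
    return False, lookups

data = list(range(2_000))  # no duplicates, so both algorithms do maximum work
print(has_duplicate_naive(data))     # (False, 1999000) -- about 2 million comparisons
print(has_duplicate_with_set(data))  # (False, 2000)    -- 2,000 lookups
```

Same problem, same answer, but the second algorithm asks the computer to do a tiny fraction of the work; that gap is what algorithmic improvement buys.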

MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) scientists were inspired to ask: How fast can algorithms improve?

The existing data on the question was largely anecdotal, consisting of case studies of individual algorithms that were assumed to be representative of the broader picture. Wanting stronger evidence, the team analyzed data from more than 1,100 research papers and 57 textbooks to trace the history of when algorithms got better. Some papers directly reported how good new algorithms were; others had to be reconstructed by the authors from pseudocode, the shorthand descriptions that capture an algorithm’s basic details.

In total, the team examined 113 “algorithm families,” sets of algorithms that solve the same problem and that computer science textbooks had highlighted as the most important. For each of the 113, the team reconstructed its history, tracking each time a new algorithm was proposed for the problem and making special note of the ones that were more efficient. Spanning from the 1940s to the present and varying widely in performance, the families contained an average of eight algorithms each, of which only a couple meaningfully improved efficiency. To share this database of knowledge, the team also created Algorithms-Wiki.org.

The scientists then charted how quickly these families improved, focusing on the algorithms’ most analyzed feature: how fast they could guarantee to solve the problem (in computer science terms, their “worst-case time complexity”). What emerged was enormous variability, but also important insights into how transformative algorithmic improvement has been for computer science.
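
As a rough illustration of what “worst-case time complexity” means (this is not the paper’s methodology, just an assumed example), consider searching a sorted list: linear search may need up to n steps in the worst case, while binary search needs only about log2(n).

```python
# Worst-case step counts for searching a sorted list of n items (illustrative).
import math

def linear_search_steps(n):
    # Worst case: the target is the last item (or absent), so all n items are checked.
    return n

def binary_search_steps(n):
    # Worst case: the search interval is halved until a single item remains.
    return math.ceil(math.log2(n)) if n > 1 else 1

for n in (1_000, 1_000_000, 1_000_000_000):
    print(n, linear_search_steps(n), binary_search_steps(n))
# At a billion items: roughly a billion steps versus about 30.
```

The worst-case figure is the guarantee an algorithm can make no matter what input it is handed, which is why the study used it to compare families over time.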

For large computing problems, 43 percent of algorithm families had year-on-year improvements that were equal to or greater than the much-touted gains from Moore’s Law. In 14 percent of problems, the performance gains from algorithms vastly outpaced those from improved hardware. The benefits were especially large for big-data problems, so the importance of algorithmic advances has grown in recent decades.

The biggest single change the authors observed came when an algorithm family transitioned from exponential to polynomial complexity. The effort required to solve an exponential problem is like a person trying to guess the combination on a lock. With a single 10-digit dial, the task is easy. With four dials, like a bicycle lock, it is hard enough that nobody steals your bike, yet still conceivable that you could try every combination. With 50 dials, it is nearly impossible; it would simply take too many steps. Exponentially complex problems are like that for computers: as they grow, they quickly outpace the machine’s ability to handle them. Finding a polynomial algorithm often solves that, making it possible to tackle problems in a way that no amount of hardware improvement can.
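
The lock analogy can be put in rough numbers. The sketch below is an illustration under assumed values, not anything from the paper: it counts the 10^d combinations of a d-dial lock (exponential growth) against an arbitrarily chosen polynomial cost of d^3 steps.

```python
# Exponential versus polynomial growth in the number of dials d (illustrative).
for dials in (1, 4, 10, 50):
    exponential_tries = 10 ** dials   # brute-forcing every combination
    polynomial_steps = dials ** 3     # an assumed polynomial cost, for comparison
    print(f"{dials:>2} dials: {exponential_tries:.3e} combinations vs {polynomial_steps} steps")

# 1 dial is trivial, 4 dials (a bike lock) is tedious but feasible,
# and 50 dials is hopeless: 10**50 combinations would never finish.
```

No realistic hardware speedup closes a gap like 10^50 versus 125,000, which is why a switch from an exponential to a polynomial algorithm is such a decisive improvement.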

Looking ahead, the researchers note that as the gains from Moore’s Law fade, computing users will increasingly need to turn to algorithms for performance improvements. The team says the findings confirm that, historically, the gains from algorithms have been enormous. But gains that come from algorithms rather than hardware look different: hardware improves smoothly over time under Moore’s Law, while algorithmic gains arrive in steps that are often large but infrequent.

Neil Thompson, an MIT researcher at CSAIL and the Sloan School of Management and senior author of the paper, says this is the first paper to show how quickly algorithms improve across a broad range of examples. “Through our analysis, we determined how many more tasks could be accomplished using the same amount of computing power after an algorithm was improved. As problems grow to billions and trillions of data points, algorithmic improvement becomes even more important than hardware improvement. In an age where computing’s environmental footprint is increasing, this is an effective way to improve businesses and other organizations.”

