Algorithms are fundamental concepts in computer science and programming in general. Even something as simple as an add function is considered an algorithm.

def add(num1, num2):
	return num1 + num2

An algorithm, written in Python, that returns the sum of two numbers.

Algorithms are defined as a set of instructions used to solve problems or perform tasks. In other words, anything that is a repeatable set of finite instructions that performs a task or solves a problem can be considered an algorithm.
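To make the definition concrete, here is a slightly larger example than `add`: a linear search, a finite and repeatable set of steps that solves the problem "find a value in a list." (This is just a sketch; the function name `linear_search` is my own choice.)

```python
def linear_search(items, target):
	# Check each element in order: a finite, repeatable set of instructions.
	for index, value in enumerate(items):
		if value == target:
			return index  # found the target, report its position
	return -1  # checked every element without finding it, so we stop
```

Each call follows the same instructions, always finishes, and solves a well-defined problem, which is exactly what the definition above describes.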

Before diving in here, develop a good foundation in data types and data structures.


Warning!

Algorithms, and the fundamentally useless and irrelevant LeetCode problems built around them, may seem like a huge part of programming and computer science.

From my limited experience, and from talking to genuinely experienced developers, many of these algorithms are rarely, if ever, used in the field.

However, algorithms are often repeated sets of instructions that you will apply again and again in development. Most importantly, understanding algorithms enhances your ability to reason about how computers actually work, and therefore improves your ability to solve problems.
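One example of such a repeated set of instructions that genuinely shows up in day-to-day work is deduplicating a list while preserving order. (A minimal sketch; the name `dedupe` is my own.)

```python
def dedupe(items):
	# Remove duplicates while keeping the first occurrence of each item,
	# a small algorithm that appears constantly in real codebases.
	seen = set()
	result = []
	for item in items:
		if item not in seen:
			seen.add(item)    # remember that we've encountered this value
			result.append(item)
	return result
```

Recognizing this as an algorithm, with a clear input, a clear output, and a finite set of steps, is what makes it easy to reuse and reason about.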

Learning algorithms will make you a better developer, even if you don’t use them in your daily work.