Posts

Showing posts from August, 2021


Objective Function Evaluation | Greedy Method | Knapsack Problem Example...

Knapsack Problem using the Greedy Method

Algorithm design techniques: Divide and Conquer, Greedy Method, Dynamic Programming, Backtracking, Branch and Bound.

Divide and Conquer: Many algorithms are recursive in structure: to solve a problem, they call themselves recursively one or more times. Divide-and-conquer algorithms follow three steps:
1.) Divide the problem into a number of sub-problems.
2.) Conquer the sub-problems by solving them recursively.
3.) Combine the solutions of the sub-problems into a solution for the original problem.

The greedy method is a straightforward design technique that can be applied to a wide variety of problems. We want to obtain a subset that satisfies the given constraints. Feasible solution: any subset that satisfies these constraints. Our GOAL: find a feasible solution that either maximizes or minimizes the given objective function. A feasible solution that does this is known as an OPTIMAL SOLUTION. A feasible solution is any subset that satisfies…
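As an illustration of the greedy method described above (my own sketch, not code from the post), here is a minimal Python implementation of the fractional knapsack: items are taken in decreasing profit/weight order. The instance used (profits 25, 24, 15; weights 18, 15, 10; capacity 20) is the classic textbook example.

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy method: take items in decreasing profit/weight ratio."""
    items = sorted(zip(profits, weights), key=lambda pw: pw[0] / pw[1], reverse=True)
    total = 0.0
    for p, w in items:
        if capacity <= 0:
            break
        take = min(w, capacity)       # take the whole item, or the fraction that fits
        total += p * (take / w)
        capacity -= take
    return total

# Classic instance: n = 3 items, knapsack capacity m = 20
print(fractional_knapsack([25, 24, 15], [18, 15, 10], 20))   # optimal value 31.5
```

The greedy choice (best profit-per-unit-weight first) is optimal for the fractional version of the problem; for 0/1 knapsack it is only a heuristic.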

Metaheuristic Optimization Algorithms

Optimization Engineering - Metaheuristic Optimization Algorithms

Optimization plays a very important role in science and engineering. Its aim is to find the minimum or maximum value of an objective function (cost function). Metaheuristic algorithms are used to solve complex problems in many fields, such as engineering, medicine, and computing, as well as real-life problems that cannot be solved using classical methods. Metaheuristic optimization algorithms are classified into two main categories: single-solution-based and population-based. Single-solution-based metaheuristics, also known as trajectory algorithms, work with a single solution in every iteration. Examples of single-solution-based metaheuristics: Tabu Search, Guided Local Search, Iterated Local Search, Stochastic Local Search, Variable Neighborhood…
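To make the single-solution ("trajectory") idea concrete, here is a minimal stochastic local search sketch in Python (my own illustration with assumed parameters, not code from the post): one candidate solution is perturbed each iteration and the move is kept only if it improves.

```python
import random

def stochastic_local_search(f, x0, step=0.5, iters=1000, seed=1):
    """Single-solution metaheuristic: one solution follows a trajectory through the search space."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)   # perturb the current solution
        fc = f(cand)
        if fc < fx:                           # accept only improving moves
            x, fx = cand, fc
    return x, fx

best_x, best_f = stochastic_local_search(lambda x: x * x, x0=4.0)
print(best_x, best_f)   # the trajectory drifts toward the minimum at x = 0
```

Population-based methods (PSO, GWO, firefly, etc.) differ in that they update a whole set of solutions per iteration instead of a single one.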

Water Cycle Algorithm Step-by-Step Explanation with Example ~xRay Pixy

Water Cycle Algorithm (WCA)

Many metaheuristic algorithms have been developed to solve constrained optimization problems because, according to the No Free Lunch theorem, no single algorithm can solve all real-world problems, and many real-life problems are complex in nature and hard to solve. The Water Cycle Algorithm is a nature-inspired metaheuristic, modeled on the water cycle process in nature. In this video, you will learn step by step how the Water Cycle Algorithm works, with examples and its mathematical model. WCA is a metaheuristic optimization method used to solve constraint-based problems and real-life engineering design problems. The water cycle is also known as the hydrological cycle: the continuous movement of water below and above the Earth's surface. Most precipitation occurs as…
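As a loose illustration only (not the full WCA, which also models rivers, evaporation, and raining), here is a heavily simplified Python sketch of the core idea: candidate solutions ("streams") flow toward the best solution (the "sea"), with an assumed flow coefficient C = 2 and a greedy acceptance rule added for clarity.

```python
import random

def water_cycle_step(streams, f, C=2.0, rng=random.Random(0)):
    """One simplified step: every stream flows toward the sea (the best solution)."""
    sea = min(streams, key=f)                         # best solution plays the role of the sea
    updated = []
    for x in streams:
        x_new = x + rng.random() * C * (sea - x)      # X(t+1) = X(t) + rand * C * (sea - X)
        updated.append(x_new if f(x_new) < f(x) else x)   # greedy: keep the better position
    return updated

f = lambda x: x * x
streams = [9.0, -7.0, 3.0, 5.0]
for _ in range(30):
    streams = water_cycle_step(streams, f)
print(min(f(s) for s in streams))   # best fitness only improves over iterations
```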

C++ Program to Calculate Area of Rectangle using Objects and Classes.

C++ Program to Calculate Area of Rectangle using Objects and Classes.

Calculate area of rectangle: Area = Length * Width

Source Code

#include <iostream>
using namespace std;

class rectangle {
private:
    int a, b;
public:
    void setdata(int x, int y) { a = x; b = y; }
    void area() { cout << "\n Area of Rectangle = " << a * b; }
};

int main() {
    rectangle r1, r2;        // objects
    r1.setdata(15, 40);      // object r1 calls setdata()
    cout << "\nFor First Rectangle ";
    r1.area();               // object r1 calls area()
    r2.setdata(30, 60);      // object r2 calls setdata()
    cout << "\nFor Second Rectangle ";
    r2.area();               // object r2 calls area()
    return 0;
}

Programming in C - Pointers

Programming in C Language: Pointers

Define Pointer. A pointer is a variable that stores a memory address. Like all other variables, it has a name, has to be declared, and occupies some space in memory.

Why is a pointer called a pointer? It is called a pointer because it points to a particular location in memory by storing the address of that location.

General syntax of a pointer declaration:

data-type *pointer_name;

Here, pointer_name is the name of the pointer variable; the asterisk (*) preceding the name informs the compiler that the variable is declared as a pointer; data-type is the base type of the pointer. For example:

int *iptr;
float *fptr;

Here iptr is a pointer that should point to a variable of type int. Pointers are also variables, so the compiler will reserve space for them, and they will also have addresses. All pointers, irrespective of their base type, occupy the same space in memory, since all of them contain only an address. On 16-bit systems, for example, 2 bytes are used to store addr…

Firefly Algorithm Step-by-Step with Numerical Example [PART - 2]

Firefly Algorithm

The firefly algorithm is a swarm-based metaheuristic algorithm introduced by Yang, used for solving optimization problems. In this video, you will learn the firefly algorithm with an example. The firefly algorithm is inspired by the FLASHING behavior of fireflies. For simplicity, the firefly optimization algorithm makes certain assumptions:
1.) Fireflies are attracted to each other.
2.) Attractiveness is proportional to BRIGHTNESS.
3.) The less bright firefly is attracted to the brighter firefly.
4.) Attractiveness decreases as the distance between two fireflies increases.
5.) If both have the same brightness, the fireflies move randomly.
6.) New solutions are generated by random walks and the attraction of fireflies.

Firefly Optimization Algorithm Steps:
1.) Initialize parameters.
2.) Initialize the population randomly in the search space.
3.) Compute fitness values and select the best solution.
4.) Check the stopping criteria: while Current Iteration = 1:Maximum Iteration…
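The attraction rule behind these assumptions can be sketched in Python. This is my own minimal 1-D illustration with assumed parameters beta0 = 1, gamma = 1, alpha = 0.2, using the standard update xi ← xi + beta0·e^(−gamma·r²)·(xj − xi) + alpha·(rand − 0.5):

```python
import math
import random

def firefly_step(pop, f, beta0=1.0, gamma=1.0, alpha=0.2, rng=random.Random(42)):
    """Move each firefly toward every brighter (better, for minimization) firefly."""
    new_pop = list(pop)
    for i, xi in enumerate(pop):
        for xj in pop:
            if f(xj) < f(xi):                             # firefly j is brighter
                r = abs(xi - xj)                          # distance between the two fireflies
                beta = beta0 * math.exp(-gamma * r * r)   # attractiveness decays with distance
                xi = xi + beta * (xj - xi) + alpha * (rng.random() - 0.5)
        new_pop[i] = xi
    return new_pop

f = lambda x: x * x
pop = [4.0, -3.0, 1.5, 2.0]
for _ in range(20):
    pop = firefly_step(pop, f)
print(min(f(x) for x in pop))   # the swarm's best fitness does not get worse
```

Note that the brightest firefly attracts no one brighter than itself, so it stays put; in the full algorithm it performs a random walk instead.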

Manta Ray Foraging Optimization (MRFO) Algorithm Example

Manta Ray Foraging Optimization (MRFO) Algorithm

Manta Ray Foraging Optimization (MRFO) Algorithm Example
Step 01: Initialize the population. Suppose Population Size = 4; Lower Bound = -10; Upper Bound = 10; Maximum Iteration = 4. Suppose the initial population is: 1.1, 2, 0.9, 3.
Step 02: Compute the fitness value of each individual using the fitness function. Fitness values: 1.21, 4, 0.81, 9.
Step 03: Obtain the best solution, i.e., the minimum fitness value in the current population: Best Solution = 0.81.
Step 04: Check the stopping criteria. While (Current < Maximum Iteration): 1 < 4 (true, so move to the next step). If the stopping criterion is met, stop and return the best cost.
Step 05: Update the position of each individual. For i = 1 to PopulationSize (i = 1:4): if (rand < 0.5) then Cyclone Foraging, else Chain Foraging. End if.
Step 06: Compute the fitness value of each individual and select the best individual.
Step 07: Perform Somersault Foraging.
Step 08: Compute the fitness value of each individual. End For. End While.
Step 0…
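The fitness and best-solution steps of this example can be reproduced directly. The sketch below assumes the fitness function is f(x) = x², which matches the values shown (1.1² = 1.21, 0.9² = 0.81, and so on); the excerpt itself does not name it.

```python
def fitness(x):
    return x * x   # assumed fitness function, consistent with the example's values

population = [1.1, 2, 0.9, 3]                          # Step 01: initial population
values = [round(fitness(x), 2) for x in population]    # Step 02: fitness of each individual
best = min(values)                                     # Step 03: best (minimum) fitness

print(values)   # [1.21, 4, 0.81, 9]
print(best)     # 0.81
```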

Grey Wolf Optimization Algorithm Numerical Example

Grey Wolf Optimization Algorithm Numerical Example

Grey Wolf Optimization Algorithm Steps
1.) Initialize the grey wolf population.
2.) Initialize a, A, and C.
3.) Calculate the fitness of each search agent.
4.) 𝑿_𝜶 = best search agent
5.) 𝑿_𝜷 = second-best search agent
6.) 𝑿_𝜹 = third-best search agent
7.) while (t < maximum number of iterations)
8.) for each search agent: update the position of the current search agent by the update equations; end for
9.) update a, A, and C
10.) calculate the fitness of all search agents
11.) update 𝑿_𝜶, 𝑿_𝜷, 𝑿_𝜹
12.) t = t + 1; end while
13.) return 𝑿_𝜶

Grey Wolf Optimization Algorithm Numerical Example
STEP 1. Initialize the grey wolf population [the initial position of each search agent] 𝒙_𝒊 (i = 1, 2, …, n), with n = 6 search agents and range [-100, 100]. Initial wolf positions: 3.2228, 4.1553, -3.8197, 4.2330, 1.3554, -4.1212.
STEP 2. Calculate the fitness of each search agent. Objective function: F6(x) = su…
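The update loop in steps 7-12 follows the standard GWO equations (A = 2a·r1 − a, C = 2·r2, D = |C·X_leader − X|, new position = average of the three leader-guided moves). Below is a 1-D Python sketch using the example's six initial positions; the objective f(x) = x² is an assumption, since the excerpt's F6 definition is cut off.

```python
import random

def gwo_step(wolves, f, a, rng=random.Random(7)):
    """One GWO iteration: each wolf moves toward alpha, beta, and delta (the three best)."""
    ranked = sorted(wolves, key=f)
    alpha, beta, delta = ranked[0], ranked[1], ranked[2]
    new_wolves = []
    for x in wolves:
        x_new = 0.0
        for leader in (alpha, beta, delta):
            A = 2 * a * rng.random() - a       # A = 2a*r1 - a
            C = 2 * rng.random()               # C = 2*r2
            D = abs(C * leader - x)            # D = |C*X_leader - X|
            x_new += (leader - A * D) / 3.0    # X(t+1) = (X1 + X2 + X3) / 3
        new_wolves.append(x_new)
    return new_wolves

f = lambda x: x * x                            # assumed objective (the post's F6 is cut off)
wolves = [3.2228, 4.1553, -3.8197, 4.2330, 1.3554, -4.1212]
best = min(wolves, key=f)                      # X_alpha found so far
max_iter = 20
for t in range(max_iter):
    a = 2 - 2 * t / max_iter                   # a decreases linearly from 2 toward 0
    wolves = gwo_step(wolves, f, a)
    best = min(wolves + [best], key=f)         # keep the best solution seen so far
print(best, f(best))
```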

Grey Wolf Optimization Algorithm

Grey Wolf Optimization Algorithm (GWO)

The Grey Wolf Optimization Algorithm is a metaheuristic proposed by Mirjalili, Mirjalili, and Lewis in 2014. The Grey Wolf Optimizer is inspired by the social hierarchy and the hunting technique of grey wolves.

What is a metaheuristic? A metaheuristic is a high-level, problem-independent algorithmic framework used to develop optimization algorithms. Metaheuristic algorithms search for the best solution among all possible solutions of an optimization problem.

Who are the grey wolves? The wolf, also known as the gray wolf or grey wolf, is a large canine that lives in highly organized packs. Its speed is 50-60 km/h and its lifespan is 6-8 years in the wild. Scientific name: Canis lupus. Family: Canidae (the biological family of dog-like carnivorans). The average pack size ranges from 5 to 12, with four different ranks of wolves in a pack: Alpha Wolf, Beta Wolf, Delta Wolf, and Omega Wolf. How the Grey Wolf Optimization Algorithm…

Invasive Weed Optimization (IWO) Algorithm Step-by-Step with Numerical E...

Invasive Weed Optimization (IWO) Algorithm with Example

Invasive Weed Optimization
The invasive weed optimization algorithm (IWO) is a population-based metaheuristic optimization method inspired by the behavior of weed colonies. Weeds are unwanted plants (plants in the wrong place). Weeds can change their behavior according to the environment and become fitter. Weed plants can easily be found in parks, fields, gardens, and lawns.

Invasive Weed Optimization Algorithm Steps
1.) Initialization phase: initialize all important parameters.
2.) Initialize the population: the initial population is created by spreading a finite number of seeds randomly in the search space.
3.) Compute fitness values: every seed will grow into a flowering plant and produce seeds [reproduction]. Seed production is based on fitness values, so compute the individual fitness value, the best fitness value, and the worst fitness value.
4.) Random distribution of germinated seeds: determine the new positions of the seeds in the search space. For R…
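The fitness-proportional reproduction in step 3 can be sketched as follows. This is a minimal Python illustration of IWO's usual linear seed-count rule, with assumed limits s_min = 0 and s_max = 5 and the fitness values chosen for the example:

```python
def seeds_produced(fit, best, worst, s_min=0, s_max=5):
    """IWO reproduction: a weed's seed count varies linearly with its fitness,
    from s_max for the best weed down to s_min for the worst (minimization)."""
    if best == worst:
        return s_max                              # all weeds equally fit
    ratio = (worst - fit) / (worst - best)        # 1.0 for the best weed, 0.0 for the worst
    return int(s_min + ratio * (s_max - s_min))

fitness_values = [0.81, 1.21, 4.0, 9.0]
best, worst = min(fitness_values), max(fitness_values)
print([seeds_produced(v, best, worst) for v in fitness_values])   # [5, 4, 3, 0]
```

Fitter weeds thus spread more seeds, which is what drives the colony toward better regions of the search space.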

PARTICLE SWARM OPTIMIZATION ALGORITHM NUMERICAL EXAMPLE

PARTICLE SWARM OPTIMIZATION ALGORITHM NUMERICAL EXAMPLE

PSO is a computational method that optimizes a problem. It is a population-based stochastic search algorithm, inspired by the social behavior of flocking birds. In Particle Swarm Optimization, candidate solutions are represented as particles [the flocking birds are replaced with particles for simplicity]. An objective function is used to evaluate the performance of each particle/agent in the current population. PSO solves problems by maintaining a population (called a swarm) of candidate solutions (particles). The local and global optimal solutions are used to update each particle's position in every iteration.

Particle Swarm Optimization (PSO) algorithm step-by-step explanation with a numerical example and source-code implementation - PART 2 [Example 2]
1.) Initialize the population [Current Iteration (t) = 0]: Population Size = 4; 𝑥𝑖 : (i = 1, 2, 3, 4) and (t = 0); 𝑥1 = 1.3; 𝑥2 = 4.3; 𝑥3 = 0.4; 𝑥4 = -1.2.
2.) Fitness function used: …
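The update rule described above can be sketched in Python using the example's initial population. The inertia and acceleration coefficients (w = 0.7, c1 = c2 = 1.5) and the fitness f(x) = x² are assumptions for illustration; the excerpt's own fitness function is cut off.

```python
import random

def pso(positions, f, w=0.7, c1=1.5, c2=1.5, iters=30, rng=random.Random(3)):
    """Standard PSO update: v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)."""
    velocities = [0.0] * len(positions)
    pbest = list(positions)                    # each particle's personal best position
    gbest = min(positions, key=f)              # global best position of the swarm
    for _ in range(iters):
        for i, x in enumerate(positions):
            r1, r2 = rng.random(), rng.random()
            velocities[i] = (w * velocities[i]
                             + c1 * r1 * (pbest[i] - x)
                             + c2 * r2 * (gbest - x))
            positions[i] = x + velocities[i]
            if f(positions[i]) < f(pbest[i]):  # update personal best
                pbest[i] = positions[i]
            if f(positions[i]) < f(gbest):     # update global best
                gbest = positions[i]
    return gbest

best = pso([1.3, 4.3, 0.4, -1.2], lambda x: x * x)   # the example's initial population
print(best)   # the swarm's best position, drawn toward the minimum at 0
```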

Firefly Optimization Algorithm

The firefly algorithm is a swarm-based metaheuristic algorithm introduced by Yang. The firefly algorithm is inspired by the FLASHING behavior of fireflies.

Assumptions
1.) Fireflies are attracted to each other.
2.) Attractiveness is proportional to BRIGHTNESS.
3.) The less bright firefly is attracted to the brighter firefly.
4.) Attractiveness decreases as the distance between two fireflies increases.
5.) If both have the same brightness, the fireflies move randomly.
6.) New solutions are generated by random walks and the attraction of fireflies.

Video Link: https://youtu.be/QvpEMR-Jp0U

Firefly Optimization Algorithm Steps
1.) Initialize parameters.
2.) Generate a population of n fireflies.
3.) Calculate the fitness value of each firefly.
4.) Check the stopping criteria (Current Iteration := 1 to Maximum Iteration).
5.) Update the position and light intensity of each firefly.
6.) Report the best solution.

Initialize parameters and the firefly swarm population: Population Size (n) = 20; Maximum Iteration (Maxt) = 50; Dimension (d) = 10; Upper Bound…
More posts