Soft Computing - Fuzzy Logic | Fuzzy Relations | DOM | FIS || Unit 1 || ...
Learn soft computing basics step by step with examples.
Video Chapters:
00:00 Introduction
00:09 Topics Covered
02:13 What is Fuzzy Logic?
07:06 What is Crisp Set?
08:03 What is the Degree of Membership?
09:05 Fuzzy Logic Components
10:46 Fuzzy Logic Operators
11:47 Fuzzy Relations
15:03 Fuzzy Relation Composition
17:28 Fuzzy Inference System
17:52 Defuzzification
What is a Crisp Set?
A "crisp set" or "crisp logic" refers to the traditional, classical set theory and logic where elements either belong to a set or do not, with no in-between or degrees of membership. In crisp logic, membership is binary—something is either a member of a set (true) or not (false).
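The binary nature of crisp membership can be sketched in a few lines of Python (the `crisp_membership` helper and the example set of even numbers are illustrative, not part of any standard library):

```python
def crisp_membership(x, s):
    """Crisp membership: 1 if x is in the set, otherwise 0 -- no in-between."""
    return 1 if x in s else 0

evens = {0, 2, 4, 6, 8}
print(crisp_membership(4, evens))  # 1: fully a member
print(crisp_membership(3, evens))  # 0: not a member at all
```

Notice that no value other than 0 or 1 is possible; this is exactly the restriction that fuzzy sets relax.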
What is Fuzzy Logic?
Fuzzy logic is a mathematical framework that deals with reasoning and decision-making in the presence of uncertainty and imprecision. Unlike classical (Boolean) logic, which is based on binary values (true or false, 0 or 1), fuzzy logic allows for the representation of partial truth or degrees of truth.
In fuzzy logic, truth values are expressed using the concept of "fuzziness" or degrees of membership in a set. Instead of crisp distinctions, fuzzy logic uses linguistic terms such as "very true," "mostly false," "partially true," etc., to describe the degree of truthfulness.
Fuzzy Sets: In classical set theory, an element either belongs to a set or does not. In fuzzy set theory, elements can belong to a set to a certain degree, expressed as a value between 0 and 1.
Membership Functions: These functions define the degree of membership of an element in a fuzzy set. They map elements to a value between 0 and 1, indicating the degree of membership.
Fuzzy Rules: These are if-then rules that express relationships between input and output variables. They use linguistic terms and fuzzy logic operators to make decisions.
Fuzzy Inference System (FIS): This is the overarching structure that encompasses fuzzy sets, membership functions, rules, and inference mechanisms. FIS processes input data and produces fuzzy output.
Defuzzification: The process of converting the fuzzy output into a crisp value for practical use.
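A common choice of membership function is the triangular one. The sketch below assumes a hypothetical fuzzy set "hot" peaking at 35 °C; the function shape is standard, but the specific boundaries (25, 35, 45) are made-up values for illustration:

```python
def triangular(x, a, b, c):
    """Triangular membership function: 0 at a, rising to 1 at the peak b,
    falling back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Degree to which 28 degrees C belongs to the fuzzy set "hot" (25, 35, 45)
print(triangular(28, 25, 35, 45))  # 0.3 -- partially hot
print(triangular(35, 25, 35, 45))  # 1.0 -- fully hot
```

Every input maps to a value between 0 and 1, which is precisely the degree of membership described above.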
Example: temperature control in an air conditioner.
In traditional logic, you might have a simple rule: if the temperature is above a certain point, turn the AC ON; otherwise, turn it OFF. It's a clear-cut decision.
In fuzzy logic, instead of strict rules like "ON" or "OFF," you might have rules like:
If it's very hot, increase cooling a lot.
If it's somewhat hot, increase cooling a bit.
If it's neither hot nor cold, maintain the current cooling level.
If it's somewhat cold, decrease cooling a bit.
If it's very cold, turn off the AC.
The terms like "very hot," "somewhat hot," etc., represent fuzzy sets. The degree to which it's "very hot" or "somewhat hot" is determined by fuzzy logic. It allows for a more nuanced and flexible approach to decision-making based on the imprecise nature of temperature descriptions.
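The AC rules above can be sketched as a tiny Sugeno-style controller. All the fuzzy set boundaries and crisp cooling levels here are invented for illustration; a real controller would tune them to the application:

```python
def grade_up(x, lo, hi):
    """Membership rising linearly from 0 at lo to 1 at hi."""
    return max(0.0, min(1.0, (x - lo) / (hi - lo)))

def grade_down(x, lo, hi):
    """Membership falling linearly from 1 at lo to 0 at hi."""
    return max(0.0, min(1.0, (hi - x) / (hi - lo)))

def ac_cooling(temp_c):
    """Each rule fires to a degree and votes for a crisp cooling level;
    the output is the firing-strength-weighted average (defuzzification)."""
    very_hot = grade_up(temp_c, 30, 40)
    somewhat_hot = min(grade_up(temp_c, 22, 28), grade_down(temp_c, 30, 36))
    comfortable = min(grade_up(temp_c, 18, 22), grade_down(temp_c, 24, 28))
    # (firing strength, cooling level) where 0 = off and 100 = max cooling
    rules = [(very_hot, 100), (somewhat_hot, 60), (comfortable, 30)]
    total = sum(w for w, _ in rules)
    return sum(w * out for w, out in rules) / total if total else 0.0

print(round(ac_cooling(38), 1))  # 100.0 -- only "very hot" fires
print(round(ac_cooling(20), 1))  # 30.0  -- only "comfortable" fires
```

At in-between temperatures several rules fire partially at once, and the output blends their cooling levels smoothly instead of snapping between ON and OFF.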
What is the Degree of Membership (DOM)?
The "degree of membership" is a measure used in fuzzy logic to express the extent to which an element belongs to a fuzzy set. In fuzzy logic, unlike classical set theory where an element is either a member (with a membership of 1) or not (with a membership of 0), the degree of membership allows for a more gradual and nuanced representation. The degree of membership is a value between 0 and 1, where 0 means the element does not belong to the fuzzy set at all, and 1 means it fully belongs. Values between 0 and 1 indicate partial membership, representing the degree to which the element is part of the fuzzy set.
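A quick numeric sketch of DOM, using a hypothetical fuzzy set "tall" with a linear ramp between 160 cm and 190 cm (the boundaries are assumptions chosen only to make the arithmetic clear):

```python
def dom_tall(height_cm):
    """Degree of membership in the fuzzy set 'tall': 0 below 160 cm,
    1 above 190 cm, and linear in between."""
    return max(0.0, min(1.0, (height_cm - 160) / 30))

print(dom_tall(150))  # 0.0 -- does not belong to 'tall' at all
print(dom_tall(175))  # 0.5 -- partial membership
print(dom_tall(195))  # 1.0 -- fully belongs to 'tall'
```

The middle case is the key point: a 175 cm person is neither simply "tall" nor "not tall" but tall to degree 0.5.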