
A novel Human Conception Optimizer for solving optimization problems

Scientific Reports volume 12, Article number: 21631 (2022)


Computational techniques are widely used to solve complex optimization problems in different fields such as engineering, finance, and biology. In this paper, the Human Conception Optimizer (HCO) is proposed as a novel metaheuristic algorithm for solving optimization problems. The idea of this algorithm is based on biological principles of the human conception process, such as the selective nature of the cervical gel in the female reproductive system, which allows only healthy sperm cells into the cervix; the guiding nature of the mucus gel, which helps sperm track a genital path towards the egg in the Fallopian tube; the asymmetric nature of flagellar movement, which allows sperm cells to move in the reproductive system; and the sperm hyperactivation process, which enables them to fertilize an egg. The strategies pursued by the sperm in searching for the egg in the Fallopian tube are modeled mathematically. The best sperm, which reaches the position of the egg, is the solution of the algorithm. The performance of the proposed HCO algorithm is examined on the IEEE CEC-2005 and IEEE CEC-2020 benchmark test function suites. A comparative study is also performed between the HCO algorithm and other available algorithms. The significance of the results is verified with statistical test methods. To validate the proposed HCO algorithm, two real-world engineering optimization problems are examined. For this purpose, a complex IEEE 8-bus distribution system with 14 over-current relays is considered. With the proposed algorithm, an improvement of 50% to 60% in total relay operating time is observed compared with some existing results for the same system. Another engineering problem, designing an optimal proportional integral derivative (PID) controller for a blower-driven patient-hose mechanical ventilator (MV), is also examined. A significant improvement in response time and settling time is observed in the MV system compared with existing results.

Optimization methods are numerical computational techniques for finding the optimal solution of real-world problems in diverse fields such as engineering, management, and finance1,2,3,4,5,6. Analytical optimization methods are complex and time-consuming when applied to complex optimization problems. Heuristic optimization methods, in turn, are problem-dependent techniques7: they rely on the particularities of an optimization problem and are often too greedy, getting trapped in local solutions. Meta-heuristic methods are problem independent. They can provide an acceptable solution without guaranteeing optimality8. A simple concept can easily be implemented as a metaheuristic algorithm to solve a complex problem quickly. Such algorithms can be applied to any optimization problem without altering the structure of the algorithm. In comparison with analytical optimization algorithms, a metaheuristic algorithm does not require derivative information to find the optimal solution. Thus, a real-world problem can be solved by a metaheuristic algorithm using only the input and output information of the system9. Therefore, researchers give priority to developing metaheuristic algorithms based on natural concepts such as evolution, the behaviour of natural creatures, and the hunting procedures of animals9,10,11.

Metaheuristic algorithms start by exploring new solutions and then transition to exploiting the best solution for a given problem11. In the exploitation phase, a new solution is produced based on the best solution available in the population. Thus, metaheuristic algorithms use exploration and exploitation to avoid local trapping and converge towards the optimal solution. Moreover, by striking a proper balance between the exploration and exploitation phases of such an algorithm, the local optimality problem of traditional methods can be avoided12,13.

In this paper, the Human Conception Optimizer (HCO) is proposed as a novel metaheuristic algorithm for solving optimization problems. The HCO algorithm is inspired by the biological principle of natural human conception. For successful natural conception, the fittest sperm of a male must fertilise a mature egg of the female. The movement of sperm along the genital tracking path towards the egg is a unique characteristic, and sperm also use a unique technique to avoid environmental obstacles in the female reproductive system14. In Ref.15, the authors proposed a sperm motility algorithm (SMA) using the principle of chemoattractant secreted by the ovum to guide sperm movement in the female reproductive system. In Ref.16, the authors proposed a Sperm Swarm Optimization (SSO) algorithm based on temperature guidance for sperm searching for the egg in the female reproductive system. In Ref.17, the authors proposed a multi-objective sperm fertilization procedure (MOSFP) as a modified form of SSO for multiobjective optimization problems. The general principles of natural conception have been used to develop novel algorithms15,16,17. In this paper, the conception process is modeled from the point of view of sperm movement towards the egg in the female reproductive system. The natural selection of healthy sperm allowed by the cervical gel to enter the cervix, their asymmetric trajectories during movement, the sperm guidance mechanism, and the flagellar movement characteristics together with the special hyperactivation principle of sperm during the fertilisation of a mature egg are utilised as the ideas of the proposed Human Conception Optimizer (HCO) algorithm. Such concepts are utilized for the first time to develop a metaheuristic algorithm. The efficiency of the proposed algorithm is validated with the standard IEEE CEC-2005 and CEC-2020 benchmark functions. A comparative study is also performed between the HCO algorithm and some existing algorithms on the benchmark functions. The statistical significance of the test results is studied with two non-parametric tests, the Friedman test and the Wilcoxon signed-rank test. The applicability of the HCO algorithm to engineering problems has been validated for two different cases. In the first case, the optimal coordination of over-current relays in a power distribution network is studied for an IEEE 8-bus system. In the second case, an optimal PID controller is designed for the human respiratory ventilation system.

In the literature, different analytical methods are found, such as Quadratic Programming (QP), Dynamic Programming (DP), and the Lagrangian method for optimization problems. All such methods are based on differential operators. They start searching for an optimal solution nearest to the initial point, and insufficient gradient data may lead them to a local solution. Thus, such methods have limited application to real-world, complex optimization problems. In this regard, metaheuristic methods are found to be better than analytical methods in the literature1,2,3. Three main types of metaheuristic algorithms are found in the literature: physics-based, swarm-intelligence-based, and evolutionary algorithms. In evolutionary metaheuristic algorithms, the laws of natural evolution are used12. The search begins with a randomly generated population in which the best solutions are combined and mutated to form new solutions. The genetic algorithm (GA)18 is an evolutionary algorithm based on the Darwinian evolution concept. Other evolutionary algorithms include Evolution Strategy (ES)19 and Genetic Programming (GP)20. Some metaheuristic algorithms are inspired by well-known physical laws of nature, such as Simulated Annealing (SA)21, the Gravitational Search Algorithm (GSA)22, Big-Bang Big-Crunch (BBBC)23, Atomic Orbital Search (AOS)24, and Charged System Search (CSS)25. Some swarm-based metaheuristic algorithms are inspired by the social behaviour of animals, such as Particle Swarm Optimization (PSO)6 and Ant Colony Optimization (ACO)26. A swarm-intelligence-based algorithm employs a large number of particles to cover a search space, and the optimum answer is discovered by following the best location along their pathways27. Particles with their best solutions and the best one obtained so far in the swarm are used to update the particle positions. Many other swarm-based algorithms are found in the literature, such as the Whale Optimization Algorithm (WOA)28, the Grey Wolf Optimization (GWO) algorithm5, the Sailfish Optimizer (SFO)29, and the Bottlenose Dolphin Optimizer30. Some human behaviour-based metaheuristic methods are also found in the literature, such as Teaching Learning Based Optimization (TLBO)31, the Group Search Optimizer (GSO)32, the Imperialist Competitive Algorithm (ICA)33, Class Topper Optimization (CTO)4, and Criminal Search Optimization34.

Generally, a metaheuristic algorithm starts with a random initialization of the search variables within a specified range. The convergence performance of such an algorithm depends on the correct selection of the initial values or positions of the search agents. An improper initial value, which may lie in a different direction from where the actual solution exists, can lead towards a wrong solution. Thus, the selection of the initial positions of the search variables has an impact on the convergence performance of a metaheuristic algorithm12,13. Another issue in metaheuristic algorithms is trapping in local solutions during the exploration and exploitation stages. Thus, the improper selection of initial values or positions of search agents and the local trapping problem of such metaheuristic methods need to be solved to obtain an efficient optimizer. In Ref.35, the authors presented the No Free Lunch (NFL) theorem and proved that there is no universal best optimization method, as all such methods perform similarly when averaged over all possible optimization problems. Therefore, many authors are involved in developing problem-specific optimizers with the aim of combining global and local search strategies. In this regard, an attempt has been made to solve such issues by developing the Human Conception Optimizer (HCO). The unique features of the human conception process justify the development of such an algorithm. The HCO algorithm addresses the issues stated above by resembling some special techniques of the conception process, which are discussed in the next section. A list of some existing optimization methods is presented in Table 1.

The core of this paper is to establish a nature-inspired optimizer named the Human Conception Optimizer (HCO). Some unique features of the human conception process are utilized to develop the algorithm for solving optimization problems. A method of generating a healthy population at the start of the HCO algorithm is modelled by replicating the concept of sperm selection by the cervical gel according to the fitness of each sperm. A probability function is defined for this purpose; it is formulated by considering the fraction of sperm (positions of solutions or search agents) that lies between the best and worst positions in the population. During the generation of the healthy population, the concept of the possible egg position in either ovary is also utilized. Thus, the initial population is based on the best combination of a randomly generated search variable and its opposite-directional counterpart (sperm position). In this way, the initial sperm positions (positions of solutions or search agents) in the healthy population are already oriented in the most promising direction, where the global solution may exist. Moreover, sperm oriented far away from the global solution, or in the opposite direction of the global solution, are ignored at the initial stage. Thus, the optimal solution can be searched within a healthy population with the possibility of reaching the best solution quickly. Therefore, the issue of random initialization of the positions of the search variables, which may be in different directions or far away from the global solution, is avoided in the HCO algorithm. A velocity profile, modelled on the Poiseuille flow of the cervical mucus, is used while updating the positions of the search variables (sperm cells); it balances exploration and exploitation based on the fitness of the best sperm cell (position of a search variable) and the average fitness of the sperm cells in an iteration. A hyperactivation function is also formulated by replicating the concept of flagellar oscillation during the hyperactivation stage of a sperm fertilizing an egg in the Fallopian tube. This function helps the algorithm escape from local optima.

In this section, the inspiration and the mathematical modeling of the Human Conception Optimizer are explained in detail.

Human conception happens when a healthy sperm cell meets the egg in the Fallopian tube54. The process begins with millions of sperm released into the female reproductive tract. All sperm cells compete to fertilize a single egg, as presented in Fig. 1a. In general, only a single sperm fertilizes the egg in the Fallopian tube. Among the millions of sperm, only a population of the most capable sperm can enter through the cervix. The cervical fluid, called mucus, helps the spermatozoa swim through the uterus and the Fallopian tube. The cervix filters out the seminal fluid that encloses the sperm cells released into the vagina. Sperm use a variety of mechanisms as they travel to the egg55,56. The way sperm meet the egg for successful fertilization is explored in detail below.

Human conception occurs when a sperm cell is able to meet a mature egg, interact, and fuse in the female reproductive system57. Initially, sperm take random positions in the vagina and stay inside the seminal fluid. According to their fitness, a swarm of the fittest sperm cells is able to enter the cervix. During their journey to the egg, sperm perform several outstanding navigational tasks. The sperm tail (flagellum) aids swimming towards the egg by creating an irregular, oscillating beat pattern, as shown in Fig. 1b. While balancing the moment of force caused by the flagellar motion, the cell head rotates and exerts force against the cervical fluid to move forward.

Human sperm movement. (a) Sperm cell movement in the female reproductive system. (b) Sperm cell beat pattern. (c) Egg position on either side of the Fallopian tube. (d) Sperm cell moving trajectory.

Sperm cells move with different hydrodynamic modes (such as typical, helical, hyperactivated, or chiral ribbons) depending on environmental conditions such as temperature and viscosity inside the female reproductive system58. Sperm can collect physical and chemical information to identify the egg in the female genital system with the help of mechanisms such as the following59,60:

Rheotaxis: sperm orient against the fluid flow to move upstream.

Thermotaxis: sperm sense temperature variations in the reproductive system and swim up the temperature gradient towards the warmer zone near the oviduct.

Chemotaxis: the movement of cells up a concentration gradient of chemoattractant; sperm move toward increasing chemical concentration.

Chemotaxis was suggested in the literature as an active sperm guidance mechanism61. Sperm can sense the change in liquid concentration in the uterus. In thermotaxis, sperm move toward a higher temperature in the female reproductive system. The contractions of mucus in the female reproductive zone may also guide the sperm towards the egg.

Active sperm use a stroke called hyperactivation to cross the barrier of cumulus cells surrounding the egg. Only a fraction of sperm is able to become hyperactive. The flagellar beats of hyperactivated sperm have high curvature and a wider amplitude, leading to highly active motility. Such a pattern of hyperactivity may create forces that facilitate sperm detachment and migration. The sperm then have to pass another barrier called the zona pellucida (a layer of the egg). The sperm cells undergo a process called the acrosome reaction, in which enzymes stored at the head of the sperm are released; these enzymes help break the zona pellucida barrier so that the egg can be fertilised62.

Among millions of sperm cells, only a single sperm cell is able to fertilize the mature egg in the challenging environment of the female reproductive system. The complete process is so challenging and unique that it motivates us to utilise the selection principles of the winning sperm to develop a nature-inspired metaheuristic algorithm. In the next section, the detailed modelling of the proposed algorithm is discussed.

In this section, the biological principles of human conception are mathematically presented to develop the HCO algorithm. Generally, a set of natural facts and assumptions are considered to formulate the HCO algorithm. The concept of HCO is summarized as follows:

After being released into the vagina, sperm cells enter the cervix, where their journey begins in a hostile environment. Only healthy sperm cells can enter the uterus and Fallopian tubes58 (Fig. 1a). In a fertile female, either the right or left ovary produces a mature egg for fertilization, as shown in Fig. 1c. The mucus fluid in the uterus helps sperm cells swim towards the egg63. This concept will be used to find a suitable initial fittest population from a randomly generated population of initial positions of sperm cells or search agents. During the evaluation of sperm fitness, the possible position of the mature egg (global solution) will be examined by considering the right ovary as the location in the positive direction of the search space where the egg (global solution) may be found, and the left ovary as the location in the negative direction of the search space where the egg (global solution) may be found. The mucus fluid dynamics will be used to model the velocity of the sperm cells (search agents) to update their positions during the exploration and exploitation stages of the proposed algorithm.

The tail of the sperm creates a jerking-like movement which helps the sperm move into the uterus. Sperm cells start following the curved path caused by the flagellar movement to reach the egg62. This concept will be used to model the sperm movement along a curved tracking path during the search procedure of the algorithm. At each iteration, the best position achieved by each sperm cell along the curved path will be evaluated and called the present best position or solution gained by each cell.

The tail of the sperm can sense the concentration of liquid in the reproductive system and, based on this, the sperm adjusts its position62. This sensing of the liquid concentration in the reproductive system will be utilised to mimic the position update of sperm with respect to the best position achieved by any sperm cell in the population up to the present iteration.

Sperm cells overcome the barrier across the egg by a hyperactivation process. They then have to pass another barrier called the zona pellucida. To pass this barrier, sperm must undergo the acrosome reaction, in which enzymes deposited at the head of the sperm cell break the zona pellucida barrier, allowing the sperm to penetrate the egg60. This concept will be used to overcome the local trapping problem of the algorithm.

The detailed modeling of the HCO algorithm is given below.

During intercourse, millions of sperm cells are discharged into the female reproductive system. All cells try to enter the cervix. The liquid inside the cervix allows only healthy cells to enter the cervical tracking path. Therefore, there is a natural selection of initial healthy sperm cells, where only fit cells can start the journey from the cervix towards the egg64. In HCO, each search agent resembles the position of a sperm cell. The performance of any swarm-based metaheuristic optimization method depends on the initialization of the population. In HCO, the initial positions of the sperm cells are generated randomly within the search space with a large population size. From this initial population, a fitter population is produced, which then follows the remaining steps of the proposed algorithm.

Step 1: initial population generation. Let \({N}\) sperm cells be ejaculated into the vagina during intercourse; this corresponds to the population size of the metaheuristic algorithm. The dimension of the population depends on the optimization problem. The positions of the sperm cells are the positions of the search agents in the HCO algorithm, and each particle in the search space is a candidate solution for a particular optimization problem.

Let the initial positions of the sperm cells (X) be defined as follows:

In HCO, the initial positions of sperm cells are determined randomly as follows:

where \({i=1,2,\ldots ,N}\) and \({j=1,2,\ldots ,d}\); N is the number of sperm cells or search agents inside the search space, d is the dimension of the problem, \({{r}_{1}}\) is a random number between 0 and 1, \({x_{i}^{j}}\) is the initial position of the \({i\text {th}}\) particle (sperm cell) in the \({j\text {th}}\) decision variable, and \({x_{{{i}_{\max }}}^{j}}\) and \({x_{{{i}_{\min }}}^{j}}\) are the maximum and minimum limits of the \({i\text {th}}\) sperm in the \({j\text {th}}\) decision variable.
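To make Step 1 concrete, a minimal Python sketch of the random initialization is given below. The experiments in this paper were carried out in LabVIEW, so the code and its function names are purely illustrative, assuming the usual uniform sampling between the per-dimension limits:

```python
import numpy as np

def initialize_population(N, d, x_min, x_max, seed=None):
    """Random initial sperm positions within the search bounds (a minimal sketch,
    assuming the standard form x = x_min + r1 * (x_max - x_min))."""
    rng = np.random.default_rng(seed)
    x_min = np.asarray(x_min, dtype=float)   # per-dimension lower limits
    x_max = np.asarray(x_max, dtype=float)   # per-dimension upper limits
    r1 = rng.random((N, d))                  # r1 in [0, 1), drawn per cell and dimension
    return x_min + r1 * (x_max - x_min)      # N x d matrix of candidate positions

# Example: 500 sperm cells in a 30-dimensional search space bounded by [-100, 100]
X = initialize_population(N=500, d=30, x_min=[-100.0] * 30, x_max=[100.0] * 30, seed=1)
```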

Step 2: initial fitness evaluation and modeling of the egg position in the ovary. In a fertile female, either the right or left ovary produces a mature egg for fertilization, as shown in Fig. 1c. The right ovary is considered the location in the positive direction of the search space where the egg (global solution) may exist, and the left ovary is considered the location in the negative direction of the search space where the egg (global solution) may exist64. This concept is used in the proposed algorithm to check the solution of an optimization problem on both sides of the search space. During the evaluation of a solution candidate x for an assigned problem, the opposite solution of x may provide a better solution \(x_{op}\). For example, if a solution x is − 10 and the optimal solution is 40, then the opposite solution (\(x_{op}\)) is 10; the distance of x from the optimal solution is 50, while the distance of \(x_{op}\) from the optimal solution is 30. As a result, according to Ref.64, the opposite solution \(x_{op}\) is much closer to the global solution.

The algorithm first examines the fitness of all randomly generated initial search agents. The fitness values of all initial sperm (sperm cells) are defined as follows:

where F(X) is the fitness matrix with the fitness values of all sperm cells.

Position of the opposite-directional solution. The population of opposite-directional solutions will be calculated as follows:

where a and b are the lower and upper boundaries of the search agent, respectively.

The fitness \(F({{X}_{oppo}})\) of the opposite-directional population \(({{X}_{oppo}})\) will be evaluated for an objective function based on the optimization problem.

Thus, the initial population based on egg position will be as follows:

In a fertile female, a mature egg is produced by either the right or left ovary for fertilisation every month during ovulation61. Typically, a single egg is released at a time; this concept is modelled as the single-objective HCO algorithm. In some cases, more than one egg may be released, sometimes resulting in the conception of multiples (twins); this concept leads to a multiobjective HCO algorithm. For simplicity, the present paper discusses the single-objective HCO algorithm. Twins may also be produced by fertilising a mature egg with two sperm cells; in the HCO algorithm, among two close solutions, the better one is selected and the twin solution is ignored.
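The opposition-based evaluation above can be sketched as follows; the opposite of a candidate x is taken as a + b − x (the standard opposition-based-learning form), and the better of each pair is kept. This is an illustrative sketch for a minimization objective, not the authors' LabVIEW implementation:

```python
import numpy as np

def opposition_based_population(X, a, b, objective):
    """For each candidate x, also evaluate its opposite a + b - x and keep the better one.

    A sketch assuming a minimization objective; 'objective' maps a position vector to a
    scalar fitness. The twin-solution handling described in the text is not reproduced here.
    """
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    X_oppo = a + b - X                                   # opposite-directional positions
    f = np.apply_along_axis(objective, 1, X)             # fitness of original positions
    f_oppo = np.apply_along_axis(objective, 1, X_oppo)   # fitness of opposite positions
    keep_original = f <= f_oppo                          # smaller objective value wins
    X_selected = np.where(keep_original[:, None], X, X_oppo)
    return X_selected, np.minimum(f, f_oppo)

# Example stand-in objective: the sphere function
sphere = lambda x: float(np.sum(x ** 2))
```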

Step 3: selection of the healthy population. In the natural fertilization process, only healthy sperm cells can enter the cervix to fertilize a mature egg. In the HCO algorithm, the initial population size is taken as large as possible, and from it an initial fittest population is selected according to a probability function. This fittest population is allowed to follow the further steps of the proposed algorithm.

The best solution is assigned as the initial best solution (fittest sperm cell), and the worst solution is also identified. The fitness of the other sperm cells will be compared with the fitness of the initial best through a probability threshold \((P_{fit})\). The probability of selecting the best population to move toward one of the best solutions is then calculated as follows:

where w is a weight factor.

Therefore, the healthy population will be chosen as:

Thus,

where \({{\chi _{healthy_i}}}\) is the position of the \({i\text{th}}\) healthy sperm and n is the size of the fit population.

The fitness of the initial fit population, for an objective function depending on the optimization problem, will be as follows:

where \({F({{\chi }_{healthy}})}\) is the fitness matrix with the fitness values of all healthy sperm cells.

The healthy or fit population will be used as the fittest initial population to search for the best solution for an optimization problem. In HCO, this step to find the fittest population from the initial randomly generated population will be done only once.
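A sketch of the healthy-population selection is given below. Since Eq. (7) is not reproduced in this text, the threshold \(P_{fit}\) is assumed here to be a weighted point between the best and worst fitness values controlled by the weight factor w; treat this form as an assumption, not the paper's exact expression:

```python
import numpy as np

def select_healthy_population(X, f, w=0.5):
    """Keep only candidates whose fitness passes a threshold between the best and worst.

    ASSUMED threshold (Eq. (7) not reproduced here): P_fit = f_best + w * (f_worst - f_best),
    i.e. a weighted point between the best and worst fitness (minimization). Candidates
    with f <= P_fit are kept as the healthy population.
    """
    f = np.asarray(f, dtype=float)
    f_best, f_worst = f.min(), f.max()
    p_fit = f_best + w * (f_worst - f_best)   # assumed form of the selection threshold
    mask = f <= p_fit                         # healthy sperm: at least as fit as the threshold
    return X[mask], f[mask]
```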

Algorithm 1: Pseudo-code of HCO for generation of initial healthy population

Input: Set population size of sperm position, other constants.

\(/*\) Generate initial random particle \(*/\)

Generate initial population for each variable randomly within a range of search space by using (2).

\(/*\) Evaluate fitness \(*/\)

Evaluate the fitness \({f(x_i)}\) of each particle (\({x_i}\)) for each variable with an objective function for an optimization problem. Calculate the fitness \({f(x_{i_{oppo}})}\) of the opposite-directional sperm \({x_{i_{oppo}}}\).

if \({{f(x_i)}>{f(x_{i_{oppo}})}}\)

Select \({x_i}\)

else

Select \({x_{{i}_{oppo}}}\)

end if

\(/*\) Select initial best and worst particle \(*/\)

Find the best fitness \({f_{best}(x)}\) and worst fitness (\({f_{worst}(x)}\)) from the fitness matrix (10).

Derive the probability function using (7).

if \({f({{\chi }_{i}})\le {{P}_{fit}}}\)

Update fit population using (8).

else discard and check for next healthy sperm.

end if

Output: Initial healthy population

Sperm orientation can be treated as a replica of particle orientation. Some sperm may be oriented towards the global solution and some may be alongside the boundary of the search space; some may even be in the opposite direction of the global solution. In HCO, the initial fittest population is chosen with sperm (positions of search agents) oriented towards the egg (best initial solution).

The male reproductive cell, the sperm, has a single flagellum or tail. To achieve fertilization, sperm need to move up the oviduct. The sperm's tail produces a distinctive, jerky motion that pushes the head of the sperm backward and sideways while simultaneously propelling the sperm forward. The cells migrate through the fluid in the cervix by moving backwards and sideways. This combination of actions aids the sperm cell in its journey toward the egg. Sperm cannot swim backwards due to the nature of flagellar movement. The moving trajectory of the sperm cell is shown in Fig. 1d.

Human sperm use various sensing mechanisms to gather physical or chemical signals to locate the egg. During the fertilisation process, sperm cells move along the narrow cervical tracking path towards the oviduct. Mucus in the cervix helps sperm move through the uterus and oviducts62. There are three types of sperm swimming guidance mechanisms: thermotaxis (based on a temperature gradient), rheotaxis (swimming against a fluid flow), and chemotaxis (based on a chemoattractant concentration gradient)55. Sperm cells move against the mucus flow, which is the rheotaxis mechanism, and they infer the egg position from the change in liquid concentration near the egg. In HCO, the rheotaxis mechanism of sperm guidance towards the egg is used to find the velocity of sperm in the fluid against the flow. The asymmetric flagellar movement is modelled as a sinusoidal curved path in the HCO algorithm.

Human spermatozoa can sense a flow of liquid and change the direction of their path against the flow; they perform positive rheotaxis, orienting themselves against an oncoming flow. The mucus flow (and likewise the flow experienced by a sperm cell in the fluid) can be described by the Poiseuille profile, in which the speed increases quadratically with the distance from the compartment boundary. The Poiseuille profile is used to find the speed of the sperm cells: it describes how fast the sperm cells are moving at each point within the uterus65,66.

In HCO, the Poiseuille velocity profile is used to model the velocity of sperm to update their position. The Poiseuille velocity profile for sperm movement in the female reproduction tracking path is shown in Fig. 2b. To model the Poiseuille velocity tracking profile, the fitness matrix (10) will be used.

Sperm velocity profile: (a) a section of tube of radius (a), Velocity of fluid at a distance r from the center of the tube. (b) Sperm velocity profile based on.

The velocity profile gives the amplitude of the velocity according to the position of a particle in a fluid. According to the Poiseuille velocity profile, the velocity at a point at a specific radius (r) in the fluid can be calculated from the distance of that point from the centre of the tube, as graphically shown in Fig. 2a. At the specific radius (r), the velocity is formulated as65,66:

where P is the pressure difference, L is the length of a pipe of radius a, and \({\eta }\) is the dynamic viscosity.

In HCO, the sperm fitness is used to mimic the velocity profile. The velocity of a sperm in the current iteration is calculated by taking the sperm's current position \({{{\chi }}_{i}}\) in the healthy population and multiplying it by its fitness \({f\left( {{{\chi }}_{i}} \right) }\). The centre of the flow corresponds to the average position of the sperm, with a fitness of \({f({\chi }_{avg})}\). The fitness of the present global best position is \({f({\chi }_{best})}\).

Steps to mimic Poiseuille velocity tracking profile for sperm's velocity modeling:

Assign the initial best fitness value of a sperm cell for the given optimization problem, with the fitness function in an iteration, as \({f({\chi }_{best})}\).

Calculate the average fitness \({f({\chi }_{avg})}\).

Calculate the velocity of the \({i\text{th}}\) sperm cell with the fitness value \({f({{\chi }}_{i})}\) as follows:

where \({R=f\left( {{{\chi }}_{best}} \right) -f\left( {{{\chi }}_{i}} \right) }\), \({r=f\left( {{{\chi }}_{avg}} \right) -f\left( {{{\chi }}_{i}} \right) }\), \({L=f\left( {{{\chi }}_{best}} \right) -f\left( {{{\chi }}_{avg}} \right) }\), \({{\nu }_i}\) is the velocity of the \({i\text{th}}\) sperm cell, \({f\left( {{{\chi }}_{avg}} \right) }\) is the average health of the population, \({f\left( {{{\chi }}_{best}}\right) }\) is the health of the best solution (optimal position), \({\eta }\) is a constant generated as a random value in the range 0 to 1, and \({\gamma }\) is a random number between 0 and 1.

The vector diagram of velocity profile of sperm cells is also shown graphically in Fig. 2b.
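The following sketch illustrates one way to realize the fitness-based Poiseuille velocity described above. Eq. (12) is not reproduced in this text, so the sketch assumes the direct analogue of the Poiseuille formula, \(v_i = \gamma (R^2 - r^2)/(4 \eta L)\), built from the quantities R, r, and L defined above; this assumed form is illustrative only:

```python
import numpy as np

def poiseuille_velocity(f_i, f_best, f_avg, rng):
    """Fitness-based velocity magnitude mimicking a Poiseuille flow profile.

    ASSUMED form (Eq. (12) not reproduced here): v_i = gamma * (R^2 - r^2) / (4 * eta * L),
    following the Poiseuille expression v = P * (a^2 - r^2) / (4 * eta * L).
    """
    R = f_best - f_i            # analogue of the tube radius
    r = f_avg - f_i             # analogue of the distance from the tube centre
    L = f_best - f_avg          # analogue of the tube length
    eta = rng.random()          # dynamic-viscosity analogue, random in [0, 1)
    gamma = rng.random()        # pressure-difference analogue, random in [0, 1)
    denom = 4.0 * eta * L
    return gamma * (R ** 2 - r ** 2) / denom if denom != 0.0 else 0.0
```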

After entering the cervix, the sperm cells acquire an initial velocity in the cervical fluid. In HCO, the initial sperm velocity is modelled according to the Poiseuille velocity tracking profile, as presented in Fig. 2b. The position of a sperm cell in the current iteration is compared with its previous position, and the better one is assigned as the present best solution \((S_{p_{best}})\) for that sperm cell. In the healthy population, one sperm cell achieves the best position among all in an iteration and is treated as the global best solution \((S_{g_{best}})\) in that iteration. The sperm cells will move along a sinusoidal path with the updated velocity, resembling the curved nature of sperm movement.

In the search space the velocity of sperm will be updated as follows:

where \({{A_1}} = {(S_{p_{best}}-S_i)}\), \({{A_2}} = {(S_{g_{best}}-S_i)}\), and \({C_1}\) and \({C_2}\) are constants.

Along the curvature path, the position of sperm will be updated in HCO as follows:

where \({{{\vec {{\chi }}}_{i}}(j)}\) is the position of the \({i\text{th}}\) sperm at the \({j\text{th}}\) iteration and \({{{\vec {V}}_{i}}(j)}\) is the velocity of the \({i\text{th}}\) sperm at the \({j\text{th}}\) iteration.
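A sketch of the velocity and position update is given below. Since the exact forms of Eqs. (13) and (14) are not reproduced in this text, the sketch assumes a PSO-like velocity update built from A1 and A2 and a sinusoidal modulation of the step to mimic the curved swimming path; both assumptions are illustrative only:

```python
import numpy as np

def update_sperm(position, velocity, s_pbest, s_gbest, iteration, c1=1.5, c2=1.5):
    """One sperm-position update along an assumed curved (sinusoidal) tracking path.

    ASSUMED forms (Eqs. (13)-(14) not reproduced here): V <- V + C1*A1 + C2*A2 with
    A1 = S_pbest - S and A2 = S_gbest - S, and a sine factor on the step to mimic
    the flagellar curvature.
    """
    a1 = s_pbest - position                      # pull toward the cell's own best position
    a2 = s_gbest - position                      # pull toward the global best position
    velocity = velocity + c1 * a1 + c2 * a2
    step = velocity * np.sin(iteration + 1.0)    # assumed curvature modulation
    return position + step, velocity
```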

Algorithm 2: Pseudo-code of HCO for update sperm position

Input: Healthy population of initial sperm positions, define other constants

\(/*\) Generate initial healthy population of sperm positions \(*/\)

Generate initial healthy population of sperm position for each variable according to Algorithm 1.

\(/*\) Evaluate fitness function \(*/\)

Evaluate the fitness \({f({\chi }_i)}\) of each sperm (\({{\chi }_i}\)) for each variable with an objective function for an optimization problem.

Identify the average fitness of the sperm (\({f({\chi }_{avg})}\)) in the population and the fitness of the best sperm (\({f({\chi }_{best})}\)).

Identify the best sperm (global solution) \((S_{g_{best}})\) achieved at the present iteration. Also, identify the current best position of each sperm \((S_{p_{best}})\) at the current iteration.

\(/*\) Evalute velocity of sperm \(*/\)

Evaluate velocity of each sperm in the healthy population using (12).

\(/*\) Update velocity of sperm \(*/\)

Update velocity of sperm using (13).

\(/*\) Update position of sperm \(*/\)

Update the position of each sperm using (14).

Repeat steps 5 to 13 until the termination criterion is reached or the maximum number of iterations is completed.

Output: Fittest sperm or global solution.

The flowchart of the proposed algorithm is presented in Fig. 3.

In human conception, sperm cells confront an obstacle of cumulus cells around the egg. Before reaching the egg, the sperm cells are often trapped in epithelial cells in the Fallopian tube and are rendered inert unless they undergo hyperactivation67. To cross this cumulus barrier, the sperm cells must use a very specific stroke called hyperactivation. It is characterised by an asymmetrical flagellar beat pattern which gives rise to a whip-like movement of the flagellum that can produce circular, figure-eight swimming trajectories. The change in motion and force of the tail movement along this trajectory enables the sperm to escape from the epithelium67.

In HCO, the concept of the sperm hyperactivation process is adopted when the best solution is found to be stuck at a position for a long time before the termination criterion is reached. The position of the hyperactivated particle is compared with the best solution achieved before the hyperactivation process, and among the hyperactivated and non-hyperactivated solutions, the better one is assigned as the current global solution for the population. To model the hyperactivation process, figure-eight (8) shaped beat patterns are used. The new position of the best hyperactivated particle is modeled as follows:

where \({{x}_{globalbest}(j)}\) is the global best solution at the \({j\text{th}}\) iteration and \({x_{hyperactivated}(j)}\) is the hyperactivated best solution at the \({j\text {th}}\) iteration. Hyperactivation is used only when the global best solution gets stuck at the same position for more than two iterations.
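The hyperactivation step can be sketched as follows. Eq. (15) is not reproduced in this text, so the figure-eight beat pattern is assumed here to be a lemniscate-like perturbation of the stagnant global best with a random phase; the calling code would then keep the better of the hyperactivated and non-hyperactivated solutions, as described above:

```python
import numpy as np

def hyperactivate(x_globalbest, stagnation_count, amplitude=0.1, rng=None):
    """Figure-eight style perturbation of a stagnant global best solution.

    ASSUMED form (Eq. (15) not reproduced here): an element-wise lemniscate-like
    perturbation x_hyper = x_gbest + amplitude * sin(2*t) * cos(t) with a random
    phase t, triggered only after the global best has been unchanged for more
    than two iterations.
    """
    if stagnation_count <= 2:
        return x_globalbest                       # no hyperactivation needed yet
    rng = np.random.default_rng() if rng is None else rng
    t = rng.uniform(0.0, 2.0 * np.pi, size=x_globalbest.shape)
    return x_globalbest + amplitude * np.sin(2.0 * t) * np.cos(t)
```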

Flowchart of proposed HCO algorithm.

HCO provides some advantages that make it distinct from other algorithms. Some special features are as follows:

Concept of a healthy initial population. The HCO algorithm replicates the concept of sperm selection by the female reproductive system to allow them into the cervix, together with the position of the egg in either of the ovaries. The initial population in the HCO algorithm is not assigned directly from randomly generated initial sperm within a search space. Instead, a healthy population is generated at the initial stage by neglecting the sperm in the population that are oriented far from the optimal position. Using the concept of the egg position in the right or left ovary, the fitness of every randomly generated sperm is evaluated along with the fitness of its opposite-directional sperm in the search space. Thus, the healthy population is based on the best possible solution in the positive or negative direction of the search space. The healthy population ensures the best initial fit population, within which the optimal solution will be found by the algorithm. It includes initial sperm positions selected according to their initial fitness and the more promising side of the mature egg (global solution), determined by checking each sperm position and its opposite-directional position.

Velocity update based on the Poiseuille velocity profile. During the update of the velocity of the sperm cells, a position-based velocity profile, the Poiseuille velocity profile, is used. The advantage of using such a velocity profile in the HCO algorithm is that the velocity of each sperm or search variable in an iteration is calculated from the fitness value of the best position of a sperm or search variable in that iteration along with the average fitness of the population. Therefore, a good balance can be maintained between the exploration and exploitation stages of the algorithm.

Hyperactivation for local optima avoidance. Like the sperm's hyperactivation process to fertilize the egg, a hyperactivation function is used in the HCO algorithm to avoid local trapping problems.

A metaheuristic algorithm must have certain capabilities to solve complex optimization problems. An optimizer must exhibit a good balance between exploration and exploitation, local optima avoidance, and smooth convergence. To assess the HCO algorithm, two suites of test functions are used in this study: 23 classical test functions from the CEC 2005 special session and ten 30- and 500-dimensional benchmark functions from CEC-202024.

In this section, the response of the HCO algorithm is verified with the CEC 2005 BMFs24. These functions are minimization functions and can be grouped as unimodal, multimodal, and fixed-dimensional multimodal. The details of these BMFs can be found in the CEC 2005 technical report24,27. The termination of the algorithm is set at a fixed number of iterations. The other parameters for the HCO algorithm are presented in Table 2. The LabVIEW©2015 platform is used for simulating the algorithm. The algorithm is executed several times for each reference function. After several tests, the average and standard deviation (SD) of each BMF are examined. The convergence performance for each BMF with the HCO method is compared with PSO6, CTO4, GWO5, WOA28, and SFO (Sailfish Optimizer)29.

For each BMF, the population size is 500 and the maximum number of iterations is 500. Other constants of the HCO algorithm are tabulated in Table 2. The HCO algorithm is executed 30 times, with 500 iterations for each function. For analysis purposes, the average and SD of the objective values are examined. The output of the BMFs is presented in Table 3. There is a single global optimum point for the unimodal functions (\({F_1}\) to \({F_7}\))27. From Table 3, it can be seen that HCO performed better than CTO, PSO, and SFO for \({F_1}\), \({F_2}\), \({F_6}\), and \({F_7}\). For function \({F_4}\), HCO is better than CTO and PSO. For function \({F_5}\), HCO is better than SFO and PSO. For function \({F_6}\), HCO is better than CTO and SFO. For function \({F_3}\), GWO is better than HCO. For function \({F_6}\), HCO is better than GWO, as shown in Fig. 4. For functions \({F_3}\) and \({F_6}\), HCO is better than WOA, as shown in Fig. 4.
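The experimental protocol above (independent runs followed by the average and SD of the best objective values) can be summarized with a small harness such as the one below. The actual experiments were run in LabVIEW; the random-search placeholder stands in for the HCO implementation only so that the harness is runnable:

```python
import numpy as np

def random_search(objective, bounds, pop_size=500, iters=500, dim=30, seed=0):
    """Placeholder optimizer standing in for the (LabVIEW-based) HCO implementation,
    used here only so the statistics harness below is self-contained and runnable."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    best = np.inf
    for _ in range(iters):
        X = rng.uniform(lo, hi, size=(pop_size, dim))
        best = min(best, float(min(objective(x) for x in X)))
    return best

def benchmark_statistics(optimizer, objective, bounds, runs=30):
    """Average and standard deviation of the best objective value over independent runs."""
    best_values = np.array([optimizer(objective, bounds, seed=run) for run in range(runs)])
    return best_values.mean(), best_values.std()

sphere = lambda x: float(np.sum(x ** 2))   # unimodal F1-style test function
# mean_f1, sd_f1 = benchmark_statistics(random_search, sphere, bounds=(-100.0, 100.0))
```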

The multimodal functions \({F_8}\) to \({F_{13}}\)27 exhibit multiple local optima. The exploration capability of an optimization method can be verified with multimodal functions. From Table 3, HCO performed better than PSO, CTO, and SFO for \({F_{11}}\) to \({F_{13}}\). For the other multimodal functions, HCO also performed better than PSO, CTO, and SFO. For function \({F_8}\), the minimum value specified by CEC 200524 is found by the HCO algorithm, whereas the others are not able to find it. For function \({F_9}\), HCO is better than CTO and SFO. For some fixed-dimensional functions such as \({F_{19}}\) to \({F_{22}}\)27, HCO is superior to SFO, CTO, and PSO. The convergence curves of the HCO algorithm for some BMFs, namely \({F_{3}}\), \({F_{6}}\), \({F_{9}}\), \({F_{13}}\), \({F_{14}}\), and \({F_{17}}\), are shown in Fig. 4. For functions \({F_9}\) and \({F_{13}}\), GWO is better than HCO, as shown in Fig. 4, while HCO is better than WOA for the same functions.

Comparison of convergence performance of HCO for CEC2005.

The HCO algorithm is verified with the CEC-2020 BMFs to explore its performance in terms of exploration, exploitation, convergence, and local optima avoidance. This suite includes unimodal, multimodal, hybrid, and composite functions to validate the proposed algorithm. Each function has been tested under two conditions, as a 30-variable optimization problem and as a 500-variable problem, and simulated 20 times on the LabVIEW©2015 platform. The results in terms of average value and standard deviation are computed after 20 test runs for each BMF. The achievements of the HCO algorithm for each BMF are compared with some existing methods reported in the literature, such as PSO6 and CTO4. For simulation purposes, each function is tested with a population of 50 sperm and run for 500 iterations for both the 30-dimensional and 500-dimensional problems. The convergence performance of HCO for some selected 30-dimensional CEC-2020 benchmark functions is shown in Fig. 5. In most cases of the 30-dimensional CEC-2020 benchmark functions, HCO performed better than the available algorithms, as tabulated in Table 4. The convergence performance of HCO for some selected 500-dimensional CEC-2020 benchmark functions is shown in Fig. 6. In most cases of the 500-dimensional CEC-2020 benchmark functions, HCO performed better than the available algorithms, as tabulated in Table 5.

Convergence performance of HCO for 30D CEC2020 benchmark function: (a) Convergence graph for \({F_3}\) benchmark function. (b) Convergence graph for \({F_6}\) benchmark function. (c) Convergence graph for \({F_9}\) benchmark function. (d) Convergence graph for \({F_{10}}\) benchmark function.

Convergence performance of HCO for 500D CEC2020 benchmark function: (a) Convergence graph for \({F_3}\) benchmark function. (b) Convergence graph for \({F_6}\) benchmark function. (c) Convergence graph for \({F_9}\) benchmark function. (d) Convergence graph for \({F_{10}}\) benchmark function.

Non-parametric tests are also used to assess the HCO algorithm on the CEC 2020 benchmark functions. Such tests have a wide variety of applications in the literature27. In this paper, the Friedman test and the Wilcoxon signed-rank test are carried out for the CEC 2020 benchmark functions, as presented in Tables 6 and 7, respectively. The details of these methods are provided in Ref.27. In the Wilcoxon signed-rank test, \({r^{+}}\) is the sum of ranks for which the first method is better than the second method, and \({r^{-}}\) is the sum for the opposite condition, at a significance level of \({\alpha = 0.05}\).
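For reference, a small Python sketch of the Wilcoxon signed-rank comparison reported in Table 7 is shown below, computing \(r^{+}\), \(r^{-}\), and the p-value with SciPy for two sets of paired results (minimization assumed; function names are illustrative):

```python
import numpy as np
from scipy.stats import rankdata, wilcoxon

def signed_rank_summary(results_a, results_b, alpha=0.05):
    """Wilcoxon signed-rank comparison of two algorithms over the same benchmark runs.

    r_plus collects the ranks of runs where the first method is better (smaller
    objective), r_minus the opposite; significance is judged at alpha = 0.05.
    """
    d = np.asarray(results_b, dtype=float) - np.asarray(results_a, dtype=float)
    d = d[d != 0]                              # zero differences are discarded
    ranks = rankdata(np.abs(d))                # ranks of the absolute differences
    r_plus = ranks[d > 0].sum()                # first method better on these runs
    r_minus = ranks[d < 0].sum()               # second method better on these runs
    _, p_value = wilcoxon(results_a, results_b)
    return r_plus, r_minus, p_value, p_value < alpha
```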

In this section, the performance of the proposed HCO algorithm is validated on two constrained engineering optimization problems. In the first case, an over-current relay coordination optimization problem is chosen, in which HCO is used to obtain an optimal setting of the over-current relays used in the protection scheme of a power distribution network. In the second case, an optimal PID controller is designed for the human respiratory ventilation system, where the proposed HCO algorithm is used to tune the PID controller for a blower-type ventilator model.

In this optimization problem, optimal coordination is established among over-current relays used in a power distribution network to supply uninterrupted power. To validate the proposed HCO algorithm for such a real-world engineering problem, an IEEE 8-bus distribution network with 14 over-current relays is considered. It involves 20 selectivity constraints at three-phase fault. The relay time dial setting (TDS) ranges between 0.1 and 1.1 s, the coordination time interval is 0.3 s, and the plug setting (PS) ranges between 0.5 and 2.5. The other specifications, such as the current transformer ratios and the short-circuit currents at fault locations, are taken from Ref.68. To obtain the optimal operating time of the protective relay system, the PS and TDS values are optimized. The test system is executed with HCO on the LabVIEW©2015 platform. The convergence performance for the relay coordination problem is shown in Fig. 7, where it can be observed that the algorithm started with a better initial value and converged fast towards a better solution compared to the others. The optimal relay settings found by the HCO algorithm for the IEEE 8-bus distribution system are presented in Table 8. A comparative analysis of the proposed method for the same system as discussed in68 with some existing results is carried out. For this purpose, some well-known methods, such as BBO-LP69, BIP70, HWOA71, WOA72, MWCA73, and SA-LP74, are considered for the same system and presented in Table 9. From Table 9, the total relay operating time obtained by the proposed algorithm is better than the existing results.
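A hedged sketch of how such a relay-coordination objective can be posed for a population-based optimizer is given below. The exact relay characteristic and constraint data are taken from Ref.68 and are not reproduced here; the sketch assumes the widely used IEC normal-inverse characteristic and a simple penalty for violated coordination margins, so it is illustrative rather than the paper's exact formulation:

```python
import numpy as np

def relay_operating_time(tds, ps, i_fault, ct_ratio):
    """ASSUMED IEC normal-inverse over-current characteristic; the multiple of
    pickup current m must be greater than 1 for the expression to be valid."""
    m = i_fault / (ps * ct_ratio)
    return tds * 0.14 / (m ** 0.02 - 1.0)

def coordination_objective(x, fault_currents, ct_ratios, pairs, cti=0.3):
    """Penalized objective for a 14-relay coordination problem (illustrative sketch).

    x packs the TDS values followed by the PS values for all relays; 'pairs' lists
    (primary, backup) relay indices that must keep the 0.3 s coordination margin.
    """
    n = len(ct_ratios)
    tds, ps = x[:n], x[n:]
    t = np.array([relay_operating_time(tds[i], ps[i], fault_currents[i], ct_ratios[i])
                  for i in range(n)])
    penalty = sum(max(0.0, cti - (t[b] - t[p])) for p, b in pairs)   # violated margins
    return t.sum() + 1e3 * penalty            # total operating time plus constraint penalty
```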

Convergence response curve of HCO for relay coordination problem.

In this engineering problem, the HCO algorithm is used as a tuning method for the proportional integral derivative (PID) controller used in the mechanical ventilator (MV) system in the intensive care unit (ICU). The parameters of the conventional PID controller must be set efficiently such that the ventilator system can provide sufficient air to maintain a stable air pressure in the lung system. The details of the mathematical modeling and the associated constraints of the MV are taken from Ref.75. For simulation purposes, the transfer function (\({{G}_{1}}(s)\)) of the patient-hose system with a lung compliance of 20 ml/mbar and a lung airway resistance of 5 mbar s/l is expressed as follows75:

The transfer function of the blower system (\({{G}_{2}}(s)\)) is taken as75:

As the desired breathing pattern, a unit pulse of period 2 is selected. The initial ranges of the PID controller parameters (\({{K}_{p}}\), \({{K}_{i}}\), and \({{K}_{d}}\)) are taken from27 as follows:

\({1\le {K}_{p}\le 2}\),

\({100\le {{K}_{i}}\le 200}\),

\({0\le {{K}_{d}}\le 0.1}\).

For the system in (17), a PID feedback controller is optimized with the HCO algorithm. The convergence performance of the HCO algorithm in designing an optimal PID controller, compared with some existing algorithms27,76, is presented in Fig. 8a. The performance of the HCO-optimized PID controller for the ventilator system is compared with some existing results, as presented in Table 10 and graphically shown in Fig. 8b. It is observed that the response of the ventilator in terms of rise time and settling time with the HCO-PID controller is better than that of the existing results27,76.
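A sketch of the tuning objective is given below. The transfer functions \(G_1(s)\) and \(G_2(s)\) are taken from Ref.75 and are represented here only by placeholder numerator/denominator coefficient lists; the sketch assumes an integral-squared-error cost on the closed-loop unit-step response, which any optimizer (including HCO) could then minimize over the stated gain ranges:

```python
import numpy as np
from scipy.signal import TransferFunction, step

def pid_step_cost(kp, ki, kd, plant_num, plant_den, t_end=2.0):
    """Integral-squared-error of the closed-loop unit-step response for a PID gain set.

    'plant_num'/'plant_den' are placeholder coefficient lists for the ventilator plant
    (the actual G1(s), G2(s) are given in Ref. 75); the ISE cost is an assumed objective.
    """
    # PID controller C(s) = (kd*s^2 + kp*s + ki) / s
    c_num, c_den = [kd, kp, ki], [1.0, 0.0]
    # Open loop L(s) = C(s) * G(s); closed loop T(s) = L / (1 + L)
    l_num = np.polymul(c_num, plant_num)
    l_den = np.polymul(c_den, plant_den)
    t_num, t_den = l_num, np.polyadd(l_den, l_num)
    t, y = step(TransferFunction(t_num, t_den), T=np.linspace(0.0, t_end, 2000))
    return float(np.sum((1.0 - y) ** 2) * (t[1] - t[0]))   # ISE against a unit-step reference

# Search ranges from the text: 1 <= Kp <= 2, 100 <= Ki <= 200, 0 <= Kd <= 0.1
bounds = [(1.0, 2.0), (100.0, 200.0), (0.0, 0.1)]
```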

Performance of HCO for PID controller design for ventilator system: (a) Convergence graph. (b) Response of HCO-PID controller for ventilator model.

In this paper, a novel nature-inspired metaheuristic optimization algorithm named the Human Conception Optimizer (HCO) is developed to solve real-world optimization problems. The proposed algorithm is simple to understand and effective, and it is based on a natural process that exists because of the evolution of human beings. The ability of the proposed algorithm has been tested with the classical CEC-2005 and CEC-2020 benchmark functions (BMFs). A comparative analysis of the HCO algorithm with some existing results has also been performed for both sets of benchmark functions, and the simulation results showed the superiority of the proposed algorithm. As observed in Table 3, for most of the CEC-2005 BMFs, the proposed algorithm performed better than the existing methods. Moreover, the statistical significance of the HCO algorithm is observed in Table 6 for the 30D CEC2020 BMFs. For most complex higher-dimensional test functions, the proposed algorithm performed efficiently. It can therefore be stated that the HCO algorithm can be applied to find solutions for different complex optimization problems, as tested with the CEC-2005 BMFs, 30D CEC2020 BMFs, and 500D CEC2020 BMFs. For the validation of the proposed algorithm on real-world problems, an optimal over-current relay coordination problem in a complex distribution network and an optimal PID controller design for an artificial human ventilator system have been examined and compared with existing results. For a complex IEEE 8-bus power distribution system with 14 over-current relays, the proposed method optimized the total relay operating time with optimal coordination among all primary and secondary relays. The total optimal relay operating time achieved by the HCO algorithm for the chosen system is 1.96 s, as presented in Table 9, whereas it is 8.56 s using BBO-LP69, 8.69 s using BIP70, 5.86 s using HWOA71, and 5.95 s using MWCA73. A significant improvement in total relay operating time is thus observed with the proposed algorithm. The initial random population of PS (plug setting) and TDS (time dial setting) values for the 14 relays in the IEEE 8-bus system is efficiently selected during the healthy-population generation stage proposed in the algorithm (Fig. 3). With the proposed algorithm, a gain of 50% to 60% in total relay operating time is observed compared with some existing results for the same system, as presented in Table 9. Thus, the practical novelty of the proposed algorithm is demonstrated on this real system. For the other engineering problem, an optimal PID controller was designed for a mechanical ventilator model. The convergence performance in finding the optimal solution for the ventilator model is better in terms of response time and settling time, within an acceptable steady-state error, than that of the existing methods, as found in Fig. 8a. The HCO takes fewer iterations than the CTO algorithm to find an optimal solution. Although the PSO algorithm requires fewer iterations than the HCO, the minimum fitness value is achieved by the proposed algorithm compared with both CTO and PSO. The transient response of the system is also significantly improved by the HCO algorithm, as observed in Fig. 8b. This clearly indicates that the proposed algorithm can perform better than some existing algorithms in handling real-world problems.

Moreover, the practical application of the proposed algorithm has some limitations related to the size of real-world complex optimization problems, which is clearly observed from the simulation results. For the classical benchmark functions of CEC 2005, the HCO algorithm converges smoothly for most BMFs, as shown in Fig. 4. With an increase in dimension and complexity of the BMFs, such as the 30- and 500-dimensional CEC2020 functions, the HCO faces local trapping several times during the simulation, and it takes more than 50 iterations to overcome the local trapping problem, as observed in Figs. 5c,d and 6d. In real-world applications, the same problem is observed during the simulation. As observed in Fig. 7, with the HCO algorithm the total relay operating time is found to be much better than the existing results due to the efficient selection of the initial search variables (TDS and PS), at the cost of an initial local trapping problem for up to 10 iterations. Thus, for such complex optimization problems, the HCO algorithm has limitations. As the concept of the natural conception process is directly utilised to model the algorithm, the performance of the HCO algorithm can be further improved by adopting other schemes such as multi-level, mutation, crossover, and chaotic search concepts.

The corresponding author will disclose the datasets utilized and/or processed throughout the current work upon reasonable request.

Feng, L., Sun, X., Tian, X. & Diao, K. Direct torque control with variable flux for an srm based on hybrid optimization algorithm. IEEE Trans. Power Electron. 37, 6688–6697 (2022).


Jin, Z., Sun, X., Lei, G., Guo, Y. & Zhu, J. Sliding mode direct torque control of spmsms based on a hybrid wolf optimization algorithm. IEEE Trans. Ind. Electron. 69, 4534–4544 (2021).


Jia, Y.-H., Mei, Y. & Zhang, M. A bilevel ant colony optimization algorithm for capacitated electric vehicle routing problem. IEEE Trans. Cybern. 52, 10 (2021).


Das, P., Das, D. K. & Dey, S. A new class topper optimization algorithm with an application to data clustering. IEEE Trans. Emerg. Top. Comput. 8, 948 (2018).


Mirjalili, S., Mirjalili, S. M. & Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 69, 46–61 (2014).


Kennedy, J. & Eberhart, R. Particle swarm optimization. In Proc. ICNN’95-International Conference on Neural Networks, Vol. 4, 1942–1948 (IEEE, 1995).

Mataifa, H., Krishnamurthy, S. & Kriger, C. Volt/var optimization: A survey of classical and heuristic optimization methods. IEEE Access. 10, 13379–13399 (2022).


Li, D. et al. Aging state prediction for supercapacitors based on heuristic kalman filter optimization extreme learning machine. Energy 250, 123773 (2022).


Ferro, G., Robba, M., Haider, R. & Annaswamy, A. M. A distributed optimization based architecture for management of interconnected energy hubs. IEEE Trans. Control Netw. Syst.https://doi.org/10.1109/TCNS.2022.3165022 (2022).


Sang-To, T., Hoang-Le, M., Wahab, M. A. & Cuong-Le, T. An efficient planet optimization algorithm for solving engineering problems. Sci. Rep. 12, 1–18 (2022).


Du, J., Zhang, Z., Li, M., Guo, J. & Zhu, K. Optimal scheduling of integrated energy system based on improved grey wolf optimization algorithm. Sci. Rep. 12, 1–19 (2022).


Tan, K. C., Feng, L. & Jiang, M. Evolutionary transfer optimization—A new frontier in evolutionary computation research. IEEE Comput. Intell. Mag. 16, 22–33 (2021).


Jia, H., Peng, X. & Lang, C. Remora optimization algorithm. Expert Syst. Appl. 185, 115665 (2021).


You, J. B. et al. Machine learning for sperm selection. Nat. Rev. Urol. 18, 387–403 (2021).


Raouf, O. A. & Hezam, I. M. Sperm motility algorithm: A novel metaheuristic approach for global optimisation. Int. J. Oper. Res. 28, 143–163 (2017).


Shehadeh, H. A., Ahmedy, I. & Idris, M. Y. I. Sperm swarm optimization algorithm for optimizing wireless sensor network challenges. In Proc. 6th International Conference on Communications and Broadband Networking, 53–59 (2018).

Shehadeh, H. A., Idna Idris, M. Y. & Ahmedy, I. Multi-objective optimization algorithm based on sperm fertilization procedure (mosfp). Symmetry 9, 241 (2017).


Holland, J. H. Genetic algorithms. Sci. Am. 267, 66–73 (1992).


Rechenberg, I. Evolution strategy: Nature's way of optimization. In Optimization: Methods and Applications, Possibilities and Limitations (ed. Bergmann, H. W.) 106–126 (Springer, 1989).


Koza, J. R. & Poli, R. Genetic programming. In Search Methodologies (eds Burke, E. K. & Kendall, G.) 127–164 (Springer, 2005).


Van Laarhoven, P. J. & Aarts, E. H. Simulated annealing. In Simulated Annealing: Theory and Applications (eds Aarts, E. & van Laarhoven, P. J.) 7–15 (Springer, 1987).


Rashedi, E., Nezamabadi-Pour, H. & Saryazdi, S. Gsa: A gravitational search algorithm. Inf. Sci. 179, 2232–2248 (2009).


Erol, O. K. & Eksin, I. A new optimization method: Big bang-big crunch. Adv. Eng. Softw. 37, 106–111 (2006).


Azizi, M. Atomic orbital search: A novel metaheuristic algorithm. Appl. Math. Model. 93, 657–683 (2021).


Kaveh, A. & Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 213, 267–289 (2010).

Dorigo, M., Birattari, M. & Stutzle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 1, 28–39 (2006).

Acharya, D. & Das, D. K. Swarm optimization approach to design PID controller for artificially ventilated human respiratory system. Comput. Methods Progr. Biomed. 198, 105776 (2020).

Mirjalili, S. & Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 95, 51–67 (2016).

Shadravan, S., Naji, H. R. & Bardsiri, V. K. The sailfish optimizer: A novel nature-inspired metaheuristic algorithm for solving constrained engineering optimization problems. Eng. Appl. Artif. Intell. 80, 20–34 (2019).

Srivastava, A. & Das, D. K. A bottlenose dolphin optimizer: An application to solve dynamic emission economic dispatch problem in the microgrid. Knowl.-Based Syst. 243, 108455 (2022).

Rao, R. V., Savsani, V. J. & Vakharia, D. Teaching-learning-based optimization: A novel method for constrained mechanical design optimization problems. Comput. Aided Des. 43, 303–315 (2011).

He, S., Wu, Q. H. & Saunders, J. R. Group search optimizer: An optimization algorithm inspired by animal searching behavior. IEEE Trans. Evol. Comput. 13, 973–990 (2009).

Atashpaz-Gargari, E. & Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In 2007 IEEE Congress on Evolutionary Computation, 4661–4667 (IEEE, 2007).

Srivastava, A. & Das, D. K. Criminal search optimization algorithm: A population-based meta-heuristic optimization technique to solve real-world optimization problems. Arab. J. Sci. Eng. 47, 3551–3571 (2022).

Wolpert, D. H. & Macready, W. G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1, 67–82 (1997).

Geem, Z. W., Kim, J. H. & Loganathan, G. V. A new heuristic optimization algorithm: Harmony search. Simulation 76, 60–68 (2001).

Chu, S.-C., Tsai, P.-W. & Pan, J.-S. Cat swarm optimization. In Pacific Rim International Conference on Artificial Intelligence, 854–858 (Springer, 2006).

Mucherino, A. & Seref, O. Monkey search: A novel metaheuristic search for global optimization. In AIP Conference Proceedings, Vol. 953, 162–173 (American Institute of Physics, 2007).

Lu, X. & Zhou, Y. A novel global convergence algorithm: Bee collecting pollen algorithm. In International Conference on Intelligent Computing, 518–525 (Springer, 2008).

Shiqin, Y., Jianjun, J. & Guangxing, Y. A dolphin partner optimization. In 2009 WRI Global Congress on Intelligent Systems, Vol. 1, 124–128 (IEEE, 2009).

Tan, Y. & Zhu, Y. Fireworks algorithm for optimization. In International Conference in Swarm Intelligence, 355–364 (Springer, 2010).

Gandomi, A. H. & Alavi, A. H. Krill herd: A new bio-inspired optimization algorithm. Commun. Nonlinear Sci. Numer. Simul. 17, 4831–4845 (2012).

Yang, X.-S. Flower pollination algorithm for global optimization. In International Conference on Unconventional Computing and Natural Computation, 240–249 (Springer, 2012).

Eskandar, H., Sadollah, A., Bahreininejad, A. & Hamdi, M. Water cycle algorithm—A novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput. Struct. 110, 151–166 (2012).

Sadollah, A., Bahreininejad, A., Eskandar, H. & Hamdi, M. Mine blast algorithm: A new population based algorithm for solving constrained engineering optimization problems. Appl. Soft Comput. 13, 2592–2612 (2013).

Ramezani, F. & Lotfi, S. Social-based algorithm (SBA). Appl. Soft Comput. 13, 2837–2856 (2013).

Zheng, Y.-J. Water wave optimization: A new nature-inspired metaheuristic. Comput. Oper. Res. 55, 1–11 (2015).

Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 89, 228–249 (2015).

Kashan, A. H. A new metaheuristic for optimization: Optics inspired optimization (OIO). Comput. Oper. Res. 55, 99–125 (2015).

Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput. Appl. 27, 1053–1073 (2016).

Jaddi, N. S., Alvankarian, J. & Abdullah, S. Kidney-inspired algorithm for optimization problems. Commun. Nonlinear Sci. Numer. Simul. 42, 358–369 (2017).

Saremi, S., Mirjalili, S. & Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 105, 30–47 (2017).

Srivastava, A. & Das, D. K. A new kho-kho optimization algorithm: An application to solve combined emission economic dispatch and combined heat and power economic dispatch problem. Eng. Appl. Artif. Intell. 94, 103763 (2020).

Saggiorato, G. et al. Human sperm steer with second harmonics of the flagellar beat. Nat. Commun. 8, 1–9 (2017).

Eisenbach, M. & Giojalas, L. C. Sperm guidance in mammals—An unpaved road to the egg. Nat. Rev. Mol. Cell Biol. 7, 276–285 (2006).

Kirkman-Brown, J. C., Sutton, K. A. & Florman, H. M. How to attract a sperm. Nat. Cell Biol. 5, 93–95 (2003).

Budrikis, Z. Sperm swimming is more complicated than thought. Nat. Rev. Phys. 2, 461–461 (2020).

Gaffney, E. A., Ishimoto, K. & Walker, B. J. Modelling motility: The mathematics of spermatozoa. Front. Cell Dev. Biol. 9, 710825 (2021).

Raveshi, M. R. et al. Curvature in the reproductive tract alters sperm-surface interactions. Nat. Commun. 12, 1–10 (2021).

Suarez, S. S. Control of hyperactivation in sperm. Hum. Reprod. Update 14, 647–657 (2008).

Leung, E. T. et al. Simulating nature in sperm selection for assisted reproduction. Nat. Rev. Urol. 19, 16–36 (2022).

Ravaux, B., Garroum, N., Perez, E., Willaime, H. & Gourier, C. A specific flagellum beating mode for inducing fusion in mammalian fertilization and kinetics of sperm internalization. Sci. Rep. 6, 1–13 (2016).

Zhang, Z. et al. Human sperm rheotaxis: A passive physical process. Sci. Rep. 6, 1–8 (2016).

Tian, F.-B. & Wang, L. Numerical modeling of sperm swimming. Fluids 6, 73 (2021).

Liu, Q.-Y., Tang, X.-Y., Chen, D.-D., Xu, Y.-Q. & Tian, F.-B. Hydrodynamic study of sperm swimming near a wall based on the immersed boundary-lattice Boltzmann method. Eng. Appl. Comput. Fluid Mech. 14, 853–870 (2020).

Choudhary, A., Paul, S., Rühle, F. & Stark, H. How inertial lift affects the dynamics of a microswimmer in Poiseuille flow. Commun. Phys. 5, 1–9 (2022).

Lin, S., Ke, M., Zhang, Y., Yan, Z. & Wu, J. Structure of a mammalian sperm cation channel complex. Nature 595, 746–750 (2021).

Amraee, T. Coordination of directional overcurrent relays using seeker algorithm. IEEE Trans. Power Deliv. 27, 1415–1422 (2012).

Albasri, F. A., Alroomi, A. R. & Talaq, J. H. Optimal coordination of directional overcurrent relays using biogeography-based optimization algorithms. IEEE Trans. Power Deliv. 30, 1810–1820 (2015).

Corrêa, R., Cardoso, G. Jr., de Araújo, O. C. & Mariotto, L. Online coordination of directional overcurrent relays using binary integer programming. Electric Power Syst. Res. 127, 118–125 (2015).

Sarwagya, K., Nayak, P. K. & Ranjan, S. Optimal coordination of directional overcurrent relays in complex distribution networks using sine cosine algorithm. Electric Power Syst. Res. 187, 106435 (2020).

Korashy, A., Kamel, S., Jurado, F. & Youssef, A.-R. Hybrid whale optimization algorithm and grey wolf optimizer algorithm for optimal coordination of direction overcurrent relays. Electric Power Compon. Syst. 47, 644–658 (2019).

Korashy, A., Kamel, S., Youssef, A.-R. & Jurado, F. Modified water cycle algorithm for optimal direction overcurrent relays coordination. Appl. Soft Comput. 74, 10–25 (2019).

Kida, A. A., Rivas, A. E. L. & Gallego, L. A. An improved simulated annealing linear programming hybrid algorithm applied to the optimal coordination of directional overcurrent relays. Electric Power Syst. Res. 181, 106197 (2020).

Hunnekens, B., Kamps, S. & Van De Wouw, N. Variable-gain control for respiratory systems. IEEE Trans. Control Syst. Technol. 28, 163–171 (2018).

Sakthiya Ram, S., Kumar, C., Ramesh Kumar, A. & Rajesh, T. Hybrid optimization techniques based automatic artificial respiration system for corona patient. Automatika 63, 226–243 (2022).

Debasis Acharya & Dushmanta Kumar Das: Department of Electrical and Electronics Engineering, National Institute of Technology Nagaland, Dimapur, 797103, India.

Author contributions: D.A.: hypothesis, method and methodology, original drafting. D.K.D.: hypothesis, supervision, review and correction.

Correspondence to Dushmanta Kumar Das.

The authors declare no competing interests.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Acharya, D., Das, D.K. A novel Human Conception Optimizer for solving optimization problems. Sci Rep 12, 21631 (2022). https://doi.org/10.1038/s41598-022-25031-6

Received: 12 September 2022

Accepted: 23 November 2022

Published: 14 December 2022

DOI: https://doi.org/10.1038/s41598-022-25031-6
