5. CONCLUSION
This study presented a novel algorithm inspired by the happiness behavior of personnel in the workplace. Three criteria were defined over the whole search space, making the approach adjustable to both lower- and higher-dimensional problems.
The experimental results, together with the statistical measures and the Wilcoxon rank test, showed that the HPO algorithm is more reliable, robust, flexible, and stable than the other algorithms. This work focused on balancing exploration and exploitation by tuning the damping operators and the aforementioned criteria, as well as on covering different search spaces. For future work, we plan to combine our method with neural networks and fuzzy systems, such as the multilayer perceptron and the adaptive-network-based fuzzy inference system (ANFIS), in order to adjust their weight parameters; these models encode the knowledge of a system and support classification, clustering, and estimation tasks. On the other hand, by improving our method, we can propose a multi-objective algorithm for complex real-world problems.
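The Wilcoxon signed-rank test mentioned above can be sketched as follows. This is a minimal illustrative implementation using only the standard library, assuming paired samples (e.g. per-seed best fitness values of two optimizers on the same benchmark); it uses the normal approximation of the null distribution and omits tie and zero corrections, so it is a sketch rather than a full statistical routine.

```python
import math

def wilcoxon_signed_rank(a, b):
    """Wilcoxon signed-rank test (normal approximation, no tie correction).

    Compares paired samples a and b and returns (W, z, p), where W is the
    smaller of the positive/negative signed-rank sums and p is the
    two-sided p-value from the normal approximation.
    """
    # Signed differences; zero differences are discarded, as is standard.
    diffs = [x - y for x, y in zip(a, b) if x != y]
    n = len(diffs)
    # Rank the absolute differences, averaging ranks over ties.
    ordered = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of the 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg
        i = j + 1
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)
    w = min(w_plus, w_minus)
    # Normal approximation for the null distribution of W.
    mu = n * (n + 1) / 4
    sigma = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    z = (w - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return w, z, p
```

In practice one would reject the null hypothesis of equal performance at a chosen significance level (commonly 0.05) when p falls below it; for small samples an exact table, rather than the normal approximation, is preferable.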