Obtaining Anti-Missile Decoy Launch Solution from a Ship Using Machine Learning Techniques.
DOI:
https://doi.org/10.9781/ijimai.2021.11.001
Keywords:
Machine Learning, Missile, Decoys, Multilayer Perceptron, Support Vector Machine
Abstract
One of the most dangerous situations a warship may face is a missile attack launched from other ships, aircraft, submarines or land. Moreover, in the current scenario it cannot be ruled out that a terrorist group could acquire missiles and use them against ships operating close to the coast, which increases their vulnerability due to the limited reaction time. One of the means the ship has for its defense is decoys, designed to deceive the enemy missile. However, for their use to be effective, a valid launching solution must be obtained quickly. The purpose of this article is to design a methodology that solves the decoy-launching problem and immediately provides the ship with the data needed to make the firing decision. To solve the problem, machine learning models (neural networks and support vector machines) are trained on a data set obtained from simulations. The performance measures obtained with the multilayer perceptron models allow the current procedures, based on tables and launching rules, to be replaced with machine learning algorithms that are more flexible and adaptable to a larger number of scenarios.
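The approach described in the abstract — training a multilayer perceptron on simulated threat scenarios so that it outputs a launch solution directly — can be sketched as follows. This is a minimal illustration, not the article's implementation: the feature set (missile bearing, range, relative wind), the target variables (launch bearing and delay), and the synthetic data generator below are all assumptions standing in for the article's simulation data.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in for the simulation data set: each row is a threat scenario.
# Columns (hypothetical): missile bearing (deg), missile range (km),
# relative wind direction (deg).
n = 2000
X = np.column_stack([
    rng.uniform(0, 360, n),   # missile bearing
    rng.uniform(2, 20, n),    # missile range
    rng.uniform(0, 360, n),   # relative wind direction
])

# Stand-in launch solution: decoy launch bearing (deg) and launch delay (s).
# A real data set would come from the countermeasure simulations; this toy
# mapping exists only so the regressor has something smooth to learn.
y = np.column_stack([
    0.7 * X[:, 0] + 0.2 * X[:, 2],  # toy launch bearing
    X[:, 1] / 0.3,                  # toy launch delay
])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Multilayer perceptron regressor (scaled inputs), in the spirit of the
# article's MLP models; architecture and hyperparameters are illustrative.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
)
model.fit(X_tr, y_tr)

# Once trained, a single forward pass yields the launch solution for a new
# scenario fast enough to support an immediate firing decision.
print(model.predict(X_te[:1]))
print(round(model.score(X_te, y_te), 3))
```

In this setup the trained network replaces table lookups and launching rules: any scenario inside the training distribution gets an interpolated solution in one inference step, which is the flexibility advantage the abstract claims for the machine learning approach.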