Chunlei Li, Chunming Rong and Jayachander Surbiryala, University of Bergen, Norway
Outsourcing data storage and computation to cloud service providers (CSPs) is a popular practice in cloud computing. Although cloud computing technology has undergone rapid development in recent years, the security issue has not been properly addressed so far, and it remains a roadblock to the large-scale deployment of cloud computing. Recently, the idea of securing big data by protecting its storage path was proposed by Rong et al. While the general idea was clearly described in their work, the problem of generating a storage path secured by a trapdoor function remained unsolved. In this paper, we further extend the idea by introducing a middleware between a tenant and the cloud. The roles of the middleware in the scheme are twofold: it acts as a reliable communicator between the tenant and the cloud, and it integrates a toolkit that generates secure storage paths according to the inputs from the cloud and the tenant. More specifically, the idea of further partitioning the dataset into client-defined basic data units, together with algorithms that efficiently generate parameterized big permutations according to the tenant's configurations of secret parameters, is integrated into the middleware. In this way, the storage path cannot be recovered without knowledge of the exact client inputs. Therefore, this paper not only addresses the crucial problem left unsolved in their work, but also contributes to increasing the security and flexibility of the scheme. Our analysis indicates that the proposed scheme is a secure, efficient and flexible approach to storing, accessing and sharing big data in the cloud.
cloud storage, cryptography, security, storage allocation, data partition.
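As an illustration of the kind of secret-parameterized permutation the middleware could generate over client-defined data units, here is a minimal sketch using a keyed Fisher-Yates shuffle seeded by a tenant secret. The secret value and unit names are hypothetical; the paper's actual trapdoor construction is not specified and is not reproduced here.

```python
import hashlib
import random

def keyed_permutation(n, secret):
    """Derive a permutation of range(n) from a tenant secret:
    without the secret, the storage order cannot be recovered."""
    seed = int.from_bytes(hashlib.sha256(secret).digest(), "big")
    order = list(range(n))
    random.Random(seed).shuffle(order)   # deterministic for this secret
    return order

units = ["unit-%d" % i for i in range(6)]     # client-defined data units
order = keyed_permutation(len(units), b"tenant-secret")  # hypothetical secret
scattered = [units[i] for i in order]         # storage order in the cloud
restored = [None] * len(units)
for pos, i in enumerate(order):               # invert with the secret
    restored[i] = scattered[pos]
```

The same secret always yields the same permutation, so a tenant holding the secret can both scatter and reassemble the data units.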
Junfeng Xu, Jin Yi and Weihua Hu, China Information Technology Security Evaluation Center, Beijing, China
With the rapid development of 3G/4G technology and the wide popularity of mobile terminals, the mobile Internet and its massive number of applications have become the most rapidly developing business of the Internet. The tight integration of mobile communication technology and Internet technology not only brings an unprecedented experience to users, but also brings new security problems. In particular, malicious traffic on mobile terminals, driven by illegal plug-ins and rampant Trojan viruses, makes mobile data traffic surge and brings losses to users. Therefore, it is important to study security detection technology for mobile Internet network traffic. After summarizing traditional network traffic security detection technology, this paper systematically discusses mobile network traffic security detection from three perspectives: network traffic measurement and identification technology, network traffic characteristics security detection technology, and network traffic content security detection technology. This paper also points out the difficult open problems in this area, and compares and analyzes the relevant techniques. Finally, we summarize the main content of this paper and future research directions in mobile Internet network traffic security detection.
Mobile software security, network traffic, security detection.
Huafei Zhu, Nanyang Technology University, Singapore
Electronic health records (EHR) greatly enhance the convenience of cross-domain sharing and have been proven effective in improving the quality of healthcare. On the other hand, the sharing of sensitive medical data faces critical security and privacy issues, which have become an obstacle preventing EHR from being widely adopted. In this paper, we address several challenges concerning the data privacy of very important patients (VIPs), including how to protect a VIP's identity by using a pseudonym, how to enable a doctor to update an encrypted EHR in the VIP's absence, how to help a doctor link up and decrypt a patient's historical EHRs for secondary use in a secure environment, and so on. We then propose a framework for secure EHR data management. In our framework, we use a transitive pseudonym generation technique to allow a patient to vary his/her identity in each hospital visit. We separate metadata from detailed EHR data in storage, so that the security of the EHR data is guaranteed by the security of both the central server and the local servers in all involved hospitals. Furthermore, in our framework, a hospital can encrypt and upload a patient's EHR when he/she is absent; a patient can help to download and decrypt his/her previous EHRs from the central server; and a doctor can decrypt a patient's historical EHRs for secondary use with the help of, and under audit by, several proxies.
Electronic health record, pseudonym, semantic security, transitive pseudonym.
Chhaya S Dule1, Dr Girijamma H A2, Dr Rajasekharaiah K.M3, 1KG Reddy College of Engg. and Tech., Hyderabad (JNTUH), India, 2RNS Institute of Technology, Bengaluru (VTU Belgaum), India and 3KG Reddy College of Engg. and Tech., Hyderabad (JNTUH), India
Around the world, various methods are being adopted to create systematic identification for citizens. In 2009, the UIDAI, a government body of India, initiated a 12-digit number called the Aadhar number, generated from a fusion of an individual's biometric and demographic data, as their identity. Ensuring the highest level of data protection, as well as privacy preservation of the Aadhar card as it propagates through a network, requires an efficient security model that is synchronous with cloud infrastructure. This paper first investigates the existing approaches to Aadhar card security and their limitations, and then proposes a security model, namely ECrypto-AaDhaar, based on a cryptographic approach synchronous with the cloud architecture. The privacy of the demographic and biometric information associated with the Aadhar card is preserved by a statistical crypto-blocking operation performed prior to propagating it through the network in the context of cloud infrastructure. The study also presents an experimental analysis to demonstrate the performance of the ECrypto-AaDhaar technique from a time complexity perspective.
Aadhar, Image Cryptography, Statistical crypto-blocking mechanism.
Kan Luo1, Siyuan Wang1, An Wei2, Wei Yu1 and Kai Hu1, 1School of Computer Science and Engineering, Beihang University, Beijing, China and 2China Mobile (Hangzhou) Information Technology Co., Ltd.
Software-as-a-Service (SaaS) is a software delivery model that encompasses composition, development and execution on cloud platforms, and massive numbers of SaaS applications need to be verified before deployment. To obtain the verification results of a large quantity of applications in a tolerable time, verification algebra (VA) is used to cut down the number of combinations to be verified. VA is an effective way to acquire the verification status by reusing previous results: in VA, a verification result is calculated without knowing the process of verification. In this way, verification tasks can be distributed to servers and executed in any order. This paper proposes a method called the component disassembly tree to decompose a complex SaaS application, and designs a parallel verification scheme for the cloud environment. The optimization of execution is discussed, and the proposed parallel schema is simulated in MapReduce.
Verification, SaaS, Components Combinations
Sami Ouali, College of Applied Sciences, Ibri, Oman
Software Product Line (SPL) is a paradigm used to improve reuse. This capacity is ensured by the management of the common and variable parts of a set of products. An SPL aims to increase productivity and quality and to reduce maintenance costs and time to market; however, an inappropriate implementation of an SPL can lead to code smells or code anomalies. These code smells are symptoms that something may be wrong in the source code, which can have an impact on the quality of the products derived from an SPL. Such practices can have a domino effect, because the same problem can be present in many products due to reuse. Refactoring can be a solution to this problem: it improves the internal structure of source code without altering its external behavior. This paper proposes an approach to reducing code smells in SPLs through refactoring. The approach takes into account the transformation from code to design through reverse engineering.
Software Product Line, Code smells, Refactoring, Reverse Engineering.
Mabroukah Amarif1 and Ibtusam Alashoury2, 1,2Department of Computer Sciences, Sebha University, Sebha, Libya
Finding the shortest path between two points is one of the great challenges facing researchers nowadays. Many algorithms and mechanisms have been designed, each following a certain approach and an adopted data structure. The most famous and widely used is Dijkstra's algorithm, which finds the shortest path between two points on a graph data structure. It is possible to read an implicit path out of the solution path, but the searching time varies according to the type of data structure used to store the solution path. This paper extends Dijkstra's algorithm by using a linked hash map data structure to store the produced shortest solution path, and then investigates the subsequent implicit paths within this data structure. The results show that searching within the given data structure takes much less time than restarting the algorithm to search for the same path.
Dijkstra algorithm, data structure, linked hash map, time complexity, implicit path, graph.
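A minimal sketch of the idea follows, using a Python dict (which preserves insertion order, standing in for a linked hash map): once Dijkstra's algorithm has produced a shortest path, any implicit sub-path between two nodes on it can be read straight out of the stored structure without re-running the search. The graph and node names are illustrative, not from the paper.

```python
import heapq

def dijkstra_path(graph, src, dst):
    """Standard Dijkstra; returns the shortest path src->dst as a list."""
    dist, prev, pq = {src: 0}, {}, [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue                    # stale queue entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst                # walk predecessors back to src
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    path.reverse()
    return path

def store_path(path):
    """Store the solution path in an insertion-ordered map
    (a stand-in for a linked hash map): node -> position."""
    return {node: i for i, node in enumerate(path)}

def implicit_path(stored, a, b):
    """Read an implicit sub-path a->b out of the stored structure,
    without re-running Dijkstra."""
    nodes = list(stored)                # preserves insertion order
    i, j = stored[a], stored[b]
    if i > j:
        return None                     # b does not follow a on this path
    return nodes[i:j + 1]

graph = {"A": {"B": 1, "C": 4}, "B": {"C": 1, "D": 5},
         "C": {"D": 1}, "D": {}}
full = dijkstra_path(graph, "A", "D")        # A-B-C-D
stored = store_path(full)
sub = implicit_path(stored, "B", "D")        # B-C-D, no re-search
```

The sub-path lookup is two hash probes and a slice, which is the source of the speedup over restarting the algorithm.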
Mabroukah Amarif1 and Sakeenah Ahmed2, 1,2Department of Computer Sciences, Sebha University, Sebha, Libya
The role of a variable is interpreted as its required task or behavior in any part of a program. This role contributes to an easy understanding of the program and thus helps formulate it clearly and unambiguously. Many novice programmers face various difficulties in understanding programming, especially Object Oriented Programming. This research adopts the design of a visualization tool that includes a visual model showing the role of a reference variable (an object) within a Java program, to enhance comprehension for novice programmers. The model enables them to interact with, and thus formulate, an object-oriented program in an intuitive and clear way. Based on actual experimentation, the effectiveness of this model is confirmed and the importance of this research in the field of object-oriented programming is demonstrated.
Role of variable, object oriented programming, visualization, understanding
Mohamed Elmassry and Saad Al-Ahmadi, Department of Computer Science, King Saud University, Riyadh, Saudi Arabia
Enterprise Resource Planning (ERP) is software that manages and automates the internal processes of an organization. Process speed and quality can be increased, and costs reduced, by process automation. Odoo is an open source ERP platform including more than 15000 apps. ERP systems such as Odoo are all-in-one management systems. Odoo can be suitable for small and medium organizations but, due to efficiency limitations, it is not suitable for large ones. Furthermore, Odoo can be implemented on either local or public servers, each of which has advantages and disadvantages, such as internet speed, data synchronization, or anywhere access. In many cases, there is a persistent need to have more than one synchronized Odoo instance in several physical places. We modified Odoo to support this kind of requirement and improved its efficiency by replacing its standard database with a distributed one, namely CockroachDB.
Odoo, ERP, distributed ERP systems, distributed database, CockroachDB, open source ERP.
Rina Komatsu and Tad Gonsalves, Sophia University, Tokyo, Japan
Digital images often contain “noise”, which takes away their clarity and sharpness. Most existing denoising algorithms do not offer the best solution because of difficulties such as removing strong noise while leaving the features and other details of the image intact. Faced with the problem of denoising, we tried solving it with a Convolutional Neural Network architecture called the “U-Net”. This paper deals with the training of a U-Net to remove 3 different kinds of noise: Gaussian noise, blockiness, and camera shake. Our results indicate the effectiveness of the U-Net in denoising images while leaving their features and other details intact.
Deep Learning, Image Processing, Denoising, Convolutional Neural Network, U-Net.
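The U-Net above is trained on (noisy, clean) image pairs. As a hedged illustration of how a Gaussian-noise training pair could be synthesized (the paper does not state its noise parameters; the sigma and toy image here are assumptions), a stdlib-only sketch:

```python
import random

def add_gaussian_noise(image, sigma=0.1, seed=0):
    """Return a noisy copy of a grayscale image (values in [0, 1]),
    clipped back into range; (noisy, clean) pairs like this are the
    kind of data a denoising U-Net is trained on."""
    rng = random.Random(seed)
    return [[min(1.0, max(0.0, px + rng.gauss(0.0, sigma)))
             for px in row] for row in image]

clean = [[0.5] * 8 for _ in range(8)]        # toy 8x8 "image"
noisy = add_gaussian_noise(clean, sigma=0.1)
# training pair: input = noisy, target = clean
```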
Junta Watanabe and Tad Gonsalves, Sophia University, Tokyo, Japan
Moving object detection is one of the fundamental technologies necessary to realize autonomous driving. In this study, we propose the prediction of in-vehicle camera images by a Generative Adversarial Network (GAN). From the past images input to the system, it predicts the future images at the output. By predicting the motion of a moving object, it can predict the object's destination. The proposed model can predict the motion of moving objects such as cars, bicycles, and pedestrians.
Deep Learning, Image Processing, Convolutional Neural Network, GAN, DGAN
Jie Luo and Ziyang Zhou, Chinese Medical University, Hangzhou, China
We propose an improved analysis algorithm based on learning and searching methods in the field of Big Data to optimize TCM (Traditional Chinese Medicine) information management. We use TF-IDF weighting in document clustering to cluster the names of Chinese medical prescripts, construct a bag-of-words model, and combine universal hashing with perfect hashing, in order to establish a special key-value hash mapping between Chinese medical data and diseases.
TCM, Chinese Medical Massive Data, Big Data, Hashing, Index, Mapping.
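As a minimal sketch of the TF-IDF step (the paper's own tokenization and weighting variant are not specified; the toy token corpus below is hypothetical), computed from scratch over a bag-of-words model:

```python
import math
from collections import Counter

def tf_idf(corpus):
    """corpus: list of token lists. Returns one {term: weight} dict
    per document, using the classic tf * log(N / df) weighting."""
    n_docs = len(corpus)
    df = Counter()                        # document frequency per term
    for doc in corpus:
        df.update(set(doc))
    vectors = []
    for doc in corpus:
        tf = Counter(doc)                 # raw term frequency
        vectors.append({t: tf[t] * math.log(n_docs / df[t]) for t in tf})
    return vectors

# toy "prescript name" corpus (hypothetical tokens)
docs = [["ginseng", "tea"], ["ginseng", "root"], ["willow", "bark"]]
vecs = tf_idf(docs)
# "ginseng" appears in 2 of 3 docs, so it is weighted lower than "tea"
```

Terms shared across many prescript names are down-weighted, which is what makes the resulting vectors useful for clustering.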
Khdega A.Yosef Galala, Al Jufrah University, Waddan, Libya
The main aim of this study is to develop an advisory system for the early diagnosis of date palm diseases. The web-based expert system presented in this work has two aspects: an advisory system and an information system. It is a rule-based agricultural system and covers five fungal diseases occurring in the date palm. The knowledge base of the system consists of data integrated from a variety of date palm knowledge sources. The intelligent system was developed using PHP as the web programming language. The results of the diagnosis process showed the expert system running well in diagnosing date palm diseases. It can be concluded that this system can help farmers diagnose date palm diseases early, and that it works as an assistant tool for producing better quality date palm products.
Expert Systems, Date Palm Diseases, Agricultural Diagnosis, Knowledge Based System
Rim Gasmi(a), Makhlouf Aliouat(a) and Hamida Seba(b), (a)Ferhat Abbas University Setif 1, LRSD, Setif, Algeria and (b)University of Lyon, CNRS, Lyon 1, LIRIS, Lyon, France
In recent years, advanced technologies have made vehicle networks one of the most active research areas in networking. The Internet of Vehicles (IoV) plays an essential role in resolving several driving and traffic problems. To suitably support multimedia applications and efficiently update road information, it is important that the network provide a convenient Quality of Service (QoS). Therefore, an efficient IoV routing protocol should provide the best response time to the requesting vehicle without incurring excessive extra load in the network, as this may lead to harmful results, especially in emergency circumstances. Although routing protocols are becoming the most addressed issue in IoV, their impact remains limited if they do not take the expected QoS into account. Among the different routing protocol categories, namely proactive, reactive and hybrid, the Zone Routing Protocol (ZRP) is a hybrid routing protocol in which every node proactively updates routing information about its routing zone, while using a reactive method to get routes to destinations outside its routing zone. In this paper, we propose the Geographic Destination-aware Zone Routing Protocol (GDZRP), an enhancement of the Zone Routing Protocol that ensures QoS in IoV applications using a QoS function based on speed, final region and time period to find stable routes, which in turn reduces the response time and network overhead. We evaluate the performance of GDZRP with various appropriate performance metrics. Furthermore, GDZRP and ZRP are compared with respect to Packet Delivery Ratio, Network Overhead and End-to-End Delay.
IoV, Routing protocols, QoS, ZRP, GDZRP
Diganta Misra, KIIT, Bhubaneswar, India
With the recent spike in cases of malaria, especially in developing or war-stricken countries, the development of advanced techniques for monitoring malaria cases to stop epidemics has become a high priority. This paper is a comprehensive study of, and approach towards, the construction of an automated pipeline with image processing features and techniques for the classification and segmentation of malaria-infected microscopic cellular images.
Image Processing, Malaria, Image Classification & Biomedical Image Segmentation.
Opeoluwa Ore Akinsanya and Maria Papadaki, University of Plymouth, United Kingdom
This research investigates the effective assessment of cybersecurity maturity models for healthcare organizations actively using cloud computing. A healthcare cybersecurity maturity model designates a collection of capabilities expected in a healthcare organization and facilitates its ability to identify where its practices are weak or absent and where they are truly embedded. However, these assessment practices are not considered completely effective, because compliance to standards alone does not produce objective assessment outputs, and the performance measurements of individual IS components do not depict the overall security posture of a healthcare organization. They also do not take into account the effect of the characteristics of cloud computing on healthcare. In this paper, a literature review of maturity models utilized for cloud security assessment in healthcare is presented, and the need for, and an approach to, a cloud security maturity model for healthcare organizations is proposed. This review seeks to articulate the present lack of research in this area and to present relevant healthcare cloud-specific security concerns.
Healthcare, Cybersecurity, Maturity Model, Cloud Computing
1Durga Prasad Kondisetty, 2Dr. Mohammed Ali Hussain, 1Bharathiar University, Tamilnadu, India and 2KLEF, Guntur Dist., AP, India
The self-organizing map (SOM) is an unsupervised neural network (NN) learning technique that has frequently been used for the analysis and organization of large data files. In the same manner, fuzzy c-means (FCM) is an unsupervised methodology for segmenting images. Here, we introduce a novel approach to the subsequent segmentation of Magnetic Resonance (MR) images of the human brain into anatomical regions. This paper presents the segmentation of microarray brain images in an unsupervised methodology by combining the FCM and SOM methodologies.
Image Segmentation, FCM, SOM, Microarray, MRI.
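A minimal sketch of the standard fuzzy c-means membership update used in FCM segmentation (the paper's fuzzifier and cluster settings are not given; the 1-D points and centers below are illustrative): each point receives a membership in each cluster of u[i][k] = 1 / sum_j (d(x_k, c_i) / d(x_k, c_j))^(2/(m-1)).

```python
def fcm_memberships(points, centers, m=2.0):
    """Standard FCM membership update for 1-D data:
    u[i][k] = 1 / sum_j (d(x_k, c_i) / d(x_k, c_j)) ** (2/(m-1))."""
    def dist(a, b):
        return abs(a - b) or 1e-12        # guard: point exactly on a center
    u = []
    for c_i in centers:
        row = []
        for x in points:
            s = sum((dist(x, c_i) / dist(x, c_j)) ** (2 / (m - 1))
                    for c_j in centers)
            row.append(1.0 / s)
        u.append(row)
    return u

u = fcm_memberships(points=[0.0, 0.4, 1.0], centers=[0.0, 1.0])
# memberships of each point across the two clusters sum to 1
```

Unlike a hard assignment, the point at 0.4 keeps partial membership in both clusters, which is the property FCM contributes to the combined FCM/SOM segmentation.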
S.M. Samiul Salehin1, Md Rasel Miah2 and Md Saiful Islam3, 1,2Sylhet Engineering College, Sylhet, Bangladesh and 3Shahjalal University of Science & Technology, Sylhet, Bangladesh
The gradual proliferation of social media supplies a plethora of texts and has already drawn considerable interest. Sentiment Analysis (SA), or Opinion Mining, uses such data to extract useful information. Most research on SA has been carried out in English, but other languages demand attention too, because users also post in their native languages, such as Bengali. Working on Bengali social media posts seems crucial, because Bengali is the seventh most spoken language by population and the sixth most spoken by native speakers. Despite having such a mass of speakers, little work has been done on Bengali SA. This paper approaches automatically extracting the overall polarity of sentiments expressed in Bengali posts by Facebook users in five classes: Positive, Strong Positive, Negative, Strong Negative and Neutral. 3200 samples (sentences and paragraphs) were collected through Facebook's Graph API and later divided into two corpora containing 1600 pre-processed and 1600 unprocessed posts. Each post in both corpora is annotated with a polarity class (positive, strong positive, negative, strong negative or neutral) by a team of three independent annotators. We follow a supervised machine learning approach and a deep learning approach to this classification problem for both corpora. In the machine learning approach, a relative comparison of three well-known classification algorithms, namely Naive Bayes, Support Vector Machine and Logistic Regression, is done for both corpora. For each classifier, two different input features (unigram and bigram) are experimented with. In the deep learning approach, a popular Recurrent Neural Network, namely Long Short-Term Memory, is tested. SVM with a combination of unigram and bigram features proves to be the best for the pre-processed corpus, producing an accuracy of 86.7%. The deep learning approach yields poorer results than the machine learning approach for the pre-processed corpus, but proves to be the best fit for the unprocessed data, producing an accuracy of 72.86%.
sentiment analysis, text mining, feature extraction, supervised learning, machine learning, support vector machine, naive bayes, logistic regression, deep learning, recurrent neural network.
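The best-performing configuration above combines unigram and bigram features. A hedged sketch of that feature extraction (the paper's tokenizer and weighting are not specified; the English example sentence stands in for a Bengali post, and whitespace tokenization is an assumption):

```python
from collections import Counter

def unigram_bigram_features(tokens):
    """Count features combining unigrams and bigrams, the feature
    combination fed to the SVM in the best-performing setup."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    feats = {("uni", t): c for t, c in unigrams.items()}
    feats.update({("bi",) + b: c for b, c in bigrams.items()})
    return feats

feats = unigram_bigram_features("this movie is not good".split())
# the bigram ("not", "good") preserves the negation that a
# unigram-only representation would lose
```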
Mohammad Hassan Anjom AHoa, Department of Mathematics, Vali-e-Asr University of Rafsanjan, Rafsanjan, Iran
In this paper, we describe the strengths and weaknesses of each intelligence technique in order to form a suitable combination of these techniques that achieves maximum intelligence, together with more accurate mathematical structures. The new technique is a combination of the applied methods. In this new technique, for each of the previous techniques, such as the coherent method, actor analysis, etc., a matrix of elements and parameters is established, and the parameters are related to one another with the help of network analysis techniques. While using the weak assumptions employed in the analysis of competing hypotheses technique, the other matrices can also be used, and with the help of the network, the parameters that have the greatest relevance to the other parameters are selected as the preferred starting points for analysis.
competitor analysis technique, coherent effect analysis model, model of analysis of actors, intelligence network analysis technique, parameter matrix, and component.
Sadegh Hosseini1 and Mehdi Ahmadi Najafabadi2, 1Department of Mechanical Engineering, Varamin Pishva Branch, Islamic Azad University, Varamin, Iran and 2Department of Mechanical Engineering, Amirkabir University of Technology, Tehran, Iran
This paper presents the results of an Acoustic Emission (AE) method for estimating journal bearing defect size under different working conditions. An experimental test bed was built to investigate the relation between the registered AE signals and the defect in the test bearings. AE signal features in the time-frequency domain were extracted by the Continuous Wavelet Transform (CWT). Then, Particle Swarm Optimization (PSO) was implemented to identify the optimum coefficients of the CWT as the input of an Artificial Neural Network (ANN) classifier. The meta-parameters of the PSO and the architecture of the ANN were tuned using the Non-dominated Sorting Genetic Algorithm II (NSGA-II). Thus, the eight-class classification problem of defect size in journal bearings was solved.
Acoustic Emission, Particle Swarm Optimization, Artificial Neural Network, Non-dominated Sorting Genetic Algorithm, Defect Size, Journal Bearing.
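A minimal sketch of the PSO step (basic one-dimensional PSO minimizing a stand-in quadratic objective; the paper's actual CWT-coefficient objective and NSGA-II-tuned meta-parameters are not given, so the w, c1, c2 values here are conventional defaults):

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=100, seed=1,
                 w=0.7, c1=1.5, c2=1.5):
    """Basic particle swarm optimization over a 1-D interval:
    inertia w plus pulls toward personal and global bests."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                        # each particle's best position
    gbest = min(xs, key=f)               # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            vs[i] = (w * vs[i]
                     + c1 * rng.random() * (pbest[i] - xs[i])
                     + c2 * rng.random() * (gbest - xs[i]))
            xs[i] = min(hi, max(lo, xs[i] + vs[i]))   # clamp to bounds
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i]
    return gbest

best = pso_minimize(lambda x: (x - 3.0) ** 2, bounds=(-10.0, 10.0))
# converges near the minimizer x = 3
```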
Abdullah A. Al-Shaher, Department of Computer and Information Systems, College of Business Studies, Public Authority for Applied Education and Training, Kuwait
In this paper, we demonstrate how regression curves can be used to recognize 2D non-rigid handwritten shapes. Each shape is represented by a set of non-overlapping, uniformly distributed landmarks. The underlying models use 2nd-order polynomials to model the shapes within a training set. To estimate the regression models, we need to extract the coefficients that describe the variations for a shape class; hence, a least squares method is used to estimate such models. We then proceed by training these coefficients using the Expectation Maximization algorithm. Recognition is carried out by finding the least landmark displacement error with respect to the model curves. Handwritten isolated Arabic characters are used to evaluate our approach.
Shape Recognition, Arabic Handwritten Characters, Regression Curves, Expectation Maximization Algorithm.
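A minimal illustration of the least squares step: fitting a 2nd-order polynomial y = a + bx + cx^2 to landmark coordinates by solving the 3x3 normal equations directly. The landmarks below are synthetic points, not character data, and the EM training stage of the paper is not shown.

```python
def fit_quadratic(xs, ys):
    """Least-squares fit of y = a + b*x + c*x^2 via the normal
    equations (X^T X) w = X^T y, solved by Gaussian elimination."""
    rows = [[1.0, x, x * x] for x in xs]          # design matrix X
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)]
         for i in range(3)]                        # X^T X
    rhs = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    for col in range(3):                           # forward elimination
        piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]            # partial pivoting
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, 3):
            f = A[r][col] / A[col][col]
            for j in range(col, 3):
                A[r][j] -= f * A[col][j]
            rhs[r] -= f * rhs[col]
    w = [0.0, 0.0, 0.0]                            # back substitution
    for r in (2, 1, 0):
        w[r] = (rhs[r] - sum(A[r][j] * w[j]
                             for j in range(r + 1, 3))) / A[r][r]
    return w                                       # [a, b, c]

# landmarks sampled exactly from y = 1 + 2x + 3x^2
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 6.0, 17.0, 34.0]
a, b_, c = fit_quadratic(xs, ys)
```

On noise-free landmarks the fit recovers the generating coefficients; on real landmark sets it yields the least-squares estimate that the EM stage would then refine.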
Dr. Ahmad A. Al-Hajji, Fatimah M. AlSuhaibani and Nouf S. AlHarbi, Qassim University, Saudi Arabia
Artificial Intelligence (AI) is a field of computer science concerned with symbolic reasoning and problem solving. Expert systems (ESs) are one of the prominent research domains of AI. The online Knowledge-Based Expert System (KBES) for psychological disease diagnosis and classification is an online program system that helps psychology practitioners and doctors diagnose the condition of a patient efficiently and in a short time. The system is also very useful for patients who cannot go to a doctor because they cannot afford the cost, do not have a psychological clinic in their area, or are ashamed of discussing their situation with a doctor. The system consists of program code that makes a logical decision to classify the patient's problem. The user of the system enters the patient's symptoms through the user interface and the program executes. The program then links the symptoms to the pre-programmed psychological diseases, classifies the disease and recommends a treatment. The common types of treated psychiatric diseases are depression, anxiety disorder, obsessive-compulsive disorder, and hysteria.
Artificial Intelligence, Expert systems, Medical Diagnosis, Rules, Symbolic Reasoning
Khaled Elmenshawy, Elshourok Academy, Cairo, Egypt
English-Arabic machine translation systems have featured in machine translation projects in recent years, and many projects have been carried out to improve the quality of translation into and from Arabic. This research focuses on machine translation from the source language (English) to the target language (Arabic) using an English-Arabic electronic dictionary. The challenges of this research are the difficulty of delivering the appropriate meaning of the source language (SL) in the target language (TL), the different sentence structures of the two languages, word agreement, ordering problems, verbal forms and linguistic structures. The aim of this research was to design and build an automatic translation system from English to Arabic based on a dictionary of English roots, using a rule-based method. The proposed machine translation system uses a transfer strategy divided into three phases: analysis, transfer, and generation of sentences in the target language. The system was evaluated by selecting a set of English sentences covering all sentence structures as a first stage, and then selecting another set of long sentences containing more than one structure. All the results of the system were compared with the results of various translation websites on the Internet.
Machine translation, rule-based approach, Arabic language, English language, sentence structure, morphological analysis.
Raina Zakir, Middlesex University, Dubai, UAE
Cybersecurity in the field of robotics was not a priority until the recent dawn of cyber-attacks on various autonomous systems, marking the beginning of the security compromises in robotic systems that our general computing systems have been facing for a long time. Think about a surgical robot getting hacked during an operation, or a military robot being hijacked by opponents. Security and privacy in robotics are becoming a major issue, even in robots used for household and entertainment purposes. As a result, this area of research is now under constant development and progress. This review aims to interpret the existing trends and solutions in this field by examining the literature, to classify the domains of this relatively budding field, and in turn to identify the gaps to be filled by future research.
Cybersecurity, Robotics, Cyber-attacks, Privacy, Hack, Classify, Review
Li Hui, Tian Ya-dan and Zhou Ming-jie, School of Economics and Management, Xidian University, Xi’an 710126, Shaanxi Province, P.R. China
This article analyzes the evolution of research in the field of deep learning based on papers in the Web of Science core collection. We use SciMAT to detect the hidden topics of 1136 papers and visualize the evolution process in order to explore the pattern of the evolution. We also aim to find the distribution of topics across institutions. The results show that, over the past 11 years, deep learning research has followed two main topic evolution lines: Learning Approaches and Strategies, and Neural Networks. We then identify the important topics of the last four years: Boltzmann machines and neural networks. Based on the evolution pattern and the distribution of research topics across institutions in deep learning, we propose suggestions for the future development of deep learning. The methods and research results presented in this study could also help academics identify research topics in other fields and master the technological frontier, so as to further provide an information basis for scientific and technological decision-making.
Deep learning; Scientometric analysis; Topic evolution; SciMAT.
Era Desti Ramayani, Fadila Amelia Futri, Moch Mogi Ibrahim H and Gigih Forda Nama, Department of Engineering, Lampung University, Lampung, Indonesia
A university is an institution of higher education and research which awards academic degrees in various fields, providing undergraduate and postgraduate education. There are several admission paths for University of Lampung students, such as SBMPTN, SNMPTN, Mandiri, PMAP, Pararel or special achievements. Newly accepted students also come from various regions and various ages, and across these regions and ages the chosen faculties vary as well. On this basis, this paper analyzes the level of interest in the selected faculties across regions and ages. The data analyzed are the data of new University of Lampung students in 2018, using the Decision Tree method, with data processing done in Rapid Miner. Based on the results of the analysis, the faculty with the highest interest is the Faculty of K.I.P, and the largest quota of new students comes from the city of Bandar Lampung.
Data mining, Decision Tree, Rapid Miner
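Decision tree learners of the kind used here choose splits by information gain, i.e. the reduction in label entropy. A small sketch of that computation (the toy faculty labels below are hypothetical, not the 2018 admissions data):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a label list, the impurity measure a
    decision tree reduces when choosing a split."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(labels).values())

def information_gain(labels, groups):
    """Gain of a candidate split: parent entropy minus the
    size-weighted entropy of the child groups."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# toy faculty labels (hypothetical): a perfect split has gain 1 bit
parent = ["KIP", "KIP", "Eng", "Eng"]
gain = information_gain(parent, [["KIP", "KIP"], ["Eng", "Eng"]])
```

The attribute (e.g. region or age) whose split yields the highest gain becomes the next tree node.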
Areej Al-Hassan and Hmood Al-Dossari, King Saud University, Riyadh, Saudi Arabia
On social media platforms, hate speech can be a cause of “cyber conflict”, which can affect social life at both the individual level and the country level. Hateful and antagonistic content propagated via social networks has the potential to cause harm and suffering on an individual basis and to lead to social tension and disorder beyond cyberspace. However, social networks cannot control all the content that users post. For this reason, there is a demand for the automatic detection of hate speech. This demand arises particularly when the content is written in complex languages (e.g. Arabic). Arabic text is known for its challenges, its complexity and the scarcity of its resources. This paper presents a background on hate speech and the related detection approaches. In addition, recent contributions on hate speech and related anti-social behaviour topics are reviewed. Finally, challenges and recommendations for the Arabic hate speech detection problem are presented.
Text Mining, Social Networks, Hate Speech, Natural Language Processing, Arabic NLP