**3. Common EEG patterns**

After connecting the device with the Emotiv app, the EmotivBCI application is used to train mental commands. After training, live mode is switched on to control a cube with imagery thoughts. Next, those live mental commands are extracted from the application so they can be integrated with the drone; this is achieved by following the Cortex API documentation, with the code written in Python for convenience. Those commands are then passed on to a DJI Tello drone by following the DJI Tello API documentation. Finally, a link between the drone and the Insight headset is achieved.
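The Cortex side of this pipeline reduces to a fixed sequence of JSON-RPC 2.0 messages sent over a WebSocket. The sketch below only builds those messages (it opens no connection); the method names follow the public Cortex API documentation, while the placeholder values (`<token>`, `<session>`, the `INSIGHT-XXXX` headset ID, and the `cortex_handshake` helper itself) are illustrative assumptions:

```python
import json

def rpc(method, params, request_id):
    """Wrap one Cortex API call as a JSON-RPC 2.0 message string."""
    return json.dumps({"jsonrpc": "2.0", "id": request_id,
                       "method": method, "params": params})

def cortex_handshake(client_id, client_secret, profile, headset="INSIGHT-XXXX"):
    """The message sequence from app approval to streaming mental commands.
    The token and session id actually come from the earlier replies;
    placeholders stand in for them here."""
    return [
        rpc("requestAccess", {"clientId": client_id,
                              "clientSecret": client_secret}, 1),
        rpc("authorize", {"clientId": client_id,
                          "clientSecret": client_secret}, 2),
        rpc("createSession", {"cortexToken": "<token>", "headset": headset,
                              "status": "open"}, 3),
        rpc("setupProfile", {"cortexToken": "<token>", "headset": headset,
                             "profile": profile, "status": "load"}, 4),
        # "com" is the mental-command stream
        rpc("subscribe", {"cortexToken": "<token>", "session": "<session>",
                          "streams": ["com"]}, 5),
    ]
```

In live use each message is sent over the WebSocket in turn, and the real token and session ID from the replies are substituted into the later calls.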

#### **3.1 Mental command training**

Giving a mental command in the EmotivBCI application results in the movement of the on-screen object in the desired direction. In the initial stage, numerous training sessions are needed before the device produces the desired output reliably. For example, during an object movement test, when the subject thinks of moving the cube to the left, the cube can be observed moving to the left, as shown in **Figures 5** and **6** below. Each of the neutral, left, lift, and drop mental commands was trained 10 times.

#### **3.2 Extraction of mental commands and assigning them to the drone**

The Python code was written in the Atom editor on a Dell Inspiron laptop. The json, websocket, ssl, time, win32, requests, pyautogui, socket, keyboard, and threading libraries are used in the code.

**Figure 5.** *The object movement test of the subject: live mode in the BCI app when the person is neutral.*

**Figure 6.** *The object movement test of the subject: live mode in the BCI app when the person thinks of moving the object left from neutral; the cube has moved left.*

**Figure 7.** *Procedure of interfacing EEG with a drone.*

## *Brain Computer Interface Drone DOI: http://dx.doi.org/10.5772/intechopen.97558*

Control of the drone from the computer is achieved with the DJI Tello drone protocol (**Figure 7**). The drone and the laptop must be connected to the same Wi-Fi network; the laptop joins it through a TP-Link TL-WN725N USB Wi-Fi adapter (an N150 wireless network adapter for PCs). The UDP protocol is used as the interface between the computer and the DJI Tello drone: a socket must be created explicitly and bound to a local port on the computer so that the Tello has an address to send its messages to, and a listener function prints the messages received from the Tello on the screen. The connection between the Insight headset and the laptop follows the Emotiv app protocol and can be established either through the Insight dongle or over Bluetooth. On the Insight side, an InsightHandler class is used to acquire the WebSocket connection. To obtain approval from the Emotiv app, a client ID and client secret are generated and the access request is then approved in the app.
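The UDP interface described above can be sketched in a few lines. Per the Tello SDK, the drone listens at 192.168.10.1:8889 and replies to whichever local port the sender has bound; the local port number here is an arbitrary choice:

```python
import socket
import threading

def make_command_socket(local_port=9000):
    """Create a UDP socket bound explicitly to a local port,
    so the Tello has an address to send its replies to."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", local_port))
    return sock

def send_command(sock, command, drone_addr=("192.168.10.1", 8889)):
    """Send one Tello SDK text command, e.g. 'command', 'takeoff', 'land'."""
    sock.sendto(command.encode("utf-8"), drone_addr)

def listen(sock, on_message=print):
    """Forward each reply from the drone (e.g. 'ok' or 'error') to on_message."""
    while True:
        data, _ = sock.recvfrom(1518)
        on_message(data.decode("utf-8", errors="ignore"))
```

In live use a daemon thread runs `listen(sock)` while the main thread first sends `"command"` to put the Tello into SDK mode and then issues the flight commands.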
After approval, authorisation generates a token; this token is used to create the session and load the trained profile from the Cortex app. The mental command data is then streamed to the terminal, and each command is mapped to a keyboard click. These clicks in turn act as the drone controller: thinking of a mental command drives the computer's keyboard, and the keyboard controls the drone.
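That command-to-keystroke mapping can be sketched as a small lookup table. The specific key bindings and Tello command strings below are illustrative assumptions, not the chapter's exact values; in live mode the `press_key` callback would be `pyautogui.press`:

```python
# Illustrative mapping: each trained mental command drives one keyboard key,
# and each key stands for one Tello SDK command.
COMMAND_TO_KEY = {
    "left": "a",
    "lift": "w",
    "drop": "s",
    "neutral": None,   # neutral means "do nothing"
}

KEY_TO_TELLO = {
    "a": "left 30",    # move left 30 cm
    "w": "up 30",
    "s": "down 30",
}

def handle_mental_command(action, press_key):
    """Translate one streamed Cortex action into a keypress, and return the
    Tello command that keypress stands for (None for neutral/unknown)."""
    key = COMMAND_TO_KEY.get(action)
    if key is None:
        return None
    press_key(key)              # in live mode: pyautogui.press(key)
    return KEY_TO_TELLO[key]
```

For example, `handle_mental_command("lift", pyautogui.press)` would press `w` and yield the `up 30` command for the drone.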
