
Thesis project proposals

==== ALL PROPOSALS BELOW ARE ALREADY TAKEN ====

3D local motion planning

In this project we will explore how to develop a system that does local motion planning for a drone in a 3D environment. It continues work started last year, with the aim of developing a real-world implementation of an obstacle avoidance system. Obstacle avoidance is critically important for drones that are intended to operate autonomously.
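As an illustration of what such a local planner could look like, here is a minimal Python sketch of one common family of approaches: sample candidate motion directions in 3D and score them by progress toward the goal and clearance from obstacles. All names, weights and the 0.5 m safety radius are illustrative assumptions, not part of the proposal.

```python
import numpy as np

def plan_step(pos, goal, obstacles, speed=1.0, n_samples=200, seed=0):
    """Sample candidate 3D motion directions and score them by progress
    toward the goal and clearance from the obstacles (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    dirs = rng.normal(size=(n_samples, 3))              # random directions
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)  # unit vectors
    candidates = pos + speed * dirs                      # position after one step
    progress = -np.linalg.norm(candidates - goal, axis=1)
    dists = np.linalg.norm(candidates[:, None, :] - obstacles[None, :, :], axis=2)
    clearance = dists.min(axis=1)                        # nearest-obstacle distance
    # Reject candidates inside the safety radius; trade off the rest
    score = np.where(clearance > 0.5, progress + 0.3 * clearance, -np.inf)
    return candidates[np.argmax(score)]
```

In a real system the candidates would be dynamically feasible trajectories and the obstacle distances would come from the onboard map, but the structure of the scoring step is the same.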

Required skills: Programming; experience with ROS is a huge plus
Resources: KTH CAS will supply a drone to work with.
Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.
Contact: Patric Jensfelt (patric@kth.se)

Real-world 3D exploration

This project will continue the work on 3D exploration that was started last year as an MSc thesis. That work resulted in a journal submission and opened up avenues for further research. The primary goal is to investigate the real-world performance of the system and how the method might have to be modified based on findings from such experiments.

Required skills: Programming; experience with ROS is a huge plus
Resources: KTH CAS will supply a drone to work with.
Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.
Contact: Patric Jensfelt (patric@kth.se)

Heterogeneous Leader-follower 3D Exploration with UAVs

Indoor exploration using UAVs is a trending research topic. Although a single robot can perform exploration, the number of sensors it has to carry requires a larger robot due to payload limitations. Therefore, this project aims to explore the possibility of having a heterogeneous team explore the environment: the leader carrying a 2D lidar and the follower an RGB-D camera. This would allow smaller robots and/or longer flight times.

Required skills: Programming; experience with ROS is a huge plus
Resources: KTH CAS will supply a drone to work with.
Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.
Contact: Daniel Duberg (dduberg@kth.se) or Fernando dos Santos (fdsb@kth.se)

Battery- and WiFi-aware 3D Exploration based on Temporal Logics

This project aims at enhancing the reasoning behind the environment exploration algorithm with temporal logics, so that the robot can decide whether it is better to keep exploring or to go to the nearest recharging station. Additionally, the same reasoning can be used to decide which parts of the environment are unreachable due to low or no WiFi connectivity.
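To make the decision problem concrete, here is a tiny Python sketch of the kind of invariant such a specification would enforce. The energy model and numbers are made-up assumptions, and a real temporal-logic solution would synthesize the policy from a formal specification rather than hand-code it like this.

```python
def decide(battery_wh, dist_to_station_m, energy_per_m_wh=0.05, reserve_wh=5.0):
    """Keep exploring only while there is enough energy left to reach the
    nearest recharging station with a safety reserve. A temporal-logic spec
    would encode this as an invariant roughly of the form
    G(battery >= cost_to_return + reserve)."""
    cost_to_return = dist_to_station_m * energy_per_m_wh
    return "explore" if battery_wh >= cost_to_return + reserve_wh else "recharge"
```

The WiFi-reachability question could be handled the same way, with a connectivity map in place of the energy model.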

Required skills: Programming; experience with ROS is a huge plus
Resources: KTH CAS will supply a drone to work with.
Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.
Contact: Daniel Duberg (dduberg@kth.se) or Fernando dos Santos (fdsb@kth.se)

3D autonomous exploration with a drone

In this project we will investigate how to design and develop a system for 3D autonomous exploration with a drone. There has been a lot of research on autonomous exploration in 2D environments, but to fully utilize the capabilities of a drone we would like to be able to explore a 3D environment. The goal of this project is to develop a method that commands the drone how to move in order to best explore and gather data about a previously unknown area, so that, for example, a map of the environment can be built. The system should be able to adapt depending on which sensors are used to gather data about the environment.
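One classical starting point for this kind of system is frontier-based exploration, extended to 3D: repeatedly fly toward free space that borders unexplored space. The Python/NumPy sketch below shows the core idea on an occupancy grid; the names and the -1/0/1 cell encoding (unknown/free/occupied) are illustrative assumptions.

```python
import numpy as np

def find_frontiers(grid):
    """Frontier cells in a 3D occupancy grid: free cells (0) adjacent to
    at least one unknown cell (-1). Occupied cells are 1."""
    frontiers = []
    nx, ny, nz = grid.shape
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                if grid[x, y, z] != 0:
                    continue
                for dx, dy, dz in ((1,0,0),(-1,0,0),(0,1,0),(0,-1,0),(0,0,1),(0,0,-1)):
                    a, b, c = x + dx, y + dy, z + dz
                    if 0 <= a < nx and 0 <= b < ny and 0 <= c < nz and grid[a, b, c] == -1:
                        frontiers.append((x, y, z))
                        break
    return frontiers

def next_goal(grid, pos):
    """Pick the frontier closest to the drone as the next exploration goal."""
    frontiers = find_frontiers(grid)
    if not frontiers:
        return None  # nothing left to explore
    return min(frontiers, key=lambda c: np.linalg.norm(np.array(c) - np.array(pos)))
```

A real system would rank frontiers by expected information gain and travel cost rather than pure distance, and would use an efficient map structure such as an octree instead of a dense grid.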

Required skills: Programming; experience with ROS is a huge plus

Resources: KTH CAS will supply a drone to work with.

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Path planning for drones

In this project we will implement a 3D path planning system for a drone. Given a goal and a partially complete map of the environment (e.g. from a SLAM system), a path that takes the drone from its current position to the goal should be generated. The system developed in this project should ensure that the generated path is followed, and it should be able to adjust the path in real time as more information from the sensors is gathered. It is appreciated if the generated path is "smoothed", meaning that the drone moves in a continuous motion. (Note: it can be scary for humans if the drone moves fast, stops completely, turns a bit, and then moves fast in the new direction again.)

There are a number of challenges in developing a path planning system for a drone that do not exist for wheeled robots, such as:

  • A drone is never completely still in the air.
  • A drone is often very fragile so the smallest hit can be fatal.
  • When moving close to walls the drone can be sucked in towards the wall because of aerodynamic effects.

Therefore it might be a good idea for the developed system to minimize the time the drone spends flying close to walls, so that the drone only does so when needed (e.g. when the goal is close to a wall).
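As a baseline for the planning part, grid-based search such as A* extends directly to 3D. The sketch below is a minimal Python version on a 6-connected occupancy grid; it is illustrative only and leaves out smoothing and any wall-distance penalty, which could be added as extra terms in the edge cost.

```python
import heapq, itertools

def astar_3d(grid, start, goal):
    """A* on a 3D occupancy grid (0 = free, 1 = occupied), 6-connected.
    Returns the list of cells from start to goal, or None if unreachable."""
    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])
    h = lambda c: abs(c[0]-goal[0]) + abs(c[1]-goal[1]) + abs(c[2]-goal[2])
    tie = itertools.count()  # tiebreaker so the heap never compares parents
    open_set = [(h(start), next(tie), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_set:
        _, _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:
            continue  # already expanded with a better cost
        came_from[cur] = parent
        if cur == goal:  # reconstruct the path by walking parents
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        x, y, z = cur
        for dx, dy, dz in ((1,0,0),(-1,0,0),(0,1,0),(0,-1,0),(0,0,1),(0,0,-1)):
            n = (x + dx, y + dy, z + dz)
            if 0 <= n[0] < nx and 0 <= n[1] < ny and 0 <= n[2] < nz \
               and grid[n[0]][n[1]][n[2]] == 0 and g + 1 < g_cost.get(n, 1e18):
                g_cost[n] = g + 1
                heapq.heappush(open_set, (g + 1 + h(n), next(tie), g + 1, n, cur))
    return None
```

The resulting cell sequence would then be shortened and smoothed (e.g. with splines) before being handed to the controller, so that the drone moves in the continuous motion described above.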

Required skills: Programming; experience with ROS is a huge plus

Resources: KTH CAS will supply a drone to work with.

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

3D obstacle avoidance

In this project we will explore how to develop a system that does obstacle avoidance for a drone in a 3D environment. A lot of work has been done over the years on obstacle avoidance, but most of it only considers a 2D environment. Since a drone can move in three dimensions, it is of interest to develop a system that ensures safe flight in all three. Obstacle avoidance is critically important for drones that are intended to work autonomously and around humans, since drones are often fragile and have fast-moving propellers that can hurt people. It should be possible to use the developed system both in tele-operation tasks, where a human is controlling the drone, and for fully autonomous flight.
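A classical baseline here is the artificial potential field method, which carries over directly to 3D: the goal attracts the drone while nearby obstacles repel it. A minimal Python sketch, with made-up gains and an assumed 2 m influence radius:

```python
import numpy as np

def avoidance_velocity(pos, goal, obstacles, influence=2.0, k_rep=1.0, k_att=1.0):
    """Artificial potential field in 3D: attract toward the goal,
    repel from obstacles within an influence radius (illustrative gains)."""
    v = k_att * (goal - pos)  # attractive term
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 1e-6 < d < influence:
            # Repulsion grows sharply as the drone nears the obstacle
            v += k_rep * (1.0 / d - 1.0 / influence) / d**2 * (diff / d)
    n = np.linalg.norm(v)
    return v / n if n > 1.0 else v  # cap commanded speed at 1 m/s
```

Potential fields are prone to local minima, which is one reason the project would likely need something more sophisticated, but they illustrate the reactive core of obstacle avoidance well.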

Required skills: Programming; experience with ROS is a huge plus

Resources: KTH CAS will supply a drone to work with.

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Autonomous recharging for indoor drones

In this project we will investigate the design and implementation of a system for autonomous recharging of drones. Drones typically have a flight time of 3-10 min, which means that for any larger task the drone will have to recharge.

Required skills: Programming; experience with ROS is a huge plus

External collaboration: The project will be conducted in collaboration with the company Loligo which will be responsible for the recharging solution and hardware.

Resources: KTH CAS will supply a drone to work with and Loligo will supply the recharging hardware.

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Ego motion estimation on a drone using visual inertial sensors

One of the key components in a drone system is the ability to estimate its position. In this project we investigate methods for ego motion estimation, with the aim of generating a position estimate that is robust and accurate over periods on the order of tens of seconds. We want a solution that is lightweight both in terms of physical weight and computational cost. The aim is for this algorithm to become part of the basic infrastructure for the drones used at KTH CAS. There are examples of this in the literature. The first part of the thesis would be to identify the most promising solution and then implement and evaluate it on a drone at KTH CAS. The hope is that the algorithm will also support ego motion estimation for high-speed motion, but given the current projects at KTH CAS, being lightweight and robust is estimated to be more important than handling high speeds.
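The kind of fusion involved can be illustrated in one dimension with a complementary-filter sketch: integrate the IMU at high rate, and blend in visual position fixes to bound the drift. This is a toy model (real visual-inertial odometry also estimates orientation and sensor biases); all names and gains are illustrative assumptions.

```python
import numpy as np

def fuse(imu_accels, visual_positions, dt=0.01, alpha=0.98):
    """1D complementary-filter sketch: propagate position by integrating
    IMU acceleration, then pull the estimate toward each visual fix.
    `alpha` close to 1 trusts the IMU; (1 - alpha) weights the fix."""
    pos, vel = 0.0, 0.0
    out = []
    for a, z in zip(imu_accels, visual_positions):
        vel += a * dt                          # IMU propagation
        pred = pos + vel * dt
        pos = alpha * pred + (1 - alpha) * z   # correct with visual fix
        out.append(pos)
    return out
```

Even with a biased accelerometer, the periodic corrections keep the position error bounded, whereas pure integration drifts quadratically; this is the basic reason visual and inertial sensing complement each other.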

Literature

Required skills: ROS, programming, DD2425

Resources: KTH CAS will supply a drone to work with

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Deep learning for optical flow estimation for a drone

In this project we want to investigate the use of deep convolutional networks for calculating optical flow. Optical flow can be used for obstacle detection, motion control and stereo vision, and is as such an essential component of a vision-based drone. In the project we would start by looking at the state-of-the-art methods in the area, run tests with data from a drone on the task of motion estimation, and evaluate them in terms of accuracy, robustness and computational cost. We would then see how the method can be scaled down to run with the resources available on a drone, and whether a trade-off can be found in terms of the aforementioned characteristics.
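A useful classical baseline to compare learned flow against is Lucas-Kanade, which solves a small least-squares system built from image gradients. A minimal single-pixel NumPy sketch (illustrative: single scale, no iteration, interior pixels only):

```python
import numpy as np

def lucas_kanade(img1, img2, x, y, win=2):
    """Classical Lucas-Kanade flow at one pixel: solve the least-squares
    system A v = -b built from spatial (Ix, Iy) and temporal (It)
    gradients over a small window around (x, y)."""
    Ix = (np.roll(img1, -1, axis=1) - np.roll(img1, 1, axis=1)) / 2.0
    Iy = (np.roll(img1, -1, axis=0) - np.roll(img1, 1, axis=0)) / 2.0
    It = img2 - img1
    sl = (slice(y - win, y + win + 1), slice(x - win, x + win + 1))
    A = np.stack([Ix[sl].ravel(), Iy[sl].ravel()], axis=1)
    b = It[sl].ravel()
    v, *_ = np.linalg.lstsq(A, -b, rcond=None)
    return v  # (vx, vy) in pixels per frame
```

A learned method would be evaluated against exactly this kind of baseline on the accuracy/robustness/cost axes mentioned above.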

Literature

Required skills: Deep learning, computer vision, programming.

Resources: KTH CAS will supply data from a drone for testing. If time and results permit tests will also be performed on a real drone.

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Safe navigation of a teleoperated drone

Current rules for operating drones dictate that you need to have line of sight to the drone. However, for many of the applications that we foresee for drones this will not be the case, for example when drones are used for search and rescue. In such applications the operator will not be able to see the drone, and decisions about how to control it have to be based on the sensor data gathered by the robot. In this project we start from the assumption that the operator has a video feed from the drone, and the task is to be able to move safely in the environment. This means that the drone should not run into objects, and the interface should provide intuitive feedback to assist the operator in controlling the motion.

At the lowest level the project is about obstacle avoidance, but we foresee that it would also include considerations of how to present the operator with information about the environment and the available controls, and how to actually do the control. The system should be usable on a drone with a minimal set of sensors, so one question to ask is what the minimum set of sensors is that can guarantee safety. Can higher safety and lower sensor requirements be achieved by limiting the allowed types of motion? If so, how? The expected output of this thesis is a method that can be used on a drone that has only local positioning (i.e. no global map) and limited sensing, and which can be scaled up when more sensors and processed data (such as a map) are available.

The video below shows a user with VR goggles trying to control a drone to move out through a door. He spends almost 40 s getting to the door but never manages to go through it. Based on the results of this thesis, that task should be simple.
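The lowest-level safety behaviour described above can be sketched as a simple velocity filter between the operator and the drone: scale the commanded velocity by the distance to the nearest obstacle. The thresholds below are illustrative assumptions.

```python
import numpy as np

def safety_filter(cmd_vel, ranges, min_dist=0.5, slow_dist=2.0):
    """Scale the operator's velocity command by the distance to the
    nearest obstacle: full speed when far, linear slowdown when close,
    full stop inside the safety distance. `ranges` would come from
    e.g. depth readings along the commanded direction."""
    d = float(np.min(ranges))
    if d <= min_dist:
        return np.zeros_like(cmd_vel)   # too close: stop
    if d >= slow_dist:
        return cmd_vel                  # far away: pass through unchanged
    scale = (d - min_dist) / (slow_dist - min_dist)
    return cmd_vel * scale
```

A key interface question in the project is how to make this intervention transparent to the operator, so that a slowdown reads as "obstacle nearby" rather than as lag.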

Required skills: ROS, programming, DD2425

Resources: KTH CAS will supply a drone to work with

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Autonomous exploration/exploitation and data gathering with a drone

The scenario here is a drone that is sent into an environment to gather data. The environment is either completely unknown, in which case the drone needs to explore, or already known, in which case the drone can use the prior information to decide how to move. In the former setting, the drone should focus on how to effectively explore the unknown environment. In the latter setting, once an initial map has been built or obtained in some other way, the drone should consider how to best exploit the environment in detail. In both scenarios, moving safely in constrained spaces has the highest priority. In that way the project is related to the project "Safe navigation of a teleoperated drone", with the difference that here the drone has to make the decisions autonomously.

Required skills: ROS, programming, DD2425

Resources: KTH CAS will supply a drone to work with

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Semantic labeling and object recognition from a drone

In this project the aim is to investigate methods for semantically labelling indoor scenes using a drone and for recognising objects in those scenes. The scenario is a drone that moves around in an indoor environment in need of renovation. The algorithms developed in this thesis aim to identify the type of carpet/floor on the ground and what type/model of appliances (fridge, stove, oven, etc.) are in an apartment, so as to be able to assess the level of renovation needed. The assumption is that the drone moves through the environment and collects/stores the data; the processing of the data can be done offline. Insight into how to fly to collect good data for this processing is also an expected outcome of the project.

Required skills: Computer vision, machine learning, programming

Resources: KTH CAS will supply a drone to work with

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Control of a drone with faulty motors

In this project we study the control of a drone when one or more motors have failed. Such failures would typically result in the drone crashing. However, it has been shown that the drone can stay in the air even when such failures occur, and that the platform can detect such failures automatically. Students are also encouraged to consider more extreme scenarios, e.g. where the drone is heavily damaged and has to land in a certain area; in that setting basic obstacle avoidance is required while trying to make the drone land as smoothly as possible. This project is about investigating this problem, with the aim of coming up with an algorithm that allows a drone to handle a failure gracefully.
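To give a feel for the problem, here is a Python sketch of a standard quadrotor mixer together with a naive failure detector that flags a motor whose measured output deviates from its command. The sign convention and threshold are illustrative assumptions; actual fault-tolerant control (e.g. controlled spinning flight on three motors) is far more involved.

```python
import numpy as np

# Mixer for a quadrotor in X configuration: map desired total thrust and
# body torques (roll, pitch, yaw) to four motor thrusts. The sign pattern
# depends on the motor layout convention and is illustrative here.
MIX = np.array([
    [1, -1, -1, -1],
    [1,  1,  1, -1],
    [1,  1, -1,  1],
    [1, -1,  1,  1],
], dtype=float)

def mix(thrust, roll, pitch, yaw):
    """Motor thrusts for the desired total thrust and body torques."""
    return MIX @ np.array([thrust, roll, pitch, yaw]) / 4.0

def detect_failure(commanded, measured, tol=0.2):
    """Return the index of a motor whose output deviates from its command
    by more than `tol`, or None if all motors look healthy."""
    err = np.abs(np.asarray(commanded, float) - np.asarray(measured, float))
    worst = int(np.argmax(err))
    return worst if err[worst] > tol else None
```

Note that each torque column of the mixer sums to zero, so pure torque commands leave the total thrust unchanged; a failed motor breaks exactly this balance, which is what the recovery controller must compensate for.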

The video below illustrates the idea well.

Literature

Required skills: Solid knowledge of control theory and systems modelling, programming

Resources: KTH CAS will supply a drone to work with

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

UWB positioning of drones

In this project we study ultra-wideband (UWB) radio as a means for positioning a drone. Examples in the literature show that it should be possible to use UWB combined with information from an IMU to position the drone well enough to control its flight. In this project we will study the use of different setups of the UWB system to generate, for example, ToA or TDoA measurements. The study should consider i) scalability (can it be used for many drones or just one?), ii) update rate (how often can we get data?), iii) accuracy (how accurate are the measurements?), and iv) reliability (how reliable is the signal that we get? are there many outliers?). If and in what situations an IMU is needed should also be investigated. The expected output of the project is one or several algorithms for positioning using UWB.
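With ToA measurements, the basic positioning step is multilateration: the squared range equations become linear in the position once one equation is subtracted from the others. A minimal Python/NumPy sketch (noise-free and illustrative):

```python
import numpy as np

def multilaterate(anchors, ranges):
    """Position from ToA ranges to fixed UWB anchors. Expanding
    ||p - a_i||^2 = r_i^2 and subtracting the first equation from the
    rest gives a linear least-squares system in p."""
    anchors = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], r[0]
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - r[1:]**2) + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With TDoA the same linearization applies to range differences, and in practice the estimate would feed a Kalman filter together with the IMU to handle noise and outliers.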

Literature

Required skills: Programming, Kalman filtering and similar, control theory

External collaboration: The project will be conducted in collaboration with the company Loligo which will be responsible for the UWB hardware.

Resources: KTH CAS will supply a drone to work with and Loligo will provide the UWB hardware

Where to work: It is foreseen that the majority of the time will be spent at KTH CAS.

Administrator Patric Jensfelt created the page 8 November 2016

Administrator Patric Jensfelt changed the permissions 27 October 2017

It can thus be read by everyone and changed by administrators.