Biorobotics Laboratory BioRob

Project Database

This page contains the database of possible research projects for master's and bachelor's students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for these projects. To enroll in a project, please contact one of the assistants directly (in person at his/her office, by phone, or by email). Spontaneous project proposals are also welcome if they relate to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.

Search filter: only projects matching the keyword Machine learning are shown here.

Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics

Human-exoskeleton dynamics and control

735 – Hip exoskeleton to assist walking - multiple projects
Category:semester project, master project (full-time), bachelor semester project, internship
Keywords:Bio-inspiration, C, C++, Communication, Compliance, Control, Data Processing, Dynamics Model, Electronics, Experiments, Inverse Dynamics, Kinematics Model, Learning, Locomotion, Machine learning, Optimization, Programming, Python, Robotics, Treadmill
Type:30% theory, 35% hardware, 35% software
Responsible: (MED 3 1015, phone: 31153)
Description:Exoskeletons have experienced unprecedented growth in recent years, and hip-targeting active devices have demonstrated their potential for assisting walking. Portable exoskeletons are designed to provide assistive torques while offsetting their own added weight, with the overall goals of increasing endurance, reducing energy expenditure, and improving walking performance. Exoskeleton design involves the development of sensing, actuation, control, and the human-robot interface. In our lab, an active hip orthosis ("eWalk") has been prototyped and tested in recent years. Currently, multiple projects are available to address open research questions: Does the exoskeleton reduce effort during walking? How can we model human-exoskeleton interaction? How can we design effective controllers? How can we optimize the interfaces and the control? Which movements can we assist with exoskeletons? Addressing these challenges requires knowledge in biology, mechanics, electronics, physiology, informatics (programming, learning algorithms), and human-robot interaction. If you are interested in collaborating on one of these topics, please send an email with your CV, any previous experience relevant to the project, and what interests you most about this research topic (to be discussed during the interview).
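To illustrate the kind of control question mentioned above, here is a minimal sketch of an impedance-style assistive-torque law, a common starting point in the exoskeleton literature. The gains, reference angle, and spring-damper form are illustrative assumptions, not the eWalk controller:

```python
def assistive_hip_torque(hip_angle, hip_velocity, k=5.0, b=0.8, angle_ref=0.0):
    """Toy impedance-style assistance: a virtual spring-damper pulls the
    hip joint toward a reference angle.

    hip_angle, angle_ref in rad; hip_velocity in rad/s; output in N*m.
    """
    return -k * (hip_angle - angle_ref) - b * hip_velocity

# A flexed hip (positive angle) yields an extension (negative) torque.
print(assistive_hip_torque(0.4, 0.0))  # -2.0
```

A real controller would additionally need gait-phase estimation, torque limits, and safety handling, which are exactly the kinds of design choices these projects explore.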

Last edited: 19/04/2024

Mobile robotics

732 – Mobile furniture motion control using human body language
Category:semester project
Keywords:C++, Control, Machine learning, Python, Vision
Type:35% theory, 10% hardware, 55% software
Responsible: (phone: 37432)
Description:Furniture is evolving: from static objects in the home, it is becoming active and mobile. These new capabilities open novel interaction opportunities and raise questions about how furniture can communicate with users. Building on recent developments in mobile furniture at BioRob, this project will explore how such furniture can communicate with its user by adapting its motions to achieve defined communication goals. This work follows exploratory studies from the human-robot interaction field, which mostly use Wizard-of-Oz paradigms (a human actually controls the "robot"), and aims to add autonomy to these systems. It is a follow-up project based on existing systems. A human pose is detected as a 3D joint skeleton using a Kinect (RGB-Depth) camera and OpenPifPaf (a learning-based human pose detection algorithm). Human motions, i.e. sequences of human poses, can be categorized into different meanings based on current studies of human body language, and can then be classified by the provided visual perception system using either geometric rules or a learning-based motion recognition algorithm (for example, a spatio-temporal graph neural network). Once user commands are correctly identified, they can be sent to the mobile furniture robot via the Robot Operating System (ROS) for execution, so as to meet the user's needs in the assistive environment. Real-world experiments will also be needed to verify the functionality and performance of the system.
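As a sketch of the "geometric rules" option for classifying pose sequences, the toy classifier below maps a sequence of 3D skeletons to a furniture command. The joint names, coordinate convention (z = height), and the single raised-wrist rule are illustrative assumptions; real OpenPifPaf output uses a different format and would need adapting:

```python
def classify_gesture(pose_sequence):
    """Toy geometric-rule classifier: a wrist held above the shoulder for
    most of the sequence is read as a 'summon' command, otherwise 'idle'.

    Each pose is a dict mapping joint names to (x, y, z) positions in m.
    """
    raised = sum(
        1 for pose in pose_sequence
        if pose["right_wrist"][2] > pose["right_shoulder"][2]
    )
    return "summon" if raised > len(pose_sequence) / 2 else "idle"

# A short sequence where the wrist stays above the shoulder.
seq = [{"right_wrist": (0.3, 0.1, 1.6), "right_shoulder": (0.2, 0.0, 1.4)}] * 10
print(classify_gesture(seq))  # summon
```

In the actual pipeline, the recognized command label would then be published on a ROS topic for the furniture robot to execute; a learning-based recognizer (e.g. a spatio-temporal graph neural network) would replace this hand-written rule.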

Last edited: 23/06/2024

2 projects found.