Biorobotics Laboratory BioRob

Project Database

This page contains the database of possible research projects for master and bachelor students in the Biorobotics Laboratory (BioRob). Visiting students are also welcome to join BioRob, but note that no funding is offered for those projects. To enroll in a project, please contact one of the assistants directly (in their office, by phone, or by email). Spontaneous project proposals are also welcome if they are related to the research topics of BioRob; see the BioRob Research pages and the results of previous student projects.

Amphibious robotics
Computational Neuroscience
Dynamical systems
Human-exoskeleton dynamics and control
Humanoid robotics
Miscellaneous
Mobile robotics
Modular robotics
Neuro-muscular modelling
Quadruped robotics


Amphibious robotics

741 – Development of a series elastic actuator with torque/force control
Category:semester project, master project (full-time)
Keywords:Compliance, Control, Embedded Systems, Prototyping, Robotics
Type:20% theory, 60% hardware, 20% software
Responsible: (MED 1 1626, phone: 38676)
Description:This project aims to continue the development of a series elastic actuator (SEA) for a salamander robot. Despite the availability of various off-the-shelf servo motors, it is difficult to find one that provides accurate torque/force control to validate advanced control methods involving musculoskeletal models, delivers large torque output, and remains compact in size. SEAs are promising candidates for satisfying these requirements; see this paper for an example: http://biorobotics.ri.cmu.edu/papers/paperUploads/DSCC2013-3875.pdf A preliminary design of the geared motor that will drive the elastic component has been completed. This project will mainly focus on the design and manufacturing of the elastic component, the programming of the electronics, and the design of the feedback controller. Multiple iterations of testing and improvement will be needed, so the student is expected to have good time-management skills. If there is sufficient time, the following topics can be explored: (1) Modify the design to test hypotheses about multiarticular muscles. (2) Integrate the motor into a salamander robot and test various scientific hypotheses. (3) Waterproof the motor module for amphibious applications. Students with a solid background in mechanical design and control theory are preferred. Interested students should send a CV, transcripts, materials that demonstrate project experience (videos, slides, reports, etc.) if available, and several potential time slots for a quick meeting to qiyuan.fu@epfl.ch.
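
As an illustration of the kind of feedback controller involved, below is a minimal Python sketch of an SEA torque loop, assuming a linear spring whose deflection is measured by encoders on the motor and output sides and a motor driver that accepts a current command; all names, gains, and constants (K_S, KT, GEAR) are placeholders, not the actual design.

```python
# Minimal sketch of an SEA torque loop (illustrative only, not the final design).
# Assumptions: linear spring of stiffness K_S whose deflection is measured by
# encoders on the motor side and the output side; the motor driver accepts a
# current command. All constants are placeholders to be identified/tuned.

K_S = 2.5            # spring stiffness [Nm/rad] (placeholder)
KT = 0.05            # motor torque constant [Nm/A] (placeholder)
GEAR = 100.0         # gearbox reduction ratio (placeholder)
KP, KI = 8.0, 40.0   # PI gains on the torque error (to be tuned)
DT = 0.001           # control period [s] (1 kHz loop)

integral = 0.0       # integrator state of the PI controller

def torque_control_step(tau_desired, theta_motor, theta_output):
    """One control iteration: estimate the output torque from the spring
    deflection (Hooke's law) and return a motor current command."""
    global integral
    tau_measured = K_S * (theta_motor / GEAR - theta_output)
    error = tau_desired - tau_measured
    integral += error * DT
    tau_cmd = tau_desired + KP * error + KI * integral   # feedforward + PI
    return tau_cmd / (KT * GEAR), tau_measured           # current command [A]
```

On the real actuator this loop would run in the motor-controller firmware (typically in C), but the structure would be the same.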

Last edited: 24/06/2024
736 – Firmware development for a sensorized Pleurobot
Category:semester project, master project (full-time)
Keywords:C, C++, Communication, Control, Embedded Systems, Firmware, Linux, Programming, sensor
Type:5% theory, 10% hardware, 85% software
Responsible: (MED 1 1626, phone: 38676)
Description:In this project, the student is expected to continue developing the existing firmware for high-performance low-level control of the new Pleurobot (our amphibious legged robot modeling Pleurodeles waltl) and its multiple sensors. The major objectives include: (1) Improve the sampling speed and robustness of the microcontrollers that collect data from multiple sensors. (2) Increase the bandwidth and reduce the latency of the communication between the onboard computer and the multiple microcontrollers. (3) (For full-time students) Develop low-latency wireless communication between the onboard computer and the user's laptop for remote control. The student is expected to be familiar with (1) communication protocols including SPI, UART, and CAN, and (2) programming of embedded systems in C/C++. Knowledge of signal processing, wireless network protocols, and/or GUI development is a bonus. Interested students should send their transcript, CV, and a description of their past project experience to qiyuan.fu@epfl.ch. A student who can work full-time over the summer or during the autumn semester is preferred.
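
As a rough illustration of objective (2), the sketch below shows one possible onboard-computer side of the sensor link in Python, assuming a simple framed packet format over a serial/UART bridge (pyserial); the header bytes, payload layout, and port name are invented for this example, and the actual firmware work will be in C/C++ over SPI/UART/CAN.

```python
# Sketch of the onboard-computer side of the sensor link (illustrative only).
# Assumption: each microcontroller streams fixed-size framed packets over a
# serial/UART bridge; the header bytes, payload layout, and port name are
# invented for this example.
import struct
import serial  # pyserial

FRAME_HEADER = b"\xAA\x55"
PAYLOAD_FMT = "<I6h"                    # timestamp (uint32) + 6 channels (int16)
PAYLOAD_SIZE = struct.calcsize(PAYLOAD_FMT)

def read_frames(port="/dev/ttyACM0", baud=921600):
    """Yield decoded sensor frames, resynchronizing on the 2-byte header."""
    with serial.Serial(port, baud, timeout=0.01) as ser:
        buf = bytearray()
        while True:
            buf += ser.read(256)
            while True:
                idx = buf.find(FRAME_HEADER)
                if idx < 0:
                    del buf[:-1]        # drop garbage, keep last byte (split header)
                    break
                if len(buf) < idx + 2 + PAYLOAD_SIZE:
                    break               # wait for the rest of the frame
                payload = bytes(buf[idx + 2:idx + 2 + PAYLOAD_SIZE])
                del buf[:idx + 2 + PAYLOAD_SIZE]
                yield struct.unpack(PAYLOAD_FMT, payload)
```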

Last edited: 16/05/2024

Human-exoskeleton dynamics and control

735 – Hip exoskeleton to assist walking - multiple projects
Category:semester project, master project (full-time), bachelor semester project, internship
Keywords:Bio-inspiration, C, C++, Communication, Compliance, Control, Data Processing, Dynamics Model, Electronics, Experiments, Inverse Dynamics, Kinematics Model, Learning, Locomotion, Machine learning, Optimization, Programming, Python, Robotics, Treadmill
Type:30% theory, 35% hardware, 35% software
Responsible: (MED 3 1015, phone: 31153)
Description:Exoskeletons have experienced unprecedented growth in recent years, and hip-targeting active devices have demonstrated their potential in assisting walking activities. Portable exoskeletons are designed to provide assistive torques while offsetting their added weight, with the overall goals of increasing endurance, reducing energy expenditure, and increasing performance during walking. The design of exoskeletons involves the development of the sensing, the actuation, the control, and the human-robot interface. In our lab, an active hip orthosis (“eWalk”) has been prototyped and tested in recent years. Currently, multiple projects are available to address open research questions. Does the exoskeleton reduce the effort of walking? How can we model human-exoskeleton interaction? How can we design effective controllers? How can we optimize the interfaces and the control? Which movements can we assist with exoskeletons? To address these challenges, the field requires knowledge in biology, mechanics, electronics, physiology, informatics (programming, learning algorithms), and human-robot interaction. If you are interested in working on one of these topics, please send an email to giulia.ramella@epfl.ch with your CV, previous experience that could be relevant to the project, and what interests you the most about this research topic (to be discussed during the interview).
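
To give a concrete flavor of one of the control questions, the sketch below implements a simple gait-phase-based hip assistance profile in Python; it is not the eWalk controller, and the peak torque, peak phase, and pulse width are placeholders for illustration only.

```python
# Illustrative sketch of a phase-based hip assistance profile (not the eWalk
# controller). Assumptions: the gait phase is estimated from the time since
# the last detected gait event, and the assistive torque is a half-sine pulse.
# Peak torque, peak phase, and pulse width are placeholders.
import math

PEAK_TORQUE = 6.0    # peak assistive torque [Nm] (placeholder)
PEAK_PHASE = 0.15    # fraction of the gait cycle where the peak occurs
PULSE_WIDTH = 0.25   # pulse duration as a fraction of the gait cycle

def gait_phase(t, t_last_event, cycle_duration):
    """Phase in [0, 1): time since the last gait event, normalized by the
    current estimate of the gait cycle duration."""
    return ((t - t_last_event) / cycle_duration) % 1.0

def assistive_torque(phase):
    """Half-sine torque pulse centered at PEAK_PHASE, zero elsewhere."""
    d = (phase - PEAK_PHASE) / PULSE_WIDTH
    return PEAK_TORQUE * math.cos(math.pi * d) if -0.5 <= d <= 0.5 else 0.0
```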

Last edited: 19/04/2024

Miscellaneous

742 – Create synthetic salamander dataset with domain randomization and unsupervised generative attentional networks
Category:semester project, master project (full-time)
Keywords:3D, C++, Computer Science, Data Processing, Machine learning, Programming, Vision
Type:20% theory, 80% software
Responsible: (MED 1 1611, phone: 36620)
Description:

Powerful deep-learning-based tracking methods for animal behavior require large-scale curated and annotated data. Several recent papers [1,2] have shown that this data requirement can be alleviated by rendering animated synthetic animals such as mice and ants.

In this project, the student will work on an existing biomechanical model of the salamander Pleurodeles waltl to create a synthetic dataset for markerless keypoint tracking tasks. The dataset would help improve the performance of a salamander tracking network, which would ultimately provide invaluable kinematics data for designing muscle models and neural controllers and for validating neuroscience hypotheses.

For a PdM, this project involves:

  • Improve the realism of the current salamander model in Blender and add diversity with procedurally generated noise and domain randomization.
  • Generate a synthetic image dataset for markerless tracking tasks.
  • Train an image domain translator (e.g. U-GAT-IT [3]) to increase the dataset fidelity and reduce the reality gap.
  • Bonus: evaluate the usefulness of the dataset with a markerless tracking network (e.g. DLC [4]).

For a semester project, the work packages will be reduced and tailored according to the student's skills and interests.

The student is expected to have good programming skills and previous experience/knowledge in deep learning. Knowledge in 3D modeling and computer graphics is a plus but not required. If interested, please send an email to Chuanfang Ning with your motivation, CV, transcripts and most relevant experience.
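
As a hint of what the first work package could look like, here is a minimal domain-randomization sketch using Blender's Python API (bpy); the object names ("Camera", "Light", "World"), parameter ranges, and output paths are assumptions for this example, not the existing salamander scene.

```python
# Minimal domain-randomization sketch with Blender's Python API (bpy),
# illustrative only. Assumptions: the scene already contains the salamander
# model, a camera and a light named "Camera" and "Light", and the default
# node-based world; parameter ranges and paths are placeholders.
import random
import bpy

scene = bpy.context.scene
cam = bpy.data.objects["Camera"]
light = bpy.data.objects["Light"]
background = bpy.data.worlds["World"].node_tree.nodes["Background"]

for i in range(1000):
    # Randomize illumination and viewpoint for every rendered frame.
    light.data.energy = random.uniform(200.0, 2000.0)
    cam.location = (random.uniform(-0.5, 0.5),
                    random.uniform(-1.5, -0.8),
                    random.uniform(0.3, 0.8))
    # Randomize the background color as a crude form of appearance noise.
    background.inputs[0].default_value = (random.random(), random.random(),
                                          random.random(), 1.0)
    scene.render.filepath = f"//dataset/img_{i:05d}.png"
    bpy.ops.render.render(write_still=True)
    # Keypoint labels would be exported here by projecting the armature bones
    # into image coordinates (e.g. bpy_extras.object_utils.world_to_camera_view).
```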

[1] Bolaños, Luis A., et al. "A three-dimensional virtual mouse generates synthetic training data for behavioral analysis."

[2] Plum, Fabian, et al. "replicAnt: a pipeline for generating annotated images of animals in complex environments using Unreal Engine." 

[3] Kim, Junho, et al. "U-gat-it: Unsupervised generative attentional networks with adaptive layer-instance normalization for image-to-image translation." 

[4] Mathis, Alexander, et al. "DeepLabCut: markerless pose estimation of user-defined body parts with deep learning." 



Last edited: 27/06/2024
737 – Development of a treadmill with closed-loop control of speed for recording optical and X-ray videos
Category:semester project, master project (full-time)
Keywords:Control, Electronics, Embedded Systems, Experiments, Firmware, Image Processing, Mechanical Construction, Motion Capture, Prototyping, Treadmill, Vision
Type:60% hardware, 40% software
Responsibles: (MED 1 1611, phone: 36620)
(MED 1 1626, phone: 38676)
Description:When recording animal behaviors using optical or X-ray videos, there is a tradeoff between having a large field of view and having a high resolution of the animal body. This limits the ability to obtain animal kinematics over long durations and with high accuracy at the same time. One solution is to let the animal run on a treadmill so that it stays inside the field of view. However, the animal often varies its speed during movement, and the radio-opaque components of common treadmills make it difficult to place X-ray cameras. In this project, the student will develop a treadmill to be used with optical and X-ray tracking setups. The treadmill is expected to have the following features: (1) The major components should be constructed from radio-transparent materials such as plastics. (2) The slope of the treadmill can be adjusted. (3) The speed of the treadmill can be controlled in a closed loop to keep the animal in the center of the field of view; to realize this, a camera may be used to track the animal. See this video for an example: https://www.youtube.com/watch?v=0GyovqfQj2g&ab_channel=TerradynamicsLab (Note that the treadmill in this project does not need to move in two dimensions.) If there is sufficient time, the following features would be desirable: (4) Being able to move in two dimensions (omnidirectional treadmill). (5) Allowing integration with force/torque sensors below the surface. Students with knowledge of mechanical design, embedded systems, computer vision, and feedback control are preferred. Interested students can send their resumes, transcripts, and materials that demonstrate their project experience to the assistants.
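
As an illustration of feature (3), a minimal closed-loop speed controller could look like the Python sketch below, assuming a camera-based tracker that returns the animal's normalized position along the belt; the gains and speed limits are placeholders, not validated values.

```python
# Sketch of feature (3): closed-loop control of the belt speed so that the
# animal stays centered in the field of view (illustrative only). Assumption:
# a camera tracker returns the animal's position along the belt, normalized so
# that 0.5 is the center of the image; gains and limits are placeholders.

KP = 0.8         # proportional gain [m/s per unit of position error]
KD = 0.2         # derivative gain
V_MAX = 0.5      # maximum belt speed [m/s] (placeholder)
DT = 0.02        # control period [s], e.g. a 50 Hz camera

prev_error = 0.0

def belt_speed_update(animal_position, current_speed):
    """PD correction of the belt speed from the animal's position error."""
    global prev_error
    error = animal_position - 0.5            # >0: animal drifts forward, speed up
    derivative = (error - prev_error) / DT
    prev_error = error
    new_speed = current_speed + KP * error + KD * derivative
    return max(0.0, min(V_MAX, new_speed))   # saturate to the belt's limits
```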

Last edited: 20/06/2024
739 – Radio communication tests on 169.4 MHz
Category:semester project
Keywords:Electronics, Embedded Systems, Firmware, Radio
Type:10% theory, 70% hardware, 20% software
Responsible: (MED 1 1025, phone: 36630)
Description:

Mobile robots often communicate over the 2.4 GHz band using standard off-the-shelf technologies such as WiFi or Bluetooth, or sometimes custom radio protocols on either the 2.4 GHz or 868 MHz ISM bands, both in the UHF part of the radio spectrum. This project aims at evaluating the possibility of using the 169.4 MHz band (VHF) for controlling robots and obtaining telemetry, as it might give much better results in terms of range and transmission through obstacles or water, even though the available bandwidth is much more restricted.

The project involves:

  • Identifying an appropriate RF module/chip
  • Designing a printed circuit board if necessary
  • Using a microcontroller to control the RF module and obtain bidirectional communication
  • Range experiments in open air, through obstacles, and underwater

Requirements: experience with digital electronics and basic understanding of radio communications and related concepts (e.g. transmission lines, antennas). Previous experience with radio frequency and/or PCB design is a plus.
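
As an example of how the range experiments could be automated, below is a small host-side test script in Python (pyserial), assuming, purely for illustration, that the microcontroller driving the RF module echoes numbered packets back over a USB serial link; the port name and message format are placeholders.

```python
# Host-side range-test sketch (illustrative only). Assumption: the
# microcontroller driving the RF module echoes each numbered packet back over
# a USB serial link; the port name and message format are placeholders.
import time
import serial  # pyserial

def range_test(port="/dev/ttyUSB0", baud=115200, n_packets=100):
    """Send numbered packets and report packet loss and mean round-trip time."""
    lost, rtts = 0, []
    with serial.Serial(port, baud, timeout=1.0) as ser:
        for seq in range(n_packets):
            t0 = time.time()
            ser.write(f"PING {seq}\n".encode())
            reply = ser.readline()
            if reply.strip() == f"PONG {seq}".encode():
                rtts.append(time.time() - t0)
            else:
                lost += 1
    mean_rtt_ms = 1000 * sum(rtts) / max(len(rtts), 1)
    print(f"loss: {lost}/{n_packets}, mean RTT: {mean_rtt_ms:.1f} ms")
```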



Last edited: 11/06/2024

Mobile robotics

732 – Mobile furniture motion control using human body language
Category:semester project, master project (full-time)
Keywords:C++, Control, Machine learning, Python, Vision
Type:35% theory, 10% hardware, 55% software
Responsible: (undefined, phone: 37432)
Description:Furniture is evolving: from static objects in the home, it is becoming active and mobile. These new capabilities open novel interaction opportunities and raise questions about the ways furniture can communicate with users. Together with Prof. Emmanuel Senft from the Human-centered Robotics and AI group, EPFL IDIAP, and building on recent developments in mobile furniture at BioRob, this project will explore how such furniture can communicate with its users by adapting its motions to achieve defined communication goals. This work follows exploratory studies from the human-robot interaction field, which mostly use Wizard-of-Oz paradigms (a human actually controls the “robot”), and aims to add autonomy to these systems. This is a follow-up project based on existing systems. A human pose is detected as a 3D joint skeleton using a Kinect camera (RGB-Depth camera) and OpenPifPaf (a learning-based human pose detection algorithm). Human motions, i.e., sequences of human poses, can be categorized into different meanings based on current studies of human body language, and can further be classified by the provided visual perception system using either geometric rules or learning-based motion recognition algorithms (for example, spatio-temporal graph neural networks). Once the user commands are correctly identified, they can be sent to the mobile furniture robot via the Robot Operating System (ROS) and executed to meet the user's requirements in the assistive environment. Further real-world experiments will also be needed to verify the functionality and performance of this system.
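
To give a concrete flavor of the pipeline (pose detection, gesture classification, ROS command), here is a minimal Python sketch using a simple geometric rule and rospy; the joint names, topic name, message type, and the perception stub are assumptions for this example, not the existing BioRob or IDIAP system.

```python
# Illustrative sketch of the pipeline (pose, gesture classification, ROS
# command), not the existing BioRob system. The joint names, topic name,
# message type, and the perception stub are assumptions for this example.
import rospy
from std_msgs.msg import String

def get_latest_skeleton():
    """Placeholder for the perception side (Kinect + OpenPifPaf): returns a
    fixed skeleton here so that the sketch is self-contained."""
    return {"right_wrist": (0.30, 0.10, 1.60), "right_shoulder": (0.25, 0.0, 1.40)}

def classify_gesture(skeleton):
    """Simple geometric rule: 'come_here' if the right wrist is raised above
    the right shoulder, otherwise 'idle'."""
    wrist, shoulder = skeleton["right_wrist"], skeleton["right_shoulder"]
    return "come_here" if wrist[2] > shoulder[2] else "idle"

def main():
    rospy.init_node("furniture_gesture_commander")
    pub = rospy.Publisher("/furniture/command", String, queue_size=10)
    rate = rospy.Rate(10)                      # 10 Hz command loop
    while not rospy.is_shutdown():
        pub.publish(String(data=classify_gesture(get_latest_skeleton())))
        rate.sleep()

if __name__ == "__main__":
    main()
```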

Last edited: 25/06/2024

7 projects found.