Available Student and Research Projects

[[Image:Sheafsteelgates.jpg|thumb|240px|right|Steel gates of entrance to City Campus next to Sheaf Building]]
=Student Projects=

These are the projects currently being offered by the [[MMVL]]. If you wish to be supervised, please contact the listed supervisors directly for an interview. We support projects at the undergraduate, ERASMUS, M.Sc. and Ph.D. levels. If you are interested in doing a project in computer vision, let us know; you can also suggest a research topic yourself.

The projects fall into the following categories:
* [[#Machine vision projects|Machine vision]]
* [[#Robotics projects|Robotics]]
* [[#Medical projects|Medical applications]]
* [[#Industrial robotics & automation projects|Industrial robotics & automation]]

See the section "[[#External Links|External Links]]" for the thesis marking sheet and sample reports from previous years.

If you work with us, you can learn many skills relevant to a career as a software developer:
* Computer vision, signal processing, robotics
* Linear algebra, analysis
* Software engineering

We use state-of-the-art, platform-independent software tools:
* Source-code documentation with [[Image:Doxygen logo.png|80px|]] [http://www.stack.nl/~dimitri/doxygen/ doxygen]
* Cross-platform user interfaces with [[Image:Qt logo.png|30px|]] [http://www.trolltech.com/ Qt]. You can develop full-featured GUI software which runs under [[Image:Tux.jpg|30px|]] GNU/Linux, [[Image:Ms-windows logo.png|40px|]] Microsoft Windows, and [[Image:Macos.gif|54px|]] MacOS!
* The platform-independent [[Image:Stl logo.gif|30px|]] [http://www.sgi.com/tech/stl/ Standard Template Library]
* The platform-independent [[Image:C--boost logo.gif|80px|]] [http://www.boost.org/ Boost Library]
=Machine vision projects=

==Embedded Machine Vision System==
The project requires the design and fabrication of a small embedded machine vision board comprising a CMOS camera chip and a micro-controller unit (ARM7/ARM9/Cortex/xScale) for on-board image processing. The work is in conjunction with a European project on reconfigurable robotics.

Supervisor: [mailto:f.caparrelli@REMOVETHISshu.ac.uk Fabio Caparrelli]

==Stitching for microscopes==
{|align="center"
|-
|[[Image:Feather.jpg|thumb|200px|A bird's feather (reflected light, darkfield) (7.2 MByte [http://vision.eng.shu.ac.uk/jan/feather1.avi video], 10.1 MByte [http://vision.eng.shu.ac.uk/jan/feather2.avi video])]]||[[Image:Mapping.png|250px|thumb|Stitching using the feedback of the microscope's drive]]
|-
|}

===Premise===
# A microscope video of an object being moved in the x- and y-directions (parallel to the focused plane)
# Later, a microscope video of an object being moved in the x-, y- and z-directions (i.e. including depth changes)

===To Do===
* Generate a stitched image from the input video (linear complexity desirable) without feedback from the microscope drive
* Cross-compare images to avoid drift of the estimated shift
* Later, provide extended depth of field by maximising a focus measure

===See Also===
* [[Depth from Focus]]

Contact [[User:Engjw|Jan Wedekind]] for a project in this area.
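As a rough illustration of the first "To Do" item above, the frame-to-frame shift can be estimated without feedback from the microscope drive using phase correlation. The sketch below is a minimal NumPy example on synthetic data; the function name and test images are invented for illustration and are not part of any MMVL code base.

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the integer (dy, dx) translation of image b relative to a
    using phase correlation (the normalised cross-power spectrum)."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = np.conj(Fa) * Fb
    cross /= np.abs(cross) + 1e-12            # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # peaks in the upper half of the correlation surface wrap around
    # and correspond to negative shifts
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return int(dy), int(dx)

# synthetic frames: b is a circularly shifted by (5, -3)
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (5, -3), axis=(0, 1))
print(estimate_shift(a, b))                   # prints (5, -3)
```

Chaining such pairwise estimates gives the linear-complexity stitching mentioned above; the drift they accumulate is what the cross-comparison step is meant to correct.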
==Automated photo stitching==
{|align="center"
|-
|[[Image:panorama1.jpg|thumb|160px|Picture of [http://en.wikipedia.org/wiki/Canary_Wharf Canary Wharf, London]]]||[[Image:panorama2.jpg|thumb|160px|Another picture of Canary Wharf]]||[[Image:panoramaRes.jpg|thumb|180px|Manually stitched pictures (badly aligned)]]
|-
|}

===Premise===
* A set of images taken with the same camera settings (aperture, exposure time, focal length) and centre of projection but different viewing directions (mainly yaw and pitch)
* Manually selected correspondences

===To Do===
* Improve the correspondences using 2D cross-correlation (the roll angle is assumed to be small).
* Use these correspondences to optimise the parameters (rotations, common focal length, rotation of the virtual camera).
* Transform and merge the images into the resulting '''panorama image''', where corresponding points are related by the homography

<math>
\lambda\,\begin{pmatrix}m^\prime_{1}\\m^\prime_{2}\\f\end{pmatrix}=
\begin{pmatrix}h_{11}&h_{12}&h_{13}\\h_{21}&h_{22}&h_{23}\\h_{31}&h_{32}&h_{33}\end{pmatrix}\,
\begin{pmatrix}m_{1}\\m_{2}\\f\end{pmatrix}
</math>

===External Links===
* [http://www.ptgui.com/ Photo stitching software PTGui]
* [http://en.wikipedia.org/wiki/Rotation_matrix Rodrigues rotation matrix]
* [http://en.wikipedia.org/wiki/Panorama_Tools_(software) Panorama Tools software suite]
* [http://hugin.sourceforge.net/ Hugin]

Contact [[User:Engjw|Jan Wedekind]] for a project in this area.

==Interactive projector-camera interface==
[[Image:Camera-projector.jpg|thumb|320px|right|Interactive camera-projector system]]

===Premise===
* A camera observes a projected image (or a TFT screen)
* The system already calibrates itself (two-dimensional homography) using projected patterns

===To Do===
* An improved method for recognising hands and fingers
* 3D calibration and depth perception using projected patterns or shadows

===See Also===
* [[Interactive Camera-Projector System]]

===External Links===
* [http://mrl.nyu.edu/~jhan/ftirtouch/ Multi-Touch Interaction Research]
* [http://iihm.imag.fr/demos/magicboard/ Magic board project by IIHM]
* [http://www.research.ibm.com/ed/ed_technology.htm IBM Research on interactive projectors]
* [http://research.microsoft.com/%7Ezhang/calib/ A Flexible New Technique for Camera Calibration: Zhengyou Zhang]

Contact [[User:Engjw|Jan Wedekind]] for a project in this area.

==Physics Engine==
[[Image:Output2.jpg|thumb|right|160px|Demonstration of the [http://www.ode.org/ Open Dynamics Engine] [http://vision.eng.shu.ac.uk/jan/output2.avi (217 kByte video)]]]

===Premise===
A physics engine is useful for simulating robots and for testing computer vision algorithms. The ''Open Dynamics Engine'' has already been used in two projects.

===To Do===
The ''Open Dynamics Engine'' is not numerically stable. An investigation into numerical algorithms for simulating multiple rigid bodies is required. The rigid bodies can be connected by joints, which limit their degrees of freedom.

===See Also===
* [[Robot simulation with Gazebo]]

===External Links===
* [http://www.ode.org/ Open Dynamics Engine]
* [http://www.spiderland.org/breve/ Breve simulation environment]
* [http://jsbsim.sourceforge.net/ Open Source Flight Dynamics Model]

Contact [[User:Engjw|Jan Wedekind]] for a project in this area.

==RANSAC==
[[Image:Penguin.jpg|thumb|right|200px|Recognition and tracking with three or four degrees-of-freedom. [[Microscope Vision Software|More ...]]]]
[http://en.wikipedia.org/wiki/RANSAC Random sample consensus] (RANSAC) is a method for object recognition. This project is about recognising macroscopic rigid objects (e.g. household and office articles such as cups and staplers).

===To Do===
* Select point features and a suitable similarity measure
* Implement the RANSAC algorithm and apply it to the 3 and 4 degrees-of-freedom problem
* Extend the RANSAC implementation to the 6 degrees-of-freedom problem (possibly using line features as well as point features)
* Demonstrate the algorithm on a real object

===See Also===
* [[British Life and Culture Module]]
* [[Microscope Vision Software]]
* [[Mimas Camera Calibration]]

=Robotics projects=

==Formation control of Khepera III robots==
The project consists of implementing formation control on a group of three Khepera III robots. The inputs for the control are the US and IR sensors mounted on the robots; the control itself consists of a set of attraction and repulsion forces.

Supervisors: [mailto:l.alboul@shu.ac.uk Lyuba Alboul], Alan Holloway, (Jacques Penders)

==Wall following behaviours for mobile robots==
Abstract: The project consists of implementing wall-following behaviour on a Khepera III robot using the US and/or IR sensors mounted on the robot. Additionally, a map of the environment is produced, or two other Khepera robots follow the wall follower in a formation.

Supervisors: [mailto:l.alboul@shu.ac.uk Lyuba Alboul], Alan Holloway, (Jacques Penders)

==Design of an I2C sensor interface for Khepera robots using an embedded microcontroller==
Abstract: The project consists of designing, building and testing a sensor interface to be mounted on a Khepera robot. The system will be based around an embedded microcontroller which interfaces to the Khepera via I2C.

Supervisors: [mailto:engafh@exchange.shu.ac.uk Alan Holloway], (Jacques Penders)

==Design of a miniature multi-channel frequency counter for remote QCM sensing applications==
Abstract: The project will involve research into a suitable method of measuring the frequencies of an array of frequency-based sensors. It is anticipated that either an embedded microcontroller or an FPGA will be used. A prototype design should be made and suitable validation of the system performed.

Supervisors: [mailto:engafh@exchange.shu.ac.uk Alan Holloway], A. Nabok, (Jacques Penders)

==Design of a microcontroller-based electronic nose for use on Khepera robots==
Abstract: The project is based around the design of both a hardware and a software interface for a small array of sensors which can be mounted on a Khepera robot. The project will involve some embedded microcontroller programming, electronic circuit design and PCB design/fabrication.

Supervisor: [mailto:engafh@exchange.shu.ac.uk Alan Holloway]

=Medical projects=

==Baby breathing monitor==
The purpose of the project is to measure the breathing rate of babies without any physical contact.
* Solution 1: Using sensors. A proof-of-concept version using a single sensor has already been built. The proposed project will explore improving the accuracy and robustness of the system.
* Solution 2: Using a web-camera-based vision system to detect the breathing movement. A proof-of-concept version has already been developed. The proposed project will explore improving the accuracy and robustness of the system.

Reference: [http://ieeexplore.ieee.org/iel5/7635/20843/00965238.pdf?tp=&isnumber=&arnumber=965238 Development of non-restrictive sensing system for sleeping person using fiber grating vision sensor] (journal paper)

Skills that will be developed: electronics and PIC micro-controller programming, with some exposure to linear algebra. MATLAB and/or C++ programming knowledge will be useful.

Supervisors: [mailto:r.saatchi@shu.ac.uk Reza Saatchi] and [mailto:a.n.selvan@shu.ac.uk Arul Selvan]
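To give a feel for the signal processing involved, breathing rate can be read off a one-dimensional movement signal as the dominant frequency in a plausible breathing band. The sketch below is a minimal NumPy illustration on a synthetic signal; the function name, sampling rate and noise level are assumptions, not properties of the actual sensor or camera system.

```python
import numpy as np

def breathing_rate_bpm(signal, fs):
    """Estimate the breathing rate (breaths per minute) of a chest-movement
    signal sampled at fs Hz, via an FFT peak search restricted to a
    plausible breathing band (0.2-1.5 Hz, i.e. 12-90 breaths per minute)."""
    x = signal - np.mean(signal)              # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= 0.2) & (freqs <= 1.5)
    peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak

# synthetic 60 s recording: 0.5 Hz breathing (30 bpm) plus sensor noise
fs = 25.0
t = np.arange(0, 60, 1 / fs)
sig = np.sin(2 * np.pi * 0.5 * t) \
    + 0.1 * np.random.default_rng(1).standard_normal(t.size)
print(round(breathing_rate_bpm(sig, fs)))     # prints 30
```

Restricting the peak search to the breathing band is what keeps slow baseline drift and high-frequency noise from dominating the estimate.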
==Medical Ultrasound Training Simulator==
[http://www.sensegraphics.com/index.php?page=shop.product_details&flypage=shop.flypage_sensegraphics&product_id=20&category_id=7&manufacturer_id=0&option=com_virtuemart&Itemid=83 Using augmented virtual reality and a haptic force-feedback system] to simulate the ultrasound medical examination of a patient. An M.Sc. student has previously developed most of [http://vision.eng.shu.ac.uk/mmvlwiki/index.php/Medical_Image_Processing the simulator]. The remaining enhancements involve acquiring a new data set (medical ultrasound images) and incorporating it into the existing project.

Skills that will be developed: Python programming. Some prior programming experience is required.

Supervisors: [mailto:r.saatchi@shu.ac.uk Reza Saatchi] and [mailto:a.n.selvan@shu.ac.uk Arul Selvan]
=Industrial robotics & automation projects=
==Precise Food Decoration Using a Robot-Controlled System==
The project requires the design and construction of a mechanical rig, to be attached to a robot platform and mounting a laser-optical displacement sensor. The aim is to use the sensor to keep the robot end-effector at a constant distance from a non-flat surface. The targeted application is in the food robotics industry, e.g. the automatic decoration of confectionery (cakes!).
Supervisor: [mailto:f.caparrelli@REMOVETHISshu.ac.uk Fabio Caparrelli]
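The distance-keeping loop at the heart of such a rig can be illustrated with a simple proportional controller driven by the displacement-sensor reading. The sketch below is purely illustrative; the function name, gains, units and surface profile are invented, and a real rig would need calibrated readings and a properly tuned (likely PID) controller.

```python
def follow_surface(heights, target=10.0, gain=0.5, z0=20.0):
    """Simulate keeping a tool at `target` mm above a non-flat surface.
    `heights` are the surface heights (mm) under successive positions;
    a laser displacement sensor reads the gap and a proportional
    controller corrects the tool height z at each step."""
    z = z0
    gaps = []
    for h in heights:
        gap = z - h                   # distance reported by the sensor
        z += gain * (target - gap)    # move toward the desired stand-off
        gaps.append(z - h)            # gap actually achieved at this step
    return gaps

surface = [0, 1, 3, 4, 4, 3, 1, 0, -1, -1]    # mm, a gentle bump
gaps = follow_surface(surface)
print(max(abs(g - 10.0) for g in gaps) < 6)   # prints True
```

The controller only ever sees the sensor's gap reading, never the surface shape itself, which is exactly the situation the rig is designed for.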
  
 
=See Also=
 
* Have a look at [[Mimas#Software_Engineering|Mimas/Software engineering]] to get a general impression of our working environment.
* [[MMVL]]
* [[Nanorobotics]]
* [[:Category:Projects|Projects]]
* [[I-Swarm]]
=External Links=
* [http://vision.eng.shu.ac.uk/bala/msc/MSc-ProjectMarkingSheet.doc M.Sc. thesis marking sheet]
* [http://vision.eng.shu.ac.uk/bala/msc/ M.Sc. theses from previous years]
* [http://www.shu.ac.uk/research/meri/postgrad-research/ MERI PhD adverts]
* [https://wiki.ubuntu.com/Training (K)Ubuntu Student Guide]
* [http://www.ubuntupocketguide.com/ (K)Ubuntu Pocket Guide]
* [[Image:GoogleCodeSearch.gif|60px]] [http://codesearch.google.com/ Google code search]

[[Category:Projects]]

Latest revision as of 12:56, 12 August 2011