Available Student and Research Projects

[[Image:Sheafsteelgates.jpg|thumb|240px|right|Steel gates of entrance to City Campus next to Sheaf Building]]
=Student Projects=

These are the projects currently being offered by the [[MMVL]]. If you wish to be supervised, please contact the listed supervisors directly for an interview. Note that we support projects at the undergraduate, ERASMUS, M.Sc. and Ph.D. levels.

The projects fall into the following categories:
* [[#Machine vision projects|Machine vision]]
* [[#Robotics projects|Robotics]]
* [[#Medical projects|Medical applications]]
* [[#Industrial robotics & automation projects|Industrial robotics & automation]]

See the "[[#External Links|External Links]]" section for the thesis marking sheet and sample reports from previous years.
=Machine vision projects=

==Embedded Machine Vision System==
The project requires the design and fabrication of a small embedded machine vision board comprising a CMOS camera chip and a microcontroller unit (ARM7/ARM9/Cortex/XScale) for on-board image processing. The work is carried out in conjunction with a European project on reconfigurable robotics.

Supervisor: [mailto:f.caparrelli@REMOVETHISshu.ac.uk Fabio Caparrelli]
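On-board image processing on a small microcontroller usually means simple, integer-only per-frame operations. The sketch below is illustrative only (not project code, and written in Python rather than the C an ARM board would run): a 3x3 Sobel edge-magnitude pass over a tiny grayscale frame, the kind of filter such a board could compute in place.

```python
# Illustrative sketch: integer-only 3x3 Sobel edge magnitude, the sort of
# lightweight processing that fits on a small embedded vision board.

GX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal Sobel kernel
GY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical Sobel kernel

def sobel_magnitude(frame):
    """Return an edge-magnitude image (border pixels left at 0)."""
    h, w = len(frame), len(frame[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(GX[j][i] * frame[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(GY[j][i] * frame[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = abs(gx) + abs(gy)   # cheap magnitude approximation
    return out

# A vertical step edge: dark left half, bright right half.
frame = [[0, 0, 0, 255, 255, 255] for _ in range(5)]
edges = sobel_magnitude(frame)
print(edges[2])   # [0, 0, 1020, 1020, 0, 0] -- response peaks at the boundary
```

On a real ARM7/ARM9 target the same loop would typically be written in fixed-point C and run on the raw sensor buffer.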
  
=Robotics projects=

==Formation control of Khepera III robots==
The project consists of implementing formation control on a group of three Khepera III robots. Inputs to the controller are the ultrasonic (US) and infrared (IR) sensors mounted on the Khepera III robots; the controller consists of a set of attraction and repulsion forces.

Supervisors: [mailto:l.alboul@shu.ac.uk Lyuba Alboul], Alan Holloway, (Jacques Penders)
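The attraction/repulsion scheme the abstract describes can be sketched as a resultant force per robot: a spring-like pull toward each formation neighbour plus a short-range push away from sensed obstacles. All names and gains below are invented for illustration, not Khepera-specific values.

```python
# Hedged sketch of attraction/repulsion formation control: the commanded
# force on one robot is the sum of attraction toward neighbours and
# repulsion from nearby obstacles. Gains are illustrative assumptions.
import math

K_ATT = 0.5          # attraction gain toward formation neighbours
K_REP = 2.0          # repulsion gain away from close obstacles
REP_RANGE = 0.3      # repulsion only acts within this range (metres)

def formation_force(pos, neighbours, obstacles):
    """Sum attraction to neighbours and repulsion from obstacles -> (fx, fy)."""
    fx = fy = 0.0
    for nx, ny in neighbours:                  # spring-like pull toward neighbour
        fx += K_ATT * (nx - pos[0])
        fy += K_ATT * (ny - pos[1])
    for ox, oy in obstacles:                   # push away when an obstacle is close
        dx, dy = pos[0] - ox, pos[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < REP_RANGE:
            scale = K_REP * (1.0 / d - 1.0 / REP_RANGE) / d
            fx += scale * dx
            fy += scale * dy
    return fx, fy

# One neighbour ahead, no obstacles: the robot is pulled forward.
print(formation_force((0.0, 0.0), [(1.0, 0.0)], []))   # (0.5, 0.0)
```

In the actual project the obstacle positions would come from the robot's US/IR readings rather than known coordinates.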
==Wall following behaviours for mobile robots==
Abstract: The project consists of implementing a wall-following behaviour on a Khepera III robot using the US and/or IR sensors mounted on the robot. In addition, either a map of the environment is produced, or two other Khepera robots follow the wall-follower in formation.

Supervisors: [mailto:l.alboul@shu.ac.uk Lyuba Alboul], Alan Holloway, (Jacques Penders)
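A minimal starting point for the behaviour is a proportional controller on one side-facing distance reading: the robot trades speed between its two wheels to hold a set distance to the wall. The gains and sensor layout below are assumptions for illustration, not Khepera III specifics.

```python
# Hedged sketch: proportional wall following from a single side-facing IR
# distance reading, for a wall on the robot's left. Constants are assumed.

BASE_SPEED = 0.10    # forward speed (m/s)
TARGET_DIST = 0.05   # desired distance to the wall (m)
KP = 2.0             # proportional gain

def wall_follow_step(side_distance):
    """Return (left_speed, right_speed) wheel commands."""
    error = side_distance - TARGET_DIST   # positive: too far from the wall
    turn = KP * error                     # steer back toward the set distance
    left = BASE_SPEED - turn              # slowing the left wheel turns left,
    right = BASE_SPEED + turn             # i.e. toward the wall on the left
    return left, right

print(wall_follow_step(0.05))   # (0.1, 0.1) -- on target, drive straight
```

Mapping or formation following, as the abstract suggests, would then be layered on top of this basic behaviour.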
==Design of an I2C sensor interface for Khepera robots using an embedded microcontroller==
Abstract: The project consists of designing, building and testing a sensor interface to be mounted on a Khepera robot. The system will be based around an embedded microcontroller which interfaces to the Khepera via the I2C bus.

Supervisors: [mailto:engafh@exchange.shu.ac.uk Alan Holloway], (Jacques Penders)
==Design of a miniature multi-channel frequency counter for remote QCM sensing applications==
Abstract: The project will involve research into a suitable method of measuring the frequencies of an array of frequency-based sensors. It is anticipated that either an embedded microcontroller or an FPGA will be used. A prototype should be designed and suitable validation of the system performed.

Supervisors: [mailto:engafh@exchange.shu.ac.uk Alan Holloway], A. Nabok, (Jacques Penders)
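The two classic measurement schemes the project would choose between can be captured in a line of arithmetic each: direct gating counts input edges over a fixed gate time, while reciprocal counting times a fixed number of input periods against a fast reference clock. The numbers below are illustrative, not project measurements.

```python
# Back-of-envelope sketch of the two standard frequency-counting schemes.
# All figures are illustrative assumptions.

def direct_count_freq(edge_count, gate_time_s):
    """Direct gating: frequency = edges counted / gate time."""
    return edge_count / gate_time_s

def reciprocal_freq(input_periods, ref_ticks, ref_clock_hz):
    """Reciprocal counting: time N input periods with a fast reference."""
    return input_periods * ref_clock_hz / ref_ticks

# A ~10 MHz QCM-style signal, 0.1 s gate: 1,000,000 edges counted.
print(direct_count_freq(1_000_000, 0.1))        # 10000000.0

# 100 input periods spanning 480 ticks of a 48 MHz MCU reference clock.
print(reciprocal_freq(100, 480, 48_000_000))    # 10000000.0
```

Direct gating gives ±1 count resolution per gate (here ±10 Hz), whereas reciprocal counting's resolution is set by the reference clock, which is why it is usually preferred at lower input frequencies.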
==Design of a microcontroller-based electronic nose for use on Khepera robots==
Abstract: The project is based around the design of both a hardware and a software interface for a small array of sensors which can be mounted on a Khepera robot. The project will involve some embedded microcontroller programming, electronic circuit design, and PCB design/fabrication.

Supervisor: [mailto:engafh@exchange.shu.ac.uk Alan Holloway]
=Medical projects=

==Baby breathing monitor==
The purpose of the project is to measure the breathing rate of babies without any physical contact.
* Solution 1: Using sensors. A proof-of-concept version has already been built using a single sensor. The proposed project will explore improving the accuracy and robustness of the system.
* Solution 2: Using a web-camera-based vision system to detect the breathing movement. A proof-of-concept version has already been developed. The proposed project will explore improving the accuracy and robustness of the system.

Reference: [http://ieeexplore.ieee.org/iel5/7635/20843/00965238.pdf?tp=&isnumber=&arnumber=965238 Development of non-restrictive sensing system for sleeping person using fiber grating vision sensor] (journal paper)

Skills developed: electronics, PIC microcontroller programming, and exposure to linear algebra. MATLAB and/or C++ programming knowledge will be useful.

Supervisors: [mailto:r.saatchi@shu.ac.uk Reza Saatchi] and [mailto:a.n.selvan@shu.ac.uk Arul Selvan]
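Whichever sensing route is taken, the signal-processing core is the same: turn a periodic chest-movement signal into a rate. The sketch below is illustrative only (the real system works on sensor or camera data, not a synthetic sine): a simple baseline that counts upward mean-crossings of the signal.

```python
# Illustrative baseline: estimate a breathing rate by counting upward
# mean-crossings of a periodic chest-movement signal. Synthetic data only.
import math

def breaths_per_minute(samples, sample_rate_hz):
    """Estimate the rate from upward mean-crossings of the signal."""
    mean = sum(samples) / len(samples)
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if a < mean <= b
    )  # one upward crossing per breath cycle
    duration_min = len(samples) / sample_rate_hz / 60.0
    return crossings / duration_min

# Synthetic 0.5 Hz breathing signal (30 breaths/min): 60 s at 10 Hz sampling,
# phase-shifted so no sample lands exactly on a zero crossing.
sig = [math.sin(2 * math.pi * t / 20 + math.pi / 4) for t in range(600)]
print(breaths_per_minute(sig, 10.0))   # 30.0
```

Real recordings are noisy and non-stationary, so the project would likely replace this with filtering plus peak detection or a spectral estimate.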
==Medical Ultrasound Training Simulator==
The project uses an [http://www.sensegraphics.com/index.php?page=shop.product_details&flypage=shop.flypage_sensegraphics&product_id=20&category_id=7&manufacturer_id=0&option=com_virtuemart&Itemid=83 augmented virtual reality and haptic force feedback system] to simulate the ultrasound medical examination of a patient. An M.Sc. student has previously developed most of [http://vision.eng.shu.ac.uk/mmvlwiki/index.php/Medical_Image_Processing the simulator]. The remaining work consists of acquiring new data sets (medical ultrasound images) and incorporating them into the existing simulator.

Skills developed: Python programming. Some prior programming experience is required.

Supervisors: [mailto:r.saatchi@shu.ac.uk Reza Saatchi] and [mailto:a.n.selvan@shu.ac.uk Arul Selvan]
=Industrial robotics & automation projects=

==Precise Food Decoration Using a Robot-Controlled System==
The project requires the design and construction of a mechanical rig, attached to a robot platform, that mounts a laser-optical displacement sensor. The aim is to use the sensor to keep the robot end-effector at a constant distance from a non-flat surface. The targeted application is in the food robotics industry, e.g. the automatic decoration of confectionery (cakes!).

Supervisor: [mailto:f.caparrelli@REMOVETHISshu.ac.uk Fabio Caparrelli]
  
 
=See Also=
* Have a look at [[Mimas#Software_Engineering|Mimas/Software engineering]] to get a general impression of our working environment.
* [[MMVL]]
* [[:Category:Projects|Projects]]

=External Links=
* [http://vision.eng.shu.ac.uk/bala/msc/MSc-ProjectMarkingSheet.doc M.Sc. thesis marking sheet]
* [http://vision.eng.shu.ac.uk/bala/msc/ M.Sc. theses from previous years]
* [http://www.shu.ac.uk/research/meri/postgrad-research/ MERI Ph.D. adverts]
* [https://wiki.ubuntu.com/Training (K)Ubuntu Student Guide]
* [http://www.ubuntupocketguide.com/ (K)Ubuntu Pocket Guide]
* [[Image:GoogleCodeSearch.gif|60px]] [http://codesearch.google.com/ Google Code Search]

[[Category:Projects]]

Latest revision as of 11:56, 12 August 2011