Available Student and Research Projects

From MMVLWiki
[[Image:Sheafsteelgates.jpg|thumb|240px|right|Steel gates of entrance to City Campus next to Sheaf Building]]
=Student Projects=
These are the projects currently being offered by the [[MMVL]]. If you wish to be supervised, please contact the listed supervisors directly for an interview. We support projects at the undergraduate, ERASMUS, M.Sc. and Ph.D. levels. If you are interested in doing a project in computer vision, let us know; the descriptions below give an idea of possible projects, and you can also suggest a research topic yourself.

The projects fall into the following categories:
* [[#Machine vision projects|Machine vision]]
* [[#Robotics projects|Robotics]]
* [[#Medical projects|Medical applications]]
* [[#Industrial robotics & automation projects|Industrial robotics & automation]]

See the section "[[#External Links|External Links]]" for the thesis marking sheet and sample reports from previous years.

If you work with us, you can learn many skills which are relevant for a career as a software developer:
* Computer vision, signal processing, robotics
* Linear algebra, analysis
* Software engineering

We are using state-of-the-art [[Iterated_Function_System#Cross-platform_software|cross-platform]] software tools:
* Source-code documentation with [[Image:Doxygen logo.png|80px|]] [http://www.stack.nl/~dimitri/doxygen/ doxygen]
* Cross-platform user interfaces with [[Image:Qt logo.png|30px|]] [http://www.trolltech.com/ Qt]. You can develop full-featured GUI software which runs under [[Image:Tux.jpg|30px|]] GNU/Linux, [[Image:Ms-windows logo.png|40px|]] Microsoft Windows, and [[Image:Macos.gif|54px|]] MacOS!
* The platform-independent [[Image:Stl logo.gif|30px|]] [http://www.sgi.com/tech/stl/ Standard Template Library]
* The platform-independent [[Image:C--boost logo.gif|80px|]] [http://www.boost.org/ Boost Library]
* Scripting using the dynamically-typed object-oriented programming language [[Image:Ruby.png|25px]] [http://www.ruby-lang.org/ Ruby]
=Machine vision projects=
==Stitching for microscopes==
{|align="center"
|-
|[[Image:Feather.jpg|thumb|200px|A bird's feather (reflected light, darkfield) (7.2 MByte [http://vision.eng.shu.ac.uk/jan/feather1.avi video], 10.1 MByte [http://vision.eng.shu.ac.uk/jan/feather2.avi video])]]||[[Image:Mapping.png|250px|thumb|Stitching using the feedback of the microscope's drive]]
|-
|}

===Premise===
# A microscope video of an object being moved in the x- and y-directions (parallel to the focussed plane)
# Later, a microscope video of an object being moved in the x-, y- and z-directions (i.e. including depth changes)
===To Do===
* Generate a stitched image from the input video (linear complexity desirable) without feedback from the microscope drive
* Cross-compare images to avoid drift of the estimated shift
* Later, provide extended depth of field by maximising a focus measure

===See Also===
* [[Depth from Focus]]

===External Links===
* [http://www.hadleyweb.pwp.blueyonder.co.uk/ CombineZ]

==Embedded Machine Vision System==
The project requires the design and fabrication of a small embedded machine vision board comprising a CMOS camera chip and a micro-controller unit (ARM7/ARM9/Cortex/xScale) for on-board image processing. The work is in conjunction with a European project on reconfigurable robotics.

Supervisor: [mailto:f.caparrelli@REMOVETHISshu.ac.uk Fabio Caparrelli]
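The stitching projects in this section rest on estimating the translation between overlapping frames. Phase correlation is one standard way to do this; the sketch below is only an illustration on synthetic data (numpy assumed; it is not taken from the project code):

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the integer (dy, dx) translation turning image a into image b
    using phase correlation (peak of the normalised cross-power spectrum)."""
    fa, fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = fb * np.conj(fa)
    cross /= np.abs(cross) + 1e-12            # keep only the phase
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # convert peak coordinates to signed shifts
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

# toy example: a random "frame" shifted by (3, 5) with wrap-around
rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, (3, 5), axis=(0, 1))
print(estimate_shift(a, b))
```

Summing such pairwise shifts along a whole video accumulates drift, which is exactly why the To Do list above asks for cross-comparison of images.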
==Automated photo stitching==
{|align="center"
|+ '''Hilbre Island input images'''
|-
|[[Image:hilbreisland1.jpg|100px]]||[[Image:hilbreisland2.jpg|100px]]||[[Image:hilbreisland3.jpg|100px]]||[[Image:hilbreisland4.jpg|100px]]||[[Image:hilbreisland5.jpg|100px]]||[[Image:hilbreisland6.jpg|100px]]||[[Image:hilbreisland7.jpg|100px]]
|-
|}

{|align="center"
|+ '''Resulting panorama image created with [http://hugin.sourceforge.net/ Hugin]'''
|-
|[[Image:hilbreislandpanorama.jpg|520px]]
|-
|}
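To make the homography in the To Do list below concrete: for a camera that purely rotates about its centre of projection with a common focal length <math>f</math>, and pixel coordinates measured from the principal point, the matrix <math>(h_{ij})</math> reduces to the camera rotation itself. A small numeric sketch (numpy assumed; the focal length and angle are invented for illustration):

```python
import numpy as np

f = 800.0                                   # common focal length in pixels (illustrative)

def rot_y(theta):                           # yaw rotation of the camera
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

H = rot_y(np.radians(5.0))                  # plays the role of (h_ij) for a rotating camera

m = np.array([100.0, 50.0, f])              # pixel (100, 50) relative to the principal point
v = H @ m                                   # equals lambda * (m'_1, m'_2, f)
m_prime = v * (f / v[2])                    # fix lambda so the third component is f again
print(m_prime[:2])                          # where the pixel lands in the second image
```

A solver such as panotools optimises the rotations and the common focal length so that mapped correspondences line up.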
===Premise===
* A set of images taken with the same camera settings (aperture, exposure time, focal length) and centre of projection, but different viewing directions (mainly yaw and pitch)
* Manually selected correspondences

===To Do===
Corresponding pixels <math>(m_1, m_2)</math> and <math>(m^\prime_1, m^\prime_2)</math> in two such images are related by a homography:
<math>
\lambda\,\begin{pmatrix}m^\prime_{1}\\m^\prime_{2}\\f\end{pmatrix}=
\begin{pmatrix}h_{11}&h_{12}&h_{13}\\h_{21}&h_{22}&h_{23}\\h_{31}&h_{32}&h_{33}\end{pmatrix}\,
\begin{pmatrix}m_{1}\\m_{2}\\f\end{pmatrix}
</math>
* Improve the correspondences using 2D cross-correlation (the roll angle is assumed to be low)
* (Re)implement the [http://panotools.sf.net/ panotools] solver, preferably in [http://www.ruby-lang.org/ Ruby]. Use the given correspondences to optimise the camera parameters: rotations and the common focal length.
* Adopt an existing method for removing fringes (transitions/blending)

Optional:
* Automatically find correspondences
* Optimise more camera parameters (distortion, ...)

Also see the [[Panorama Viewer|panorama viewer]].

===External Links===
* [http://www.ptgui.com/ Photo stitching software PTGui]
* [http://en.wikipedia.org/wiki/Rotation_matrix Rotation matrix (Rodrigues formula)]
* [http://en.wikipedia.org/wiki/Panotools Panorama Tools software suite] ([http://www.path.unimelb.edu.au/~dersch/ mirror of Helmut Dersch's former web page])
* [http://hugin.sourceforge.net/ Hugin]
* [http://www.janrik.net/ptools/ExtendedFocusPano12/index.html Depth of focus for panorama tools]
* [http://www.cs.ubc.ca/~mbrown/autostitch/autostitch.html Autostitch (commercial)]

==RANSAC==
[[Image:Penguin.jpg|thumb|right|180px|Recognition and tracking with three or four degrees-of-freedom. [[Microscope Vision Software|More ...]]]]
[http://en.wikipedia.org/wiki/RANSAC Random sample consensus] (RANSAC) is a robust estimation method which can be used for object recognition. This project is about recognising macroscopic rigid objects (e.g. household and office articles such as cups and staplers).

===To Do===
* Select point features and a suitable similarity measure
* Implement the RANSAC algorithm and apply it to a problem with at least 3 degrees of freedom
* Demonstrate the algorithm on a real object

Optional:
* Extend the RANSAC implementation to problems with more degrees of freedom (perhaps using line features as well as point features)

==HDR imaging==
{|align="center"
|+ '''Stanage Edge'''
|-
|[[Image:stanage1.png|160px]]||[[Image:stanage2.png|160px]]||[[Image:stanage3.png|160px]]
|-
|}
{|align="center"
|+ '''Processed image after composing HDR and tonemapping'''
|-
|[[Image:stanage.png|240px]]
|-
|}

Merging pre-aligned 8-bit colour photos into a [http://en.wikipedia.org/wiki/High_dynamic_range_imaging high dynamic range] (HDR) image will remain important as long as low-cost HDR cameras are not available on the consumer market.

===Premise===
* A pre-aligned exposure series of pictures

===To Do===
* Detect over- and undersaturated colour components
* Estimate the camera sensitivity
* Merge the images into a single HDR image

Optional:
* Map the result to a reasonably realistic-looking low-range image

===See Also===
* [http://en.wikipedia.org/wiki/High_dynamic_range_imaging Wikipedia page on HDR]
* [http://www.flickr.com/groups/hdr/ HDR images at flickr.com]
* [http://www.openexr.com/ OpenEXR library]
* [http://qtpfsgui.sourceforge.net/ HDR workflow with Qtpfsgui] to create HDR images and tonemapping
* [http://wiki.panotools.org/HDR_workflow_with_hugin HDR workflow with hugin] to create HDR panoramas

=Robotics projects=

==Formation control of Khepera III robots==
The project consists of implementing formation control on a group of three Khepera III robots. The inputs for the control are the ultrasonic (US) and infrared (IR) sensors mounted on the robots; the control itself consists of a set of attraction and repulsion forces.

Supervisors: [mailto:l.alboul@shu.ac.uk Lyuba Alboul], Alan Holloway, (Jacques Penders)

==Wall following behaviours for mobile robots==
Abstract: The project consists of implementing wall-following behaviour on a Khepera III robot using the US and/or IR sensors mounted on the robot. Additionally, a map of the environment is produced, or two other Kheperas follow the wall follower in formation.

Supervisors: [mailto:l.alboul@shu.ac.uk Lyuba Alboul], Alan Holloway, (Jacques Penders)

==Design of an I2C sensor interface for Khepera robots using an embedded microcontroller==
Abstract: The project consists of designing, building and testing a sensor interface to be mounted on a Khepera robot. The system will be based around an embedded microcontroller which interfaces to the Khepera over the I2C bus.

Supervisors: [mailto:engafh@exchange.shu.ac.uk Alan Holloway], (Jacques Penders)

==Design of a miniature multi-channel frequency counter for remote QCM sensing applications==
Abstract: The project will involve research into a suitable method of measuring the frequencies of an array of frequency-based (quartz crystal microbalance) sensors. It is anticipated that either an embedded microcontroller or an FPGA will be used. A prototype should be designed and the system's performance suitably validated.

Supervisors: [mailto:engafh@exchange.shu.ac.uk Alan Holloway], A. Nabok, (Jacques Penders)

==Design of a microcontroller-based electronic nose for use on Khepera robots==
Abstract: The project is based around the design of both a hardware and a software interface for a small array of sensors which can be mounted on a Khepera robot. The project will involve some embedded microcontroller programming, electronic circuit design, and PCB design/fabrication.

Supervisor: [mailto:engafh@exchange.shu.ac.uk Alan Holloway]
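The attraction and repulsion forces used in the Khepera formation-control project can be illustrated with point robots in the plane. This is a toy simulation (plain numpy, not the Khepera API; the gain, spacing and step size are invented for illustration):

```python
import numpy as np

def formation_forces(positions, d_des=0.5, gain=1.0):
    """Each robot is attracted to neighbours further than d_des away
    and repelled from neighbours that are closer."""
    forces = np.zeros_like(positions)
    n = len(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            diff = positions[j] - positions[i]
            dist = np.linalg.norm(diff)
            # (dist - d_des) > 0 -> attraction, < 0 -> repulsion
            forces[i] += gain * (dist - d_des) * diff / dist
    return forces

# three robots converge to an equilateral triangle of side d_des
pos = np.array([[0.0, 0.0], [0.2, 0.0], [0.1, 0.9]])
for _ in range(200):
    pos += 0.05 * formation_forces(pos)     # explicit Euler integration
dists = [np.linalg.norm(pos[i] - pos[j]) for i, j in ((0, 1), (0, 2), (1, 2))]
print(dists)
```

On the real robots the inter-robot distances would come from the US and IR sensors rather than from global positions.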
==Micro-Manipulation==
[[Image:Demarest_manipulation.jpg|thumb|right|180px|Micro manipulator picking up a crumb of ground coffee]]
This project is about manipulating objects which can be seen under a microscope. The objects are typically up to about 750 micrometres in size.

===Premise===
An optical microscope with a motorised stage and a low-cost FireWire video camera. There is an early prototype of a gripper mounted on a microtranslation stage. Parts with limited accuracy can be manufactured using [http://en.wikipedia.org/wiki/Rapid_prototyping rapid prototyping] or in the lab.

===To Do===
The task is to design and build a more advanced gripper. Possible ideas are:
* Use [http://en.wikipedia.org/wiki/Strain_gauge strain gauges] to provide feedback
* Develop a gripper with more degrees of freedom

===See Also===
* [[Micromanipulators]]
* [[Microscope Vision Software]]
* [[Mimas Camera Calibration]]

===External Links===
* John Speich, Michael Goldfarb: [http://journals.cambridge.org/article_S0263574799001903 A compliant-mechanism-based three degree-of-freedom manipulator for small-scale manipulation] (PDF)
* Ying-Chien Tsai, Sio Hou Lei, Hendra Sudin: [http://www.iop.org/EJ/abstract/0960-1317/15/1/022 Design and analysis of planar compliant microgripper based on kinematic approach]

=Medical projects=

==Baby breathing monitor==
The purpose of the project is to measure the breathing rate of babies without any physical contact.
* Solution 1: Using sensors. A proof-of-concept version has already been built using a single sensor. The proposed project will explore improving the accuracy and robustness of the system.
* Solution 2: Using a web-camera-based vision system to detect the breathing movement. A proof-of-concept version has already been developed. The proposed project will explore improving the accuracy and robustness of the system.

Reference: [http://ieeexplore.ieee.org/iel5/7635/20843/00965238.pdf?tp=&isnumber=&arnumber=965238 Development of non-restrictive sensing system for sleeping person using fiber grating vision sensor] (journal paper)

Skills that will be developed: electronics and PIC microcontroller programming, with some exposure to linear algebra. MATLAB and/or C++ programming knowledge will be useful.

Supervisors: [mailto:r.saatchi@shu.ac.uk Reza Saatchi] and [mailto:a.n.selvan@shu.ac.uk Arul Selvan]

==Medical Ultrasound Training Simulator==
[http://www.sensegraphics.com/index.php?page=shop.product_details&flypage=shop.flypage_sensegraphics&product_id=20&category_id=7&manufacturer_id=0&option=com_virtuemart&Itemid=83 Using augmented virtual reality and a haptic force-feedback system] to simulate the ultrasound medical examination of a patient. An M.Sc. student has previously developed most of [http://vision.eng.shu.ac.uk/mmvlwiki/index.php/Medical_Image_Processing the simulator]. The enhancement to be made is to acquire a new data set (medical ultrasound images) and incorporate it into the existing project.

Skills that will be developed: Python programming. Some prior programming experience is required.

Supervisors: [mailto:r.saatchi@shu.ac.uk Reza Saatchi] and [mailto:a.n.selvan@shu.ac.uk Arul Selvan]

=Industrial robotics & automation projects=

==Precise Food Decoration Using a Robot-Controlled System==
The project requires the design and construction of a mechanical rig, attached to a robot platform and mounting a laser-optical displacement sensor. The aim is to use the sensor to keep the robot end-effector at a constant distance from a non-flat surface. The targeted application is in the food robotics industry, e.g. automatic decoration of confectionery (cakes!).

Supervisor: [mailto:f.caparrelli@REMOVETHISshu.ac.uk Fabio Caparrelli]
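The constant stand-off distance required in the food-decoration project is, in essence, a feedback loop closed around the displacement sensor. The following toy proportional controller shows the idea (plain Python; the surface shape, gain and speeds are invented for illustration, and a real system would use the robot controller's own API):

```python
import math

TARGET = 10.0              # desired stand-off distance in mm
KP = 0.8                   # proportional gain (illustrative)

def surface_height(x):
    """A hypothetical non-flat cake surface: height in mm at position x."""
    return 20.0 + 5.0 * math.sin(0.5 * x)

z = 40.0                   # initial end-effector height in mm
errors = []
for step in range(200):
    x = 0.05 * step                     # nozzle travels along the surface
    reading = z - surface_height(x)     # laser-optical displacement reading
    error = reading - TARGET
    z -= KP * error                     # move to cancel the error
    errors.append(abs(error))

print(max(errors[-50:]))                # residual tracking error once settled
```

The residual error comes from the surface changing under the moving nozzle between control updates; a faster update rate or a feed-forward term would shrink it further.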
  
 
=See Also=
* [[Mimas]]
* [[MMVL]]
* [[Hornetseye]]
* [[British Life and Culture Module]]
* [[:Category:Projects|Projects]]
  
 
=External Links=
* [http://vision.eng.shu.ac.uk/bala/msc/MSc-ProjectMarkingSheet.doc M.Sc. thesis marking sheet]
* [http://vision.eng.shu.ac.uk/bala/msc/ M.Sc. theses from previous years]
* [http://www.shu.ac.uk/research/meri/postgrad-research/ MERI PhD adverts]
* [https://wiki.ubuntu.com/Training (K)Ubuntu Student Guide]
* [http://www.ubuntupocketguide.com/ (K)Ubuntu Pocket Guide]
* [[Image:GoogleCodeSearch.gif|60px]] [http://codesearch.google.com/ Google code search]
  
 
[[Category:Projects]]

Latest revision as of 12:56, 12 August 2011
