Vahid received his BEng in computer engineering and MSc in artificial intelligence from the National University of Iran and Amirkabir University of Technology – Tehran Polytechnic in 2002 and 2005, respectively. After graduation, he worked as a hardware engineer at Hoorpendar Computer Technology Industries for two years, where he designed several electronic measurement and recording devices, such as voltage and current transducers, dam gate controllers and all-in-one data loggers.
He joined Razi University as a lecturer in 2008, a position he held until 2014, before starting his PhD at the University of Bristol. He is currently working towards his PhD in the Visual Information Laboratory at the University of Bristol under the supervision of Professor Majid Mirmehdi and Dr. Dima Damen, supported by a prestigious award from the Alumni and Friends Foundation. His PhD research focuses on remote, vision-based pulmonary function testing. Below are some of the projects he has worked on during his PhD.
His main research areas are computer vision and image processing, biomedical engineering and signal processing. He is a member of the University of Bristol SPHERE project and collaborates with its video monitoring team.
Remote Pulmonary Function Testing Using a Depth Sensor
Pulmonary Function Testing (PFT) is a group of clinical tests that evaluate human respiratory status. We propose a remote, non-invasive approach to PFT using a time-of-flight depth sensor, and validate our results against clinical-standard spirometry. Given point clouds, we approximate 3D models of the subject’s chest, estimate the chest volume throughout a sequence, and construct volume-time and flow-time curves for two prevalent spirometry tests: Forced Vital Capacity (FVC) and Slow Vital Capacity (SVC). From these curves, we compute clinical measures such as FVC, FEV1, VC and IC. We compare the automatically extracted measures with clinical spirometry on 40 patients in an outpatient hospital setting, and the results demonstrate high within-test correlations.
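The volume-from-depth idea can be sketched in a few lines. This is a simplified illustration, not the actual pipeline: the real method fits a 3D model of the chest from point clouds, whereas here the volume is approximated by integrating per-pixel depth offsets from a fixed reference plane, and FVC/FEV1 are read off a synthetic volume-time curve. The function names `chest_volume` and `fvc_and_fev1` are hypothetical.

```python
import numpy as np

def chest_volume(depth_map, chest_mask, ref_depth_m, pixel_area_m2):
    """Approximate chest volume (litres) above a fixed reference plane.

    Hypothetical simplification of the model-based estimate: integrate the
    depth offset of each chest pixel from a reference plane behind the chest.
    """
    offsets = np.clip(ref_depth_m - depth_map, 0.0, None) * chest_mask
    return offsets.sum() * pixel_area_m2 * 1000.0  # m^3 -> litres

def fvc_and_fev1(volume_l, fps):
    """Derive FVC and FEV1 from the volume-time curve of a forced exhalation.

    Assumes the curve starts at full inspiration and decays towards the
    residual level, as in an idealised FVC manoeuvre.
    """
    v0 = volume_l.max()
    fvc = v0 - volume_l.min()                  # total exhaled volume
    start = int(np.argmax(volume_l))           # onset of forced exhalation
    one_sec = min(start + int(fps), len(volume_l) - 1)
    fev1 = v0 - volume_l[one_sec]              # volume expired in 1 second
    return fvc, fev1
```

For example, a 5 cm average chest protrusion over a 10×10-pixel region with 1 cm² pixels yields 0.5 litres; in practice the reference plane and pixel footprint come from the sensor's calibrated intrinsics.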
This work won the first poster prize at the British Machine Vision Summer School 2015.
Remote, Depth-based Lung Function Assessment
We propose a remote, non-invasive approach to lung function assessment using a single depth sensor. Chest volume-time data, obtained by temporally modelling the chest volume variation throughout a sequence, is automatically analysed and multiple keypoints are computed. To reduce the effects of a subject’s torso motion during the test, the tidal volume and main effort segments of the volume-time data are calibrated separately using scaling factors learnt during a training phase. Seven FVC measures (FVC, FEV1, PEF, FEF25%, FEF50%, FEF75%, and FEF25-75%) and four SVC measures (VC, IC, TV, and ERV) are computed and then validated against measures obtained from a spirometer. Evaluation results on a dataset of 85 patients (529 sequences in total), attending a respiratory outpatient service for spirometry, show high correlation for intra-test and intra-subject FVC and SVC measures between the proposed method and the spirometer.
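The per-segment calibration step can be sketched as follows. This is a hedged illustration only: the function name `calibrate_volume_curve` is hypothetical, the scaling factors stand in for values learnt during the training phase, and scaling about the tidal-breathing baseline is an assumed formulation.

```python
import numpy as np

def calibrate_volume_curve(volume, tidal_slice, effort_slice,
                           k_tidal, k_effort):
    """Scale tidal-breathing and main-effort segments separately.

    k_tidal and k_effort play the role of scaling factors learnt during a
    training phase (hypothetical values here); using separate factors for
    the two segments reduces the impact of torso motion on each part of
    the test. Segments are scaled about the mean tidal-breathing level so
    the curve stays continuous around the baseline.
    """
    calibrated = volume.astype(float).copy()
    baseline = volume[tidal_slice].mean()
    calibrated[tidal_slice] = baseline + k_tidal * (volume[tidal_slice] - baseline)
    calibrated[effort_slice] = baseline + k_effort * (volume[effort_slice] - baseline)
    return calibrated
```

Measures such as TV (mean tidal excursion) and VC (peak-to-trough of the main effort) would then be read from the calibrated curve rather than the raw one.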
3D Data Acquisition and Registration using Two Opposing Kinects
We present an automatic, open source data acquisition and calibration approach using two opposing RGBD sensors (Kinect V2) and demonstrate its efficacy for dynamic object reconstruction in the context of monitoring for remote lung function assessment. First, the relative pose of the two RGBD sensors is estimated through a calibration stage and rigid transformation parameters are computed. These are then used to align and register point clouds obtained from the sensors at frame level. We validated the proposed system by performing experiments on known-size box objects with the results demonstrating accurate measurements. We also report on dynamic object reconstruction by way of human subjects undergoing respiratory functional assessment.
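Once the calibration stage has produced the rigid transformation parameters, frame-level registration reduces to mapping one sensor's points into the other's coordinate frame and concatenating the clouds. The sketch below assumes calibration has already yielded a rotation `R` and translation `t`; the function names are hypothetical and this omits the refinement present in a full pipeline.

```python
import numpy as np

def apply_rigid_transform(points, R, t):
    """Map Nx3 points from the second sensor's frame into the first's."""
    return points @ R.T + t

def register_two_views(cloud_front, cloud_back, R, t):
    """Merge point clouds from two opposing RGBD sensors -- a sketch.

    R (3x3 rotation) and t (3-vector translation) are the rigid
    transformation parameters assumed to come from the calibration stage.
    The back sensor's cloud is aligned into the front sensor's frame,
    then the two clouds are concatenated.
    """
    aligned_back = apply_rigid_transform(cloud_back, R, t)
    return np.vstack([cloud_front, aligned_back])
```

For two sensors facing each other, R is approximately a 180° rotation and t spans the gap between them, so a point both sensors observe should land at (nearly) the same coordinates after alignment; residual disagreement on a known-size box is one way to validate the calibration.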
The source code for the single Kinect and the dual Kinect data acquisition (RGB + Depth + Infrared + Skeleton) is available from [BristolVisualPFT] on GitHub.