Static and dynamic abstract formal models for 3D sensor images

  • Authors:
  • Ioan Jivet; Alin Brindusescu; Ivan Bogdanov

  • Affiliations:
  • Applied Electronics Department, University "Politechnica" Timisoara, Timisoara, Romania (all authors)

  • Venue:
  • WSEAS TRANSACTIONS on SYSTEMS
  • Year:
  • 2008

Abstract

The paper presents a perception-oriented linguistic formal model for depth images produced by 3D sensors. Both static object extraction and the short-term dynamic evolution of the scene are analyzed. The target applications are action subsystems involved in independent environment exploration and learning, whether human or machine. Field-of-view depth images obtained with recently developed CMOS 3D sensors are analyzed for their capacity to provide immediate, action-oriented data. For 3D scene images, a selective segmentation method is proposed in terms of salient objects in the depth image. The proposed model represents the scene depth image by object area and mean center location. An original abstract formal language representation is proposed; extending the context-free grammar with attributes adds structure to the model. It is also shown that the generated language translates depth labeling directly into action planning in the environment. The performance of the proposed abstract representation method is analyzed in terms of estimated computation time and direct semantic relevance for a sample application. For object motion detection and tracking applications, the formal model is extended with attributes for direction and speed. Speed determination from object position drift, based on segment correspondence, is shown to be compatible with the proposed formal model. Further development of the model toward multi-layered representations for more complex application areas is also outlined.
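The sketch below illustrates the kind of representation the abstract describes: depth-labeled segments summarized by area and mean center, with direction and speed estimated from centroid drift between frames. It is a minimal illustration under assumed conventions (NumPy depth arrays, fixed depth intervals, correspondence by depth label), not the authors' implementation; all function names and thresholds are hypothetical.

```python
import numpy as np

def extract_salient_objects(depth, depth_bins, min_area=50):
    """Segment a depth image into depth-labeled objects and summarize
    each by pixel area and mean center (centroid).

    depth       : 2D NumPy array of per-pixel depth values.
    depth_bins  : hypothetical list of (near, far) depth intervals.
    min_area    : segments smaller than this are treated as non-salient.
    """
    objects = []
    for label, (near, far) in enumerate(depth_bins):
        mask = (depth >= near) & (depth < far)
        area = int(mask.sum())
        if area < min_area:          # discard non-salient segments
            continue
        rows, cols = np.nonzero(mask)
        center = (float(rows.mean()), float(cols.mean()))
        objects.append({"label": label, "area": area, "center": center})
    return objects

def estimate_drift(prev_objects, curr_objects, dt):
    """Estimate per-object direction and speed from centroid drift
    between two frames, assuming segments correspond by depth label."""
    prev = {o["label"]: o for o in prev_objects}
    motion = {}
    for obj in curr_objects:
        if obj["label"] not in prev:
            continue
        dy = obj["center"][0] - prev[obj["label"]]["center"][0]
        dx = obj["center"][1] - prev[obj["label"]]["center"][1]
        motion[obj["label"]] = {
            "speed": float(np.hypot(dx, dy)) / dt,     # pixels per second
            "direction": float(np.arctan2(dy, dx)),    # radians in image plane
        }
    return motion
```

In use, `extract_salient_objects` would be called on each incoming frame and `estimate_drift` on consecutive results; a full pipeline in the spirit of the paper would match segments by overlap or appearance rather than the fixed-bin labels assumed here, and would feed the attributed descriptions into the grammar-based action-planning stage.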