
Learning Preposition Priors to Generate Scene from Text Using Contact Constraints
Author(s) - S Yashaswini, S S Shylaja
Publication year - 2021
Publication title - European Journal of Electrical Engineering and Computer Science
Language(s) - English
Resource type - Journals
ISSN - 2736-5751
DOI - 10.24018/ejece.2021.5.4.342
Subject(s) - computer science , rendering (computer graphics) , artificial intelligence , sentence , natural language processing , prior probability , object (grammar) , context , computer vision , bayesian probability
In this paper, we propose a method for generating a 3D scene from text in the interior-design domain, taking into account the orientation of every object present in the scene. Thousands of interior-design-related sentences are generated using an RNN so that context is preserved across sentences. The BiLSTM-RNN-WE method is used for POS tagging, and Blender is used to generate the 3D scene from the query. The paper focuses on interior design and places objects according to the prepositions in the sentence. Our approach uses natural language processing to extract useful information from the user's text, which aids the rendering engine in generating a better scene.
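As a rough illustration of the POS-tagging step, the sketch below trains a small BiLSTM tagger over word embeddings and then reads off the prepositions of a query sentence. The toy corpus, tag set, vocabulary handling, and hyper-parameters are illustrative assumptions, not the authors' actual BiLSTM-RNN-WE configuration or training data.

```python
# Minimal BiLSTM POS-tagger sketch (toy data, illustrative hyper-parameters).
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, Bidirectional, LSTM, TimeDistributed, Dense

# Tiny tagged corpus; "IN" marks the prepositions that later drive placement.
corpus = [
    (["place", "the", "lamp", "on", "the", "table"],
     ["VB", "DT", "NN", "IN", "DT", "NN"]),
    (["the", "chair", "is", "beside", "the", "desk"],
     ["DT", "NN", "VB", "IN", "DT", "NN"]),
]

words = sorted({w for s, _ in corpus for w in s})
tags = sorted({t for _, ts in corpus for t in ts})
word2id = {w: i + 1 for i, w in enumerate(words)}   # index 0 reserved for padding
tag2id = {t: i for i, t in enumerate(tags)}

max_len = max(len(s) for s, _ in corpus)
X = np.zeros((len(corpus), max_len), dtype="int32")
y = np.zeros((len(corpus), max_len), dtype="int32")
for i, (s, ts) in enumerate(corpus):
    X[i, :len(s)] = [word2id[w] for w in s]
    y[i, :len(ts)] = [tag2id[t] for t in ts]

# Word embeddings -> bidirectional LSTM -> per-token softmax over tags.
model = Sequential([
    Embedding(input_dim=len(word2id) + 1, output_dim=32, mask_zero=True),
    Bidirectional(LSTM(64, return_sequences=True)),
    TimeDistributed(Dense(len(tag2id), activation="softmax")),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=50, verbose=0)

# Tag a new query and keep the words predicted as prepositions.
query = ["place", "the", "lamp", "on", "the", "desk"]
x = np.zeros((1, max_len), dtype="int32")
x[0, :len(query)] = [word2id.get(w, 0) for w in query]
pred = model.predict(x, verbose=0)[0].argmax(axis=-1)
id2tag = {i: t for t, i in tag2id.items()}
print([(w, id2tag[p]) for w, p in zip(query, pred[:len(query)])])
```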
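The extracted (object, preposition, support) triple can then be mapped to a contact-constrained placement in Blender. The sketch below handles only the "on" relation, interpreted as "resting on top with surfaces in contact"; the object names, the centre-origin assumption, and the offset logic are illustrative assumptions rather than the paper's exact placement rules, and the script is meant to run inside Blender's bundled Python interpreter.

```python
# Illustrative "on" placement in Blender via the bpy API.
import bpy

def place_on(child_name: str, support_name: str) -> None:
    """Place `child` so its base rests on the top surface of `support`."""
    child = bpy.data.objects[child_name]
    support = bpy.data.objects[support_name]

    # Top of the support and half-height of the child, from their dimensions
    # (assumes object origins sit at the geometric centre of each mesh).
    support_top = support.location.z + support.dimensions.z / 2.0
    child_half_height = child.dimensions.z / 2.0

    # Centre the child above the support, then drop it until the surfaces touch.
    child.location.x = support.location.x
    child.location.y = support.location.y
    child.location.z = support_top + child_half_height

# Example: the triple extracted from "place the lamp on the table".
place_on("Lamp", "Table")
```

Other prepositions ("beside", "under", "in front of") would map to analogous offsets along the x and y axes instead of the vertical axis.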