Towards Digital Nature: Bridging the Gap between Turing Machine Objects and Linguistic Objects in LLMMs for Universal Interaction of Object-Oriented Descriptions

Authors: Yoichi Ochiai, Naruya Kondo, Tatsuki Fushimi

10 pages, 6 figures
License: CC BY 4.0

Abstract: In this paper, we propose a novel approach to connecting linguistic objects and classes in Large Language Model Machines (LLMMs), such as GPT-3.5 and GPT-4, with their counterparts in high-level programming languages like Python. Our goal is to promote the development of Digital Nature: a worldview in which digital and physical realities are seamlessly intertwined and can be easily manipulated by computational means. To achieve this, we exploit the inherent abstraction capabilities of LLMMs to build a bridge between human perception of the real world and the computational processes that mimic it. This approach enables ambiguous class definitions and interactions between objects to be realized in programming and ubiquitous-computing scenarios, facilitating seamless interaction between Turing Machine objects and Linguistic Objects and paving the way for universally accessible object-oriented descriptions. We demonstrate a method for automatically transforming real-world objects and their corresponding simulations into language-simulable worlds using LLMMs, thus advancing the digital-twin concept. This process can then be extended to high-level programming languages, making the implementation of these simulations more accessible and practical. In summary, our research introduces an approach that connects linguistic objects in LLMMs with high-level programming languages, allowing real-world simulations to be implemented efficiently. This ultimately contributes to the realization of Digital Nature, where digital and physical worlds are interconnected and objects and simulations can be effortlessly manipulated through computational means.
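The bridge the abstract describes could be sketched as a Python class whose behavior is specified in natural language and resolved by a language model at interaction time, rather than by hand-written logic. The sketch below is illustrative only: `LinguisticObject`, `interact`, and the `stub_llm` backend are hypothetical names introduced here for demonstration, and the LLM call is stubbed so the example is self-contained (a real system would call GPT-3.5/GPT-4).

```python
from dataclasses import dataclass, field
from typing import Callable

def stub_llm(prompt: str) -> str:
    """Stand-in for a real LLM backend (e.g., GPT-3.5/GPT-4).
    A real backend would return free-form text describing the outcome."""
    return f"[simulated response to: {prompt}]"

@dataclass
class LinguisticObject:
    """A Turing Machine object (an ordinary Python instance) whose
    semantics are given by an ambiguous natural-language description,
    with interactions delegated to a language model."""
    name: str
    description: str
    llm: Callable[[str], str] = stub_llm
    state: dict = field(default_factory=dict)

    def interact(self, other: "LinguisticObject", action: str) -> str:
        # Compose the interaction as a natural-language prompt; the
        # model, not hand-coded physics, decides what happens.
        prompt = (
            f"{self.name} ({self.description}) performs '{action}' "
            f"on {other.name} ({other.description}). "
            f"Describe the resulting state of both objects."
        )
        return self.llm(prompt)

# Usage: two ambiguously defined objects interacting, with the outcome
# left to the language model's world knowledge.
vase = LinguisticObject("vase", "a fragile ceramic vase")
ball = LinguisticObject("ball", "a heavy rubber ball")
result = ball.interact(vase, "rolls into")
```

Delegating `interact` to a prompt is what lets class definitions stay ambiguous: the model's commonsense knowledge fills in consequences (a fragile vase may break) that a conventional class hierarchy would have to encode explicitly.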

Submitted to arXiv on 10 Apr. 2023
