News 2021 July

03.July.2021

17:41 UTC+2
SOPR #328

Topics
In this issue we publish our conclusions and considerations regarding the following topics:

  • Legal matter [Next Generation]
  • Legal matter [Ownership regulation]
  • Infrastructure
  • Infrastructure [(M⁵UI)]
  • Cybersecurity and cybersafety
  • Cyber sovereignty, cybersecurity, and cybersafety
  • Healthcare System (HS)

    Legal matter [Next Generation]
    Our Society for Ontological Performance and Reproduction (SOPR) has concluded that the latest major revision of the Articles of Association (AoA) and the Terms of Services (ToS) with the License Model (LM) of our SOPR, also designated as SOPR 2.0, was not a major revision at all, but rather a review, because only the LM was revised, but not the matter discussed

  • under the terms vertical and horizontal markets or economy sectors of technologies (e.g. systems, platforms, and also Service-Oriented technologies (SOx), and as a Service (aaS) capability models and operational models and platforms (aaSx)), goods (e.g. applications, devices, and vehicles), and services, and
  • in relation to the exclusive infrastructures with their subsystems and platforms of our SOPR (hereafter also designated as exclusives).

    As we already mentioned in the recent past, in this regard the differentiation between vertical and horizontal

  • markets or economy sectors of technologies, goods, and services,
  • infrastructures, and
  • integrations of supply chains

    is useful.

    In general, there are vertical and horizontal parts provided by

  • the companies, which are custom and proprietary technologies, goods, and services based on our OS, including
    • horizontal parts for the end users or consumers, and other customers and the end users related to one field or market sector, and
    • vertical parts for our SOPR and our other Societies across fields or market sectors,

    and

  • our SOPR, which are customary and common technologies, goods, and services that manage and orchestrate, operate, and align and complement the custom and proprietary items.

    Alternatively, one could say that the custom and proprietary technologies, goods, and services are

  • based on the basic properties and core component features of our OS, and
  • woven into the customary and common fabric of our SOPR,

    like for example in the cases of (sub)architectures and (sub)systems.

    The following regulations are already included in the AoA and the ToS of our SOPR and are effective:

  • If a part of our OS is common to all members and licensees of our SOPR, then said part belongs to the exclusives of our SOPR and is managed and operated by our SOPR
    • together with the main contractors, suppliers, and service providers of our SOPR or
    • alone.
  • If a part of our OS is not performed and reproduced by at least two independent members and licensees in individual, i.e. custom and proprietary, ways, then said part is provided by our SOPR and is managed and operated by our SOPR
    • together with the main contractors, suppliers, and service providers of our SOPR or
    • alone.

    Obviously, a technology, good, and service, which

  • is common to all members and licensees of our SOPR is also a vertical technology, good, and service, so that no additional regulation or revision of a regulation is required at all, and
  • is performed and reproduced by at least two independent members and licensees in the same way is also not performed and reproduced in individual, i.e. custom and proprietary, ways.
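The two regulations above can be read as a simple classification rule. As a purely illustrative sketch (the function and its parameter names are our own hypothetical choices, not part of the AoA or ToS), the decision could be expressed as:

```python
# Hypothetical sketch of the two exclusivity regulations; names are
# illustrative only and do not appear in the AoA or ToS.

def is_exclusive(part_users: set[str], all_members: set[str],
                 custom_implementations: int) -> bool:
    """Return True if a part of the OS falls to the exclusives of the SOPR.

    Rule 1: a part common to all members and licensees is exclusive.
    Rule 2: a part not performed and reproduced by at least two independent
            members in custom and proprietary ways is provided by the SOPR.
    """
    common_to_all = part_users == all_members
    independently_reproduced = custom_implementations >= 2
    return common_to_all or not independently_reproduced

members = {"A", "B", "C"}
print(is_exclusive({"A", "B", "C"}, members, custom_implementations=0))  # True
print(is_exclusive({"A"}, members, custom_implementations=2))            # False
```

The two rules overlap by design: a part common to all members cannot, at the same time, have two independent custom reproductions.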

    In particular, infrastructures are always vertical and infrastructures with their related vertical subsystems and platforms, and vertical SOx and aaSx, which are customary and common to all members and licensees of our SOPR, belong to the exclusive infrastructures of our SOPR (see the issue SOPR #327 of the 7th of June 2021).

    Please note that

  • operating system (os) and Operating system aaSx (OpsaaSx),
  • Network aaSx (NaaSx), and
  • Connectivity Management aaSx (CMaaSx),

    and also the

  • SoftBionics (SB) and SoftBionics aaSx (SBaaSx),
  • Smart Contract aaSx (SCaaSx), and
  • Blockchain aaSx (BaaSx or BlaaSx)

    are all already affected.

    Legal matter [Ownership regulation]
    In relation to the cyber sovereignty and the cybersecurity law of the P.R.China, our SOPR has concluded that if we are not allowed to hold certain computing and networking infrastructure, specifically certain parts of the exclusive infrastructures of our SOPR, due to the cybersecurity law of the P.R.China, then the 51% + 49% regulation applies, which means that we hold

  • 51% of the voting shares of an affected company outside the mainland of the P.R.China, and in addition
  • 49% of the voting shares of an affected company inside the mainland of the P.R.China. In other words, we will demand to hold 49% from related Chinese companies, subsidiaries, or business units, or they will get no license to perform and reproduce certain parts of our OS in the P.R.China.

    This means that 49% of those parts of companies, like for example Alibaba and Tencent, which are affected by the cybersecurity law of the P.R.China and the 51% + 49% regulation of our SOPR, would have to be sold to our corporation basically at cost price, because we do not pay for our rights and properties infringed by other entities. :)
    We love 100% Chinese win-win policy. :)

    Infrastructure
    In recent days, we noted a certain convergence and centralization in relation to the

  • Ontologic Core (OC or OntoCore) component,
  • Operating system (os or Ops) as a Service (aaS) capability and operational models (OpsaaSx),
  • Ontologic Geographic Information System (OntoGIS)
    • OntoMap, and
    • OntoGlobe or OntoEarth,
  • Geographic Information System (GIS) as a Service (aaS) capability and operational models (GISaaSx), and
  • SoftBionics (SB) Service-Oriented technologies (SBSOx) and SB as a Service (aaS) capability and operational models (SBaaSx), and also
  • Communication and Collaboration System (Co²S), and
  • Hyper Connectivity System (HCS)

    of the infrastructures of our SOPR.

    What we have been considering for some months already is to make the OpsSOx and OpsaaSx some kind of a Windows One, or rather an OC One or common Computing and Networking System (CNS), which provides the common OC and its basic variants, for example for the

  • infrastructure facilities,
  • access places,
  • mobile and stationary access devices,
  • vehicles,
  • fields of
    • Cyber-Physical System (CPS), Ubiquitous Computing (UC or UbiC), Internet of Things (IoT), and Networked Embedded System (NES),
    • Autonomous System (AS) and Robotic System (RS),
    • and so on.

    This move

  • would also increase the cybersecurity and cybersafety, because some operating systems (oss) are as old as the old and outdated Internet and the old and outdated World Wide Web (WWW), or even older (see also the section Cybersecurity and cybersafety),
  • would let us start with a blank sheet or blank cell in a blank development environment or blank space, so to say,
  • would retain the best functionalities and practices, and
  • would also include the transition from illegal Free and Open Source Software (FOSS) to legal Publicly Validated and Verified Open Software (PVVOS).

    For sure, such a decision

  • would not mean the end of horizontal technologies, goods, and services, but
  • would require rebalancing the awarding of main contracts in relation to the infrastructures of our Society for Ontological Performance and Reproduction (SOPR) accordingly.

    For example, the

  • Google Android could become more responsible for GIS and HCS, and Microsoft less responsible for these systems,
  • Apple os could become more responsible for end user devices respectively access places (e.g. terminals) and access devices, and other companies less responsible for these means of access,
  • and so on.

    Infrastructure [(M⁵UI)]
    Our SOPR was crystal clear in this regard: Members and licensees of our SOPR are not allowed to use

  • in particular the wake word "Computer" for a smartspeaker, smartdisplay, virtual assistant, and any other technology, good, and service in the field of computing and networking and based on Natural Language Processing (NLP), and
  • in general any other wake word that is identical to the common designation(s) of an object with voice-based functionalities (for example "Car", "House", "Home", "Watch", "Phone", etc.).

    These wake words are exclusive for the related technologies, goods, and services of our Society for Ontological Performance and Reproduction (SOPR), Ontonics, and other business units of our corporation for the benefit of all members and licensees.
    Allowed wake words for members and licensees are fantasy names, brand names, and so on.

    The same holds also for the other modalities of our Multidimensional Multilingual Multiparadigmatic Multimodal Multimedia User Interface (M⁵UI) (e.g. Visual Language Processing (VLP)).
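As a purely hypothetical illustration of the wake word regulation above (the list of reserved designations and the helper name are assumptions for the sketch, not an official registry of our SOPR), a compliance check could look like this:

```python
# Hypothetical illustration of the wake word rule; the disallowed set
# below only samples the examples given in the text.

DISALLOWED_WAKE_WORDS = {"computer", "car", "house", "home", "watch", "phone"}

def wake_word_allowed(candidate: str) -> bool:
    """A wake word is allowed only if it is not a common designation
    of an object with voice-based functionalities."""
    return candidate.strip().lower() not in DISALLOWED_WAKE_WORDS

print(wake_word_allowed("Computer"))  # False: reserved common designation
print(wake_word_allowed("Alexa"))     # True: fantasy or brand name
```

A real check would of course have to cover all languages and all common designations, not a fixed set.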

    Cybersecurity and cybersafety
    We have said this before as well: C.S. has also created the Ontologic System (OS) as a solution for the problems with HardWare (HW) and SoftWare (SW), which are not secure and safe.

    And as in the cases of the fields of

  • HardBionics (HB) and SoftBionics (SB) (e.g. Artificial Intelligence (AI), Machine Learning (ML), Computer Vision (CV), Simultaneous Localization And Mapping (SLAM), Natural Language Processing (NLP), Cognitive Agent System (CAS), Multi-Agent System (MAS), Evolutionary Computing (EC), Swarm Intelligence (SI) or Swarm Computing (SC), etc.), and
  • Cyber-Physical System (CPS), Ubiquitous Computing (UC or UbiC), Internet of Things (IoT), and Networked Embedded System (NES),

    and also the

  • Superstructure,
  • further environmental reconstruction and climate change reversal,
  • etc., etc., etc.,

    our OS being kept under the power of control and management of our SOPR is the solution.
    Other entities are just not able to provide the required quantities and qualities, because they never wanted to provide them, they do not want to provide them, and they will never want to provide them.

    Our OS has all basic properties and features, subsystems and platforms, and applications and services, as well as devices.
    Our SOPR has all rights and properties, as well as societal, political, legal, technological, and economical rules, requirements, and environments.
    All are required to establish freedom, law and order, and rule-based order.

    As we also have said in relation to Traffic Management Systems (TMSs), smart city, smart car, smart this and smart that, and so on, the fact is that the

  • requirements of the overall system and its dedicated subsystems, and even of such specialized subsystems, are so complex and demanding,
  • system architectures of ours are so complex, elaborated, and matured, and
  • other foundations and implementations required for the realization of such systems have become so complex, developed, and matured in the last 4 decades, as well as
  • societal, political, legal, technological, and economical reasons are so clear and governing,

    that only our SOPR together with its main contractors, suppliers, and service providers is able to put this technology with its goods and services into reality and practice.

    And when we say once again that the illegal cryptocurrencies Bitcoin and its derivatives, the illegal platform Ethereum, and so on, which are based on those illegal digital and virtual ledger technologies, which again are based on for example the smart contract protocol and the blockchain technique, and illegal digital and virtual currencies have to be prohibited, then this is not only due to the infringements of our copyright, but also to prevent that our solutions against cyberattacks and so on are weakened or even broken and made ineffective merely for the play money and fun of all those technoclowns and opportunists.
    The same holds for the Ontologic Applications and Ontologic Services (OAOS), which belong to the subsystems and platforms of the exclusive infrastructures of our SOPR. This is nothing a service provider, including the related cloud space service business units of other companies, can handle, is handling, and will handle.
    The same holds for the Service-Oriented technologies (SOx) and as a Service (aaS) capability models and operational models (aaSx), including vertical aspects of

  • Infrastructure aaSx (IaaSx),
  • Integration Infrastructure aaSx (IIaaSx),
  • Platform aaSx (PaaSx), and
  • Integration Platform aaSx (IPaaSx),

    including

  • Operating system aaSx (OpsaaSx),
  • Network aaSx (NaaSx), and
  • Connectivity Management aaSx (CMaaSx),

    and

  • SoftBionics aaSx (SBaaSx),
  • Trust aaSx (TaaSx),
  • Smart Contract aaSx (SCaaSx),
  • Blockchain aaSx (BaaSx or BlaaSx), and
  • IDentity aaSx (IDaaSx)

    of our SOPR, which were chosen not only due to the protection of our rights and properties, but also due to the establishment of a safe and secure foundation for our Ontoverse respectively the transformation of the societies (see also the section Infrastructure above).
    All are required to establish freedom, law and order, and rule-based order.

    The world cannot just follow us and always take what it sees. There are certain hurdles, thresholds, and limits, which demand more than many persons, more than much money, and more than many intergovernmental agreements and contracts.

    Cyber sovereignty, cybersecurity, and cybersafety
    In relation to cyber sovereignty, cybersecurity, and cybersafety, our SOPR has recalled that the IDentity spaces of the management structure of our Ontoverse also comprise real spaces due to the

  • foundational Caliber/Calibre of our New Reality (NR) respectively our Ontologic System (OS) with its Ontoverse (Ov), and
  • fields of Cyber-Physical System (CPS), Internet of Things (IoT), and Networked Embedded System (NES)

    and not only due to cyber sovereignty, cybersecurity, and cybersafety.

    This can be viewed with the Ontologic eXchange (OntoX or OX) facilities of our common backbone, core network, or fabric (not to be confused with the Ontologic Exchange (OEx, OntoEx, or OntoExchange) of our Ontologic Financial System (OFinS)), which will be erected and connected with many wires or cables of others and us following a specific strategy and plan, so that bad actors have just no access at all. :)
    We also added new measures to stop ransomware, which are quite effective. :)

    Healthcare System (HS)
    The transition to our Telematics Infrastructure of the Next Generation (TING) of the Health System (HS) or Healthcare 4.0 System of our Society for Ontological Performance and Reproduction (SOPR) already began a year or more ago.

    Because the electronic Health Record (eHR) or electronic Patient Record (ePR) is based on a type of Ontoscope and intelliTablet (iTablet), also wrongly called smartphone, tablet computer, and touch-based Personal Computer, the AoA and the ToS of our SOPR already apply.


    09.July.2021

    Comment of the Day

    "New Mobility with New Energy and New Car", [C.S., Today]


    11.July.2021

    15:23, 16:17, 16:20, 21:59, and 22:57 UTC+2
    Investigations::Multimedia

    *** Work in progress - better explanation, some facts (e.g. CVE, including CARE and CVRE, including VOS and CoVE) and links missing ***

  • Epic Games: It was a situation once again where we could not see the forest for the trees.
    For some years, we have been observing that the companies Epic Games and Tencent are copying essential parts of our original and unique, copyrighted work of art titled Ontologic System and created by C.S., specifically our Ontoverse. But today we found out that the scope of their serious fraudulent activities is much broader.

    We quote a report, which is about Epic Games' videogame and Virtual Environment (VE) called Fortnite and its live coding, gameplay, and storytelling features, and was published on the 18th of March 2021: "Fortnite's experimental story is an attempt to create 'the entertainment experience of the future'
    [...]
    But within this multiplayer [video game], Fortnite has steadily turned into one of the biggest and boldest storytelling experiments in history. From live events experienced by millions to cinematics [...], the team at Epic has continuously tested new ideas, injecting narrative elements in a way that feels natural and meaningful. Fortnite is among the biggest games in the world and arguably the most culturally impactful. [...]
    "For me, it was an opportunity to almost create a new medium," [the chief creative officer chief plagiarist] tells [a magazine and fake news provider].
    [...]
    The idea of telling a story in Fortnite isn't new. [The chief plagiarist] says it was part of the plan from the beginning, ever since Fortnite's battle royale mode first launched in September 2017. And it wasn't just an afterthought - it was a key element of the experience. "Our approach since the start, or our goal, has been how do we create truly mass-scale, broad-based entertainment. And I always think that the way to do that is through narrative conceit," [the chief plagiarist] explains. "It might not necessarily be story in the traditional character-driven, three-act structure. But the conceits, and the why of what's going on and what's happening, are critical for people to be emotionally attached to an entertainment experience. That's our guiding principle and philosophy," he says. "Fortnite has a story because all great entertainment has a good story."
    [...]
    [...] There isn't an actual protagonist - or at least, not an obvious one. [...]
    [...]
    In Fortnite, that main character isn't Peely or Agent Jones or any of the other now-iconic faces from the game. Instead, it's the island itself. "The world of Fortnite is the main character," says [the chief plagiarist]. [...]
    [...]
    One of the most powerful narrative tools used in Fortnite is environmental storytelling. It's a game of big events [...] and each leaves a permanent impression on the island. [...] This island is a place where players come together to hang out and shoot at each other, but it's also a virtual world brimming with its own strange history.
    [The chief plagiarist] says the game does have a plot with a beginning and end, and that he knows exactly where everything's headed. But because of the nature of Fortnite, the path to those big story beats can change dramatically depending on what players do or say. [...]
    "We're trying as much as possible to tell you a story that feels like it's being told in a moment," [the chief plagiarist] explains. [...] The story feels like it's being shaped in the moment, and it's being shaped by the way you're reacting to the story as it's being told to you. [...] It's a little more improv."
    [...] "That was us changing the story, or changing the experience, based on how we saw everyone reacting," [the chief plagiarist] says. "And we do that a lot, as much as we can."
    More recently, starting with the Marvel-themed season last year, Fortnite's narrative has also increasingly involved other entertainment properties. In season 5, a mysterious phenomenon called the Zero Point created rifts in reality where characters from different fictional universes could come together on the Fortnite island. It's a great way to sell skins for everything from [Star Wars] The Mandalorian to Terminator to God of War to The Walking Dead. But [the chief plagiarist] says that it's also a critical part of the storyline, nodding to something he and others at Epic have eagerly talked about creating - the Metaverse [our original and unique multiverse and metaverse Ontoverse], comprised of characters and storylines from countless films, shows, and games all in one place.
    "I knew, ultimately, that a big part of the story we would tell is these overlapping realities," he explains. "It's about the Zero Point and what that is and why that is and how it tethers reality together. I knew the only way to do that right was to somehow convince all these other people to come play with us, to come play in our fictional universe." As for how those licensed tie-ins come about? "It's pretty much me going to [...] all these amazing creators, and saying: 'Here's the story of what we're trying to tell in Fortnite, the vision of what it is. Do you want to be part of it with us?' And a lot of people have said yes."
    Between live events, single-player missions, environmental storytelling, and more traditional techniques like cinematics, audio logs, and nonplayer characters you can chat with, Epic is trying all kinds of ways to get this story across. There's even an upcoming Batman comic series that will provide even more detail on the island. [...]
    "Our lofty goal is to create the entertainment experience of the future. I think some of that is feeling our way into what feels like it's going to be a new medium, where it's this blended entertainment experience that has interactive elements. It has linear elements to it. It has things that would look more like a concert," [the chief plagiarist] explains. "We've discovered some of the ways to do that in a cool way, but I think there's a lot more for us to discover and experiment with and try. I remain committed to just trying crazy stuff."

    Comment
    First of all, we recall once again some thoughts related to the definitions of terms, which we use in relation to our Ontologic System (OS):

  • Caliber/Calibre is one of the physical, digital, metaphysical, cybernetical, and philosophical ideas, concepts, and foundations of our Ontologic System.
  • New Reality (NR) is the fusion of
    • all kinds of reality, including
      • eXtended Mixed Reality (XMR) or simply eXtended Reality (XR) of us, including
        • (True or Real) Reality ((R)R),
        • Mediated Reality (MR or MedR), and
        • Mixed Reality (MR), including
          • Augmented Reality (AR) and
          • Augmented Virtuality (AV),
        • Virtual Reality (VR),
      • Simulated Reality (SR or SimR), and
      • Synthetic Reality (SR or SynR),

      and their

    • (information) spaces, environments, worlds, and universes, which are
      • real and physical,
      • virtual and digital (not virtual and cybernetical, because cybernetical is also real and physical),
      • cybernetical and cyber-physical (not digital and cyber-physical, because digital (virtual representation of real thing, event, and place) and cyber-physical (real and virtual respectively physical and digital representation of thing, event, and place) are not the same),
      • ontological and metaphysical, and also
      • all other (information) spaces, environments, worlds, and universes.
  • Ontoverse (Ov) is the manifestation of the NR with the fusion of all (information) spaces, environments, worlds, and universes to the New Reality Environment (NRE).
  • Ontologic Net (ON), Ontologic Web (OW), and Ontologic uniVerse (OV) are terms used for better explanation, discussion, and understanding, specifically in relation to the fields of
    • Ontologic Computing and Networking (OCN), also called Space and Time Computing and Networking (STCN),
    • Ontoverse, and
    • Cyber-Physical System (CPS), Ubiquitous Computing (UbiC) and Internet of Things (IoT), and Networked Embedded System (NES).

  • Ontologic Net (ON) with its Ontologic Net of Things (ONoT) is the successor of the Internet, and the Internet of Things (IoT), and both are based on
    • Resilient Distributed System (RDS) (e.g. fault-tolerant and trustworthy (e.g. reliable)), including
      • smart contract transaction protocol,
      • blockchain technique,
      • Byzantine Fault Tolerance (BFT) protocol, and
      • Byzantine-Resilient Replication (BRR) method,
    • High Performance and High Productivity Computing (HP²C) (e.g. Wide Area Network (WAN) supercomputer or Interconnected supercomputer (Intersup)),
    • SoftBionics (SB) (e.g. Artificial Intelligence (AI), Machine Learning (ML), Computational Intelligence (CI), Artificial Neural Network (ANN), Evolutionary Computing (EC), Computer Vision (CV), Simultaneous Localization And Mapping (SLAM), Soft Computing (SC), Autonomic Computing (AC), Natural Language Processing (NLP), Cognitive Agent System (CAS), Multi-Agent System (MAS), Swarm Intelligence (SI) or Swarm Computing (SC), etc.),
    • Autonomous System (AS) and Robotic System (RS),
    • CHemical Abstract Machine (CHAM),
    • BlackBoard System (BBS) (central space of a Multi-Agent System (MAS)),
    • Space-Based Architecture (SBA),
    • and much more.

  • Ontologic Web (OW) with its Ontologic Web of Things (OWoT) is the successor of the World Wide Web (WWW), and the Web of Things (WoT), and both are based on
    • Resilient Distributed System (RDS) (e.g. fault-tolerant and trustworthy (e.g. reliable)), including
      • digital and virtual ledger technologies,
    • SoftBionics (SB),
    • Semantic (World Wide) Web (SWWW),
    • Reality-Virtuality Environment (RVE) Ontologic Virtual Environment (OntoVE) (e.g. Mediated Reality Environment (MRE or MedRE), Mixed Reality Environment (MRE), Augmented Reality Environment (ARE), Augmented Virtuality Environment (AVE), and Virtual Reality Environment (VRE), and also eXtended Mixed Reality Environment (XMRE) or simply eXtended Reality Environment (XRE), Simulated Reality Environment (SRE or SimRE), and Synthetic Reality Environment (SRE or SynRE)) (should be OntoRVE, but not simply OntoE, which already is the Ontoverse),
    • Massively Multiplayer Online Game (MMOG),
    • Service-Oriented technologies (SOx) and as a Service models (aaSx),
    • and much more.

  • Ontologic uniVerse (OV) includes ON and OW and adds the part of the whole Ontoverse that is not included in ON and OW.

  • OntoNet, OntoWeb, and OntoVerse, and also OntoScope and Ontologic Collaborative Ontologic Virtual Environment (OntoCOVE) are the related Ontologic System Components (OSC).
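The smart contract transaction protocol and the blockchain technique named among the Resilient Distributed System foundations above can be illustrated with a minimal sketch of hash-linked blocks, where tampering with an earlier block invalidates every later link. This is an illustrative toy, not an implementation of any part of the OS:

```python
# Minimal sketch of the blockchain technique: blocks are linked by
# cryptographic hashes of their predecessors, so altering history
# breaks the chain. Illustrative only.
import hashlib
from dataclasses import dataclass

@dataclass
class Block:
    data: str
    prev_hash: str

    @property
    def hash(self) -> str:
        return hashlib.sha256((self.data + self.prev_hash).encode()).hexdigest()

def build_chain(items: list[str]) -> list[Block]:
    chain, prev = [], "0" * 64  # genesis predecessor
    for item in items:
        block = Block(item, prev)
        chain.append(block)
        prev = block.hash
    return chain

def chain_valid(chain: list[Block]) -> bool:
    return all(chain[i].prev_hash == chain[i - 1].hash for i in range(1, len(chain)))

chain = build_chain(["tx1", "tx2", "tx3"])
print(chain_valid(chain))   # True
chain[0].data = "tampered"  # altering an earlier block breaks the links
print(chain_valid(chain))   # False
```

Real systems add consensus (e.g. the Byzantine Fault Tolerance protocol also listed above) on top of this linking, so that independent nodes agree on a single chain.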

    Our OS has also the

  • basic properties of (mostly) being
    • reflective/fractal/holonic and
    • collaborative,

    as well as

  • Ontologic System Components (OSC) and Ontologic Applications and Ontologic Services (OAOS), that are
    • log-based (Ontologic data storage Base (OntoBase), Ontologic File System (OntoFS), and also Ontologic LifeLogging (OntoLogger)),
    • persistent (OntoBase, OntoFS, and also OntoLogger, as well as OAOS),
    • emotional (Ontologic eMotion (OntoMotion)),
    • user affective (OntoMotion and OAOS),
    • user reflective (OAOS),
    • and so on.

    Many years ago, we already investigated the Acceleration Studies Foundation and proved that it had infringed our copyright by copying content related to our Ontoverse from our websites, specifically from the website of our Ontologic System variants OntoLix and OntoLinux, for its illegal roadmap in the year 2007, as one can simply confirm by looking at the related glossary.

    We also have already investigated an online encyclopedia and proven that it has infringed our copyright by copying content related to our Ontoverse from our websites, specifically from the website of our Ontologic System variants OntoLix and OntoLinux, for its illegal webpage about the subject Metaverse, which in fact is about our Ontoverse, as the following quote of that webpage shows: "["Metaverse Definition [] The Metaverse is a complex concept. In recent years, the term has grown beyond Stephenson's 1992 vision of an immersive 3D virtual world [...]. Here is one that seems as good a starting point as any:] ["]The Metaverse is [["]a collective virtual shared space worlds, created by["]] the convergence of [1)] virtually-enhanced physical reality and [2)] physically persistent virtual space["],[1 [Metaverse Roadmap Overview, 2007. Accelerated Studies Foundation]] including the sum of all virtual worlds, augmented reality, and the Internet. The word "metaverse" is made up of the prefix "meta" (meaning beyond) and the stem "verse" (a backformation from "universe"); the term is typically used to describe the concept of a future iteration of the internet, made up of persistent, shared, 3D virtual spaces linked into a perceived virtual universe.[2 [Web Archive of IEEE V[irtual]W[orld] Standard Working Group. [3rd of March 2011 and 8th of June 2014]]]
    [...]

    Development
    [...]
    Since many massively multiplayer online games connecting millions of players share features with the Metaverse but only provide access to non-persistent instances of virtual worlds that are shared only by up to several dozen players, the concept of multiverse virtual worlds has been used to distinguish them from the Metaverse.[8 [A multiverse, not the metaverse[. 25th of February 2020]]]

    [...]
    Conceptually, the Metaverse describes a future internet of persistent, shared, 3D virtual spaces linked into a perceived virtual universe,[2] but common standards, interfaces, and communication protocols between and among virtual environment systems are still in development.

  • [...]"
  • The Metaverse Roadmap,[13 [The Metaverse Roadmap [Overview and Inputs]. Acceleration Studies Foundation[, 2007]]] Acceleration Studies Foundation (2006-2007)"

    Comment
    According to the same online encyclopedia "[p]ersistence is said to be "orthogonal" or "transparent" when it is implemented as an intrinsic property of the execution environment of a program. An orthogonal persistence environment does not require any specific actions by programs running in it to retrieve or save their state.
    [...]
    Orthogonal persistence is widely adopted in operating systems for hibernation and in platform virtualization systems [...] for state saving."
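The contrast described in the quote above can be sketched roughly: with orthogonal persistence, the execution environment saves and restores program state, and the program itself contains no persistence logic. The toy runtime below merely simulates this with pickle; it is an assumption for illustration, not how Grasshopper or any real orthogonally persistent system is built:

```python
# Rough sketch of orthogonal (transparent) persistence: the program
# object has no save/load code; a toy "runtime" checkpoints it.
import pickle

class Counter:
    """An ordinary program object: it knows nothing about persistence."""
    def __init__(self):
        self.value = 0
    def tick(self):
        self.value += 1

class ToyRuntime:
    """Stands in for the environment that persists state transparently."""
    def __init__(self):
        self._snapshot = None
    def checkpoint(self, program):
        self._snapshot = pickle.dumps(program)
    def restore(self):
        return pickle.loads(self._snapshot)

runtime = ToyRuntime()
counter = Counter()
counter.tick(); counter.tick()
runtime.checkpoint(counter)   # the environment saves, not the program

restored = runtime.restore()  # e.g. after a crash or hibernation
print(restored.value)         # 2
```

In a truly orthogonal system even the checkpoint call would be implicit, issued by the environment (as in hibernation or virtual machine state saving), not by application code.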

    Our OS integrates operating systems, which are

  • capability-based operating systems and
  • orthogonal persistent operating systems, such as for example Grasshopper referenced in the section Exotic Operating System of the webpage Links to Software of the website of OntoLinux,

    and therefore our

  • OS is able to provide orthogonal persistence (see the webpage Ontologic Applications of the website of OntoLinux), and
  • Ontoverse is able to provide persistent instances of virtual worlds.

    The other fact is that the original Metaverse
  • is only an immersive 3D Virtual World (VW) or Immersive Virtual Environment (IVE or ImVE) respectively a Virtual Reality Environment (VRE), but
  • is not a Mixed Reality Environment (MRE), and therefore no Augmented Reality Environment (ARE) and Augmented Virtuality Environment (AVE),
  • is not a mirror world,
  • is not a physical metaverse, and
  • is not a physical multiverse, which was referenced in relation to the physical multiverse theory, and is not a virtual multiverse, which was created with our Ontoverse in relation to the original virtual Metaverse, physical metaverse, and physical multiverse.

    Before the unofficial start of our OS at the end of October 2006, only one MMVE as an MMOG existed, which was a DVE based on a sharded server model with multiple servers due to technological reasons, which again was explained to the players with a related gamestory. But only in 2018 was this gamestory connected with a multiverse.

    This gamestory leads us back to the story Epic Games told the public here with the goal to copy our Ontoverse with the included multiverse.

    See also our Zero Ontology or Null Ontology O# to see what it has copied with what is wrongly called Zero Point.

    But original MREs, AREs, and VREs, which are inspired by the original Metaverse of Snow Crash, exist with the Collaborative Virtual Environment (CVE) Virtual Object System (VOS) and other VEs.

    Please note in this relation that the

  • fraudulent Acceleration Studies Foundation has continued to copy content from our websites and added related content in the following years, but has not updated the year of non-copyright,
  • fraudulent author(s) of the webpage about our Ontoverse of an online encyclopedia recited the description of a part of our Ontoverse, which was illegally copied by the fraudulent Acceleration Studies Foundation before, but not what is truly described in the novel Snow Crash,
  • fraudulent author(s) of the webpage about our Ontoverse of an online encyclopedia listed the website of a multiverse organization by giving the date 2006 - 2007, despite the related overview of that plagiarism being published on the same website in the year 2007 with its glossary mentioned above

    to confuse the public about their infringements of the rights and properties (e.g. copyright) of C.S. and our corporation.

    Obviously, "[t]he convergence of 1) virtually enhanced physical reality and 2) physically persistent virtual space. It is a fusion of both, while allowing users to experience it as either" is an original and unique part and expression of idea created and presented with our Ontologic System.
    Note that the successor of the Internet was created with our OS.

  • we do not claim the fields of Mediated Reality (MedR) and Mixed Reality (MR) and the related spectra in the version or interpretation of others, but in our version or interpretation in relation to our MR spectrum called eXtended Mixed Reality (XMR) or simply eXtended Reality (XR), which includes this convergence and fusion, which again is
    • even bidirectional in contrast to mirror worlds due to our Caliber/Calibre and
    • called New Reality (NR) by us,

    and

  • we do not claim the creation of the Metaverse described in the novel titled "Snow Crash", but the creation of our Ontoverse, which is wrongly and illegally called Metaverse since the publication of our OS at the end of October 2006.
  • On the 29th of November 2006, the webpage about the Metaverse of an online encyclopedia was still about the original Metaverse as described in the novel Snow Crash and covered only Virtual Reality (VR), but not Augmented Reality (AR) and Mixed Reality (MR), nor our overall integration by our Ontologic System Architecture (OSA) and OntoScope component.
    The big manipulation happened on the 24th of September 2010, when the original Metaverse was deliberately confused with our Ontoverse and with the reference to material copied from our websites in the year 2007 by the fraudulent Acceleration Studies Foundation.
    Once again, no prior art has been presented in relation to our Ontoverse, also wrongly called Metaverse. If such prior art existed, then we are sure that it would have been referenced on this webpage many years ago already. We have looked at virtually every existing system and environment around the years 2000 to 2006, and there was nothing like our original and unique multiverse and metaverse Ontoverse with our unified VR, AR, and MR environment, system, and platform, or even our Caliber/Calibre and NR, and our further integration with the old Internet, also wrongly called Metaverse.
    The same holds for our multiverse in relation to reality and virtuality, whereby the legal situation has been crystal clear all along.
    We close this chapter now until truly proper and convincing prior art is presented. As long as this does not happen, our SOPR continues.

    In fact, Epic Games has also copied the

  • foundational concept,
  • new medium,
  • live coding environment,
  • gameplay, and
  • storytelling framework

    integrated in our Ontologic System (OS) by its integrating Ontologic System Architecture (OSA), and

  • parts of the history and the marketing story of our corporation.

    For the same reason, the companies Epic and Tencent have not created anything related to our Ontologic System,

  • definitely not our Ontoverse, also wrongly called Metaverse (see the Clarification of the 10th of March 2021) and multiverse,
  • definitely not a new medium and a framework for storytelling (see the links in the section Multimedia of the webpage Links to Hardware of the website of OntoLinux, specifically the projects of the Synthetic Characters Group), and
  • definitely not the entertainment experience of the future,

    and they will not be able to continue with their copyright infringements.

    We also got the confession of one of the responsible persons, the chief plagiarist, that Epic Games planned from the start to steal this part of our Ontoverse by a plagiarism of our original and unique work of art. This explains our impression of the last years, when we noticed that it is not just a massively multiplayer shooter videogame, but a broader virtual environment.

    Every large industrial company in the fields of Information and Communications Technology (ICT), engineering, media, entertainment, and so on knows these facts and therefore will not support any fraud or crime, due to their requirement for legal certainty, which can only be given by our Society for Ontological Performance and Reproduction (SOPR), as is the case with all of our revolutions in the fields of

  • Cyber-Physical System (CPS), Internet of Things (IoT), and Networked Embedded System (NES),
  • Business Process Management (BPM),
  • industry, specifically our Industry 4.0 and 5.0,
  • mobility,
  • etc., etc., etc.

    When reviewing once again these fraudulent activities in relation to our Ontoverse, also wrongly and illegally called Metaverse, we remembered the following facts:

  • An online encyclopedia has published a webpage about the subject Metaverse, which is an infringement of our copyright, because the wrong and illegal description of (a part of) our Ontoverse by the fraudulent Acceleration Studies Foundation is reproduced and referenced (see also the quote above), and the one or more authors of said webpage used the illegal designation Metaverse for our Ontoverse.
  • An online encyclopedia mentioned the term avatar in relation to the novel Snow Crash and its pure Virtual Reality Environment (VRE) called Metaverse.
  • An online encyclopedia mentioned the companies Roblox and Microsoft in relation to the Metaverse, but in the context it is our Ontoverse and used the illegal designation Metaverse.
  • Microsoft used the illegal designation Metaverse for our Ontoverse.
  • Roblox used the illegal designation Metaverse for our Ontoverse.
  • Roblox is imitating the Lego minifigure.
  • Epic Games is talking about storytelling and also multiple entertainment properties (e.g. Star Wars and Super Heroes (DC Universe (Batman) and Marvel) characters), avatars, and crazy things, and connecting them with our Ontoverse by using the wrong and hence illegal designation Metaverse.

    Then it clicked, and we saw the forest for the trees.
    We have already created this foundational concept, medium, and gameplay, including a blended entertainment experience with interactive elements, with the Massively Multiplayer Online Game (MMOG) and Massively Multiplayer Online Role-Playing Game (MMORPG) based on Lego (see also the webpage Virtual Object System (VOS) of the website of OntoLinux and the Investigations::Multimedia of the 18th of July 2008), as well as the feature of our OS that every user gets a Lego minifigure as her, his, or their avatar, which is connected with her, his, or their identity onto# (see Adam Whitney Savage and James Franklin Hyneman (the MythBusters), Kristanna Loken (actor of the T-X or Terminatrix in the movie Terminator 3: Rise of the Machines of the Terminator saga), and our educational software showing some of the Lego Minifigure Series of the Collectible Minifigures (CMF) theme). We have also already created the connection to Marvel and all the other entertainment properties in our Ontoverse, where characters from different fictional universes could come together, as one of our original expressions of idea presented with our OS (see the various Lego themes, like for example the Star Wars, Super Heroes (DC Universe (Batman) and Marvel), and Collectible Minifigures (CMF) themes, and the related Lego Minifigures Series 16 (set number 71013; includes Cyborg and Banana Guy)).
    We have also already integrated live coding or programming and all related environments in our OS (see the section Algorithmic/Generative/Evolutionary/Organic ... Art/Science of the webpage Links to Software of the website of OntoLinux).
    Because Lego also has a foundational concept, which comprises creating, building, playing and gaming, role-playing, storytelling, and so on, all of these single elements have been integrated in our OS, including what Epic Games has copied with its Fortnite.
    In the novels titled Ready Player One and Ready Player Two, which are also based on our OS and hence are subject to compulsory licensing, there is also a certain mix of entertainment properties in the form of pop culture references, due to the part of our Ontoverse that is a virtual reality simulator, even designated as a virtual universe called the Ontologically Anthropocentric Sensory Immersive Simulation (OASIS) and accessible by players.

    A virtual island of a 3D Virtual World (VW) or Virtual Environment (VE) is not a character. What utter nonsense from the marketing unit of Epic Games.
    Howsoever, this point of view reflects our expression of idea to make the OS itself the main subject of our OS, which also covers our Ontologic Collaborative Ontologic Virtual Environment (OntoCOVE) component and therefore the related basic technology of Fortnite and similar virtual environments.

    In general, the way of presentation of a work of art, specifically the kind of medium chosen for a presentation, is irrelevant in relation to an expression of idea and its copyright protection.
    But as all plagiarists in relation to our Ontoverse have confirmed, also wrongly and illegally called Metaverse,

  • our OS with its OSA and Ontoverse is original and unique, and
  • C.S. has even created some kind of the
    • new medium, including live coding Mediated Reality (MedR),
    • entertainment (see for example VOS) experience of the future, and
    • social media (see for example VOS) of the next generation

    as part of our New Reality (NR).
    Therefore, there is absolutely no doubt about the copyright and other rights of C.S. and our corporation.

    Eventually, Epic Games has not created that foundational concept, medium, framework for storytelling, including the concept of a virtual entertainment environment or storyline where characters from different fictional universes could come together, an own expression of idea and work of art, or whatsoever, but merely copied the related essential parts of our Ontoverse for its illegal plagiarism and also the history and marketing story of our corporation, and hence has not increased its rights and properties, as well as market and bargain powers in relation to our OS.
    What is copyrighted by Epic Games is merely the design of its virtual entertainment environment as a virtual island and maybe some software implementing said original and unique parts of our OS, though the latter is more or less worthless.

    By the way:

  • It should be more than obvious, which work is truly the most culturally impactful. Hint: It is not Fortnite.
  • The Articles of Association (AoA) and the Terms of Services (ToS) with the License Model (LM) of our SOPR include regulations to charge all members and licensees under Fair, Reasonable, And Non-Discriminatory, As well as Customary (FRANDAC) terms and conditions, which also prevent a race to the bottom regarding the fees and royalties of the members and licensees, which affect the heights of their overall revenues and correspondingly the heights of our royalties.
    If a member and licensee of our SOPR does not take a sufficiently high and reasonable, as well as customary reward (e.g. price, fee, royalty) for its technology, good, and service based on our original and unique ArtWorks (AWs) and further Intellectual Properties (IPs) included in the oeuvre of C.S., then the SOPR is allowed to use its own data for the estimation of said reward under FRANDAC terms and conditions. For example:
    • If Microsoft provides its Ontologic System variant Windows for free, then our SOPR will demand a royalty from Microsoft, which is based on the estimation of its overall revenue adjusted or increased by a customary revenue generated with the usual sale of Windows (around 70 to 100 U.S. Dollars for the basic version and more for the various professional and special versions).
      But with Operating as a Service (OpsaaS) models (OpsaaSx), we will act in this way only in relation to each new device.
    • If Epic Games takes a fee or cut of only 5% from developers and other entities, who are using its Software as a Service (SaaS) platform (e.g. app store, skin store), for which it has to charge 9%, 19%, or 29% under FRANDAC terms and conditions, then we will demand the rest from Epic Games based on our estimation of its overall revenue adjusted by a customary revenue generated with the usual cut.
      We are so free, because our legal situation is different to the situation of the SaaS of Apple and Alphabet (Google). It will not become cheaper for Epic Games and others.
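    The adjustment described above is simple arithmetic. The following Python sketch is illustrative only (the function name and revenue figures are our assumptions for the example, not fixed SOPR policy); it computes the difference between the cut a licensee actually charged and the cut it should have charged under FRANDAC terms, estimated on the same revenue base:

```python
def royalty_shortfall(platform_revenue: float,
                      actual_cut: float,
                      frandac_cut: float) -> float:
    """Difference between the cut a licensee should have charged
    under FRANDAC terms and the cut it actually charged,
    estimated on the same revenue base. Illustrative sketch only."""
    if actual_cut >= frandac_cut:
        return 0.0  # nothing to demand: the charged cut is sufficient
    return platform_revenue * (frandac_cut - actual_cut)

# Illustrative numbers: a 5% cut where 9% would be the FRANDAC floor,
# on an assumed platform revenue of 1,000,000.
demand = royalty_shortfall(platform_revenue=1_000_000.0,
                           actual_cut=0.05,
                           frandac_cut=0.09)
print(demand)  # 40000.0
```

    The same function applied with a 19% or 29% FRANDAC rate yields correspondingly larger demands on the same revenue base.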


    12.July.2021

    Comment of the Day

    Automotive space™
    The space™

    21:13 UTC+2
    OASIS removed from Ready Player One and Two webpage on Wikipedia

    We noticed that an entity has removed the complete designation Ontologically Anthropocentric Sensory Immersive Simulation and left only its acronym OASIS on the webpages about the novels titled Ready Player One and Ready Player Two on the website Wikipedia.

    Such an action is only ridiculous and even useless, because people will always ask what the acronym OASIS stands for.
    In relation to our copyright, it does not even matter how the related part of our Ontologic System with its Ontoverse is named, performed, or reproduced, and what medium is chosen for its performance and reproduction, because the original and unique expression of idea of C.S. is still performed and reproduced, and copyright law does not make a difference in this regard.

    Howsoever, our copyright has been confirmed indirectly once again, because otherwise said entity would have no reason to do that nonsense.

    23:49 and 24:55 UTC+2
    Preliminary investigation of Manticore Games started

    *** Work in progress - better structure and wording, some facts (e.g. CVE, including VOS and CoVE) and links missing ***
    We quote an online encyclopedia about the company Manticore Games and its video game Core, which is based on our OS: "Core is a free-to-play online video game with an integrated game creation system, developed by Manticore Games. It was released as an open alpha version on March 16, 2020 [...]. Core hosts user-generated games [...]. Core's game creation system is designed to simplify video game creation in order to allow more individuals to develop games.[3] [...] Core is based on a similar concept as other gaming platforms for user-generated games such as Roblox.[2]

    Game creation system
    Core's game creation system allows for the development of up to 32-player multiplayer games and single-player games. It is not possible to import game assets into Core's game creation system; however, it is possible to modify and combine built-in game assets. Core allows users to code using the Lua programming language using an extensive built-in API. Games made with Core can not be exported into standalone games; however, they can be shared and played in Core.[4]

    [...]

    Fundraising
    [...] In September 2020, it was announced that Manticore Games had raised a further $15 million, of which the largest contributor was the video game company Epic Games. The interest of Epic Games in funding Manticore was tied by [somebody] of [a lying technology-oriented media company] into a desire to create a "metaverse", wherein several different gaming platforms are interconnected.[7] In March 2021, Manticore Games announced that they had closed a $100 million Series C funding round and described Core as a "creator multiverse."[8]

    Reception
    Core has received mixed reviews from critics. Tyler Wilde writing for PC Gamer gave a mixed review of Core, describing its game creation system as "fun", but finding that the character models were "ugly", and that they had stiff animations, as well as that the in-game weaponry and interfaces were not fun to use.[4] Graham Smith of Rock Paper Shotgun wrote that the platform "can't help but create shabby recreations of triple-A games".[1] Jason Fanelli from MMORPG.com [wrote] "My mind was absolutely blown. Something that takes massive studios multiple years and millions of dollars was accomplished in front of my eyes in the same amount of time it took me to write this sentence."[9] In Collider, Marco Vito Oddo writes "If the project grabs enough attention, and if developers/operators Manticore can give all the support a project this big demands, Core can easily become to games what YouTube is for video content." [10] About the monetization model of the platform, Scott Baird writes in ScreenRant "One of the most exciting things about Core is its revenue split, which offers 50% to its creators, allowing users to make a profit from their in-game titles."[11]

    [Textbox:]
    Engine [Epic Games] Unreal Engine 4
    Platform(s) Microsoft Windows"

    We also quote a report about its latest funding round: "[...] Manticore Games is one of the second-layer gaming platforms looking to build on the market's momentum. The startup tells [a magazine] they've closed a $100 million Series C funding round, bringing their total funding to $160 million. The round was led by [several venture capital investment companies] and Epic Games.
    [...]
    Manticore's Core gaming platform is quite similar to Roblox conceptually [...].
    Like other players, Manticore is attempting to build a game discovery platform directly into a game engine. They haven't built the engine tech from scratch; they've been working closely with Epic Games, which makes the Unreal Engine and made a $15 million investment in the company last year.
    [...]
    This all comes at an added cost; developers earn 50% of revenues from their games, leaving more potential revenue locked up in fees routed to the platforms that Manticore depends on than if they built for the App Store directly, but this revenue split is still much friendlier to creators than what they can earn on platforms like Roblox.
    Building cross-platform secondary gaming platforms is host to plenty of its own challenges. The platforms involved not only have to deal with stacking revenue share fees on non-PC platforms, but some hardware platforms that are reticent to allow them all, an area where Sony has been a particular stickler with PlayStation. The long-term success of these platforms may ultimately rely on greater independence, something that seems hard to imagine happening on consoles and mobile ecosystems."

    Comment
    First of all, we already consider the designations

  • Core and
  • multiverse

    taken alone and together as a copyright infringement, due to our Ontologic Core (OC or OntoCore) component and common core network or fabric, and also our OntoCOVE component, which has had a multiverse since the year 2007 and therefore is the original.

    Furthermore, no other entity has and gets the right to implement the foundational core of our Ontoverse by using our Ontologic System.

    But what we find more interesting are the facts that

  • it is funded by Epic Games to illegally implement a foundational part of our Ontoverse, including our metaverse multiverse, although the company has said that its copyright-infringing game Fortnite would be the basis for its illegal implementation of said foundational part of our Ontoverse,
  • it takes a cut of even 50% in its closed Virtual Environment (VE), while Epic Games is suing the companies Apple and Alphabet (Google), because they demand a cut of 15 to 30% on their SaaS platforms, which Epic Games and others claim would not be fair, reasonable, and customary,
  • Roblox takes even a higher cut, which also shows once again that
    • our License Model (LM) establishes FRANDAC terms and conditions, but also
    • we have room for adjustment,

    and

  • metaverse is not the (only) preferred term anymore, but also multiverse, reflecting once again our OS with its Ontoverse, which is connected with the Many-Worlds Interpretation (MWI) and other multiverse theories, a multiverse being a hypothetical group of multiple universes, in relation to our Caliber/Calibre.

    Also note that the field of Collaborative Virtual Environment (CVE), even with active replication and atomic broadcast, was already state of the art around the year 1999 (see for example the Distributed Interactive Virtual Environment (DIVE)), and in the fields of AR and VR was already state of the art around the year 2002 (see for example the Virtual Object System (VOS)); both CVEs are referenced in the section Collaborative Virtual Environment of the webpage Links to Software of the website of OntoLinux, and see also our Ontologic Collaborative Ontologic Virtual Environment (OntoCOVE) component.
    Sharded server models and video games based on them were the usual way of realization or implementation in the early 2000s due to the cost of server hosting and supercomputing, which was still valid in the year 2011.
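    The sharded server model can be sketched as follows (a hedged illustration; the shard count and player names are hypothetical): each shard hosts an isolated copy of the same world, and each player is deterministically routed to exactly one shard, so players on different shards never interact.

```python
import hashlib

NUM_SHARDS = 4  # hypothetical shard count

def shard_for(player_id: str, num_shards: int = NUM_SHARDS) -> int:
    """Deterministically route a player to one world shard by
    hashing the player identifier."""
    digest = hashlib.sha256(player_id.encode()).hexdigest()
    return int(digest, 16) % num_shards

# Each shard is an isolated copy of the same world: players on
# different shards never meet, which is the technical limitation
# that a "multiverse" game story was later retrofitted to explain.
shards = {i: set() for i in range(NUM_SHARDS)}
for pid in ["alice", "bob", "carol", "dave"]:
    shards[shard_for(pid)].add(pid)

print(sum(len(s) for s in shards.values()))  # 4
```

    Routing by a stable hash keeps a returning player on the same shard across sessions without any shared state between the shard servers.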
    A basic property of our OS is the integration of the field of Problem Solving Environment (PSE), which is based on, includes, and is utilized for managing and orchestrating High-Performance Computing (HPC) and supercomputing.
    Typical examples of a CVE are distributed simulations, 3D multiplayer games, and collaborative engineering software.
    By the way: See also

  • Fault-Tolerant, Reliable, and Trustworthy Distributed Systems (FTRTDSs) based on the Byzantine Fault Tolerance (BFT) protocols or the Byzantine-Resilient Replication (BRR) method, such as Askemos based on our Evolutionary operating system (Evoos) and the approach of Secure Intrusion-tolerant Replication on the Internet (SINTRA), and
  • active transactions in CVEs based on our Ontologic System Architecture (OSA), integrating the transactions of a DataBase Management System (DBMS) with the active replication of CVEs like DIVE for synchronization.
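    The active replication mentioned above can be sketched as follows (a minimal illustration, not the DIVE implementation; class and event names are hypothetical): every replica is a deterministic state machine applying the same totally ordered event stream, here simulated by one shared log standing in for atomic broadcast, so all replicas converge to the same state.

```python
class WorldReplica:
    """Deterministic state machine: applying the same events in
    the same order yields the same state on every replica."""

    def __init__(self):
        self.positions = {}

    def apply(self, event):
        kind, obj, value = event
        if kind == "move":
            self.positions[obj] = value

# Atomic broadcast is simulated by one totally ordered log that
# every replica consumes in the same order.
ordered_log = [
    ("move", "avatar1", (1, 0)),
    ("move", "avatar2", (5, 5)),
    ("move", "avatar1", (2, 0)),
]

replicas = [WorldReplica() for _ in range(3)]
for event in ordered_log:      # simulated broadcast delivery
    for r in replicas:
        r.apply(event)

# All replicas converge to the same world state.
print(replicas[0].positions["avatar1"])  # (2, 0)
```

    A real system would obtain the total order from an atomic broadcast or consensus protocol rather than a shared in-memory list; the determinism of the state machine is what makes the replication "active".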

    {the scope of claims seems not to be correct} A multiverse in relation to virtual environments is based on, part of, or one of the foundations of our Caliber/Calibre, New Reality (NR), and Ontoverse.

    There is a clear distinction between sandbox video games and multiverse video games.
    Sandbox video games and multiverse video games allow players to create and share virtual worlds with other players.
    A {sandbox also?} multiverse also enables a network of virtual worlds and allows moving or traveling within and between multiple instances of virtual worlds, but only in one computer graphics package, video game engine, and VE, and not across multiple ones.
    But our original and unique metaverse multiverse Ontoverse also

  • allows utilizing the collaborative functionality of different 2D and 3D graphics packages and video game engines, and creating and sharing across graphics packages, video game engines, video games, and virtual worlds.

    See the OntoCOVE component and VOS.

    Therefore, we also hold the copyright for the multiverse in relation to the fields of Mixed Reality (MR), Virtual Environment (VE), and so on, which we created due to the similarity with multimedia, showing that C.S. has also created

    • a new model of virtual worlds,
    • some kind of new general medium,
    • a new kind of entertainment, and
    • a new kind of social media as their next stage,

    and integrated them with all the other media.

    We already have the exclusive infrastructures of our Society for Ontological Performance and Reproduction (SOPR) (see the issue SOPR #327 of the 7th of June 2021) with an initial proposal for main contractors, suppliers, and service providers.

    Nothing else than our OS with its Caliber/Calibre, New Reality (NR), and Ontoverse, as well as its Ontologic System Architecture (OSA) and Ontologic System Components (OSC), is legal, truly works, and has already been becoming the worldwide standard for more than 20 years.

    The same holds for coupling a specific game engine with our original and unique Ontoverse, including our metaverse multiverse, which is also wrongly and illegally confused with the original Metaverse.


    14.July.2021

    05:58 and 16:51 UTC+2
    Clarification

    *** Work in progress - better structure and wording, some facts (e.g. CVE, including VOS and CoVE) and links missing ***
    Although we said we would close this chapter, we also already thought about making the Investigations::Multimedia of the 11th of July 2021 and the preliminary investigation of the 12th of July 2021 a related Clarification, because several important portions are more general. In addition, we found it important and interesting to work out the facts.

    We also have begun to

  • bring logic, systematology, and order, and
  • establish more precise regulation

    concerning the overall matter, specifically regarding our Ontoverse (Ov) and plagiarisms of it.

    Therefore, we have reviewed relevant matter in relation to the terms and fields integrated in our Ontoverse (Ov), specifically the terms and fields related to computer-simulated, -generated, or -synthesized, and synthetic multimedia systems and the Reality-Virtuality Continuum (RVC), which provides us with the foundational definitions and classification system.
    In contrast to other approaches that simply and wrongly equate Virtual World (VW) and Virtual Environment (VE) with Virtual Reality (VR), we call the general fields

  • New Reality (NR), including
    • eXtended Mixed Reality (XMR) or simply eXtended Reality (XR), including
      • (True or Real) Reality ((R)R),
      • Mixed Reality (MR), including
        • Augmented Reality (AR) and
        • Augmented Virtuality (AV),
      • Virtual Reality (VR),
    • Simulated Reality (SR or SimR),
    • Synthetic Reality (SR or SynR), and
    • all other realities,

    and classify related VWs or VEs in accordance with their

  • scope (e.g. (information) space, environment, world, and universe),
  • principles of communication or connectivity (e.g. standalone or offline, and distributed or networked (online), and also mobile),
  • count of users (e.g. single user and massively multiuser),
  • etc.

    and also in accordance with their type, such as

  • persistent VW,
  • Immersive Virtual Environment (IVE) or Virtual Reality Environment (VRE),
  • Collaborative Virtual Environment (CVE) (see the section Collaborative Virtual Environment of the webpage Links to Software of the website of OntoLinux), and
  • Distributed Virtual Environment (DVE) or Networked Virtual Environment (NVE), including
    • shared VW,
    • sharded VW, and
    • Massively Multiuser Virtual Environment (MMVE), including
      • Massively Multiplayer Online Game (MMOG),
  • World Wide Web (WWW) or web-based,
  • entertainment (see for example the VOS),
  • social media (see for example the VOS),
  • etc.,

    which are all fused into the New Reality Environment (NRE) of our OntoVerse (OV) component as part of the manifestation of our New Reality (NR) as the Ontoverse (Ov).

    metaverse related

  • original VRE Metaverse of "Snow Crash",
  • original physical theory Metaverse,
  • original VE (platform and network for 3D VWs) of Open Source Metaverse Project, VOS, and others,
  • * original ARE MRE Metaverse of VOS and others,
  • original NRE RVE Metaverse and metaverse of VOS or us Caliber/Calibre, New Reality (NR), and Ontoverse,
  • ** original Cyber-Physical Environment (CPE) Metaverse and metaverse of us Caliber/Calibre, New Reality (NR), and Ontoverse,

    multiverse related

  • original physical theories of multiverse, including Many-Worlds Interpretation (MWI),
  • * original VE (platform and network for MMOG and 3D VWs) of Multiverse Network Limited,
  • original virtual Multiverse and multiverse of us Caliber/Calibre, New Reality (NR), and Ontoverse,
  • ** original CPE Multiverse and multiverse of us Caliber/Calibre, New Reality (NR), and Ontoverse,

    metaverse and multiverse related

  • * original VE metaverse multiverse of us Caliber/Calibre, New Reality (NR), and Ontoverse,
  • ** original CPE metaverse multiverse of us Caliber/Calibre, New Reality (NR), and Ontoverse,

    * These are the hot topics, where the bees are dancing in frenzy.
    ** These are the next hot topics, where the bees are already beginning to dance.

    There are several problems:

  • Some experts call AR
    • VR,
    • AVR, which is Augmented Virtuality by definition of the Reality-Virtuality (RV) continuums or continua, or spectra of others in relation to AR and MR (without the ends of pure or complete reality and pure or complete virtuality along the virtuality continuum) and of us in relation to all possible realities in the observable universe (a superset of the first interpretation with the ends, and therefore called eXtended Mixed Reality (XMR) or simply eXtended Reality (XR), and also Simulated Reality (SR or SimR) and Synthetic Reality (SR or SynR), and therefore called the New Reality (NR)), and
    • MR, which was even done by the authors of the reality-virtuality continuum.
  • Some experts discuss AR, VR, and AVR in relation to VE. VOS: "Virtual Reality is a persistent three-dimensional space fundamentally designed to use computer networks for communications among many users; users are represented by articulated three-dimensional models called avatars and interact with each other in real time for social, entertainment, educational and economic purposes; users are able to contribute to the world by creating, modifying, deleting or otherwise interacting with any feature of the [virtual] world for which they have permission to do so. [...]
    To users, this mean[s] a seamless experience of moving from world to world engaged in activities such as talking to other avatars. [...]
    VOS was chosen to develop an augmented reality game."
  • Some experts declare that "Mixed Reality (MR) visual displays, a particular subset of Virtual Reality (VR) related technologies that involve the merging of real and virtual worlds somewhere along the "[reality-]virtuality continuum" which connects completely real environments to completely virtual ones [...] The concept of a "[reality-]virtuality continuum" relates to the mixture of classes of objects presented in any particular display situation, as illustrated in Figure 1, where real environments, are shown at one end of the continuum, and virtual environments, at the opposite extremum. The former case, at the left, defines environments consisting solely of real objects (defined below), and includes for example what is observed via a conventional video display of a real-world scene. An additional example includes direct viewing of the same real scene, but not via any particular electronic display system. The latter case, at the right, defines environments consisting solely of virtual objects (defined below), an example of which would be a conventional computer graphic simulation. As indicated in the figure, the most straightforward way to view a Mixed Reality environment, therefore, is one in which real world and virtual world objects are presented together within a single display, that is, anywhere between the extrema of the [reality-]virtuality continuum. [...]
    Figure 2: Different aspects of distinguishing reality from virtuality: i) Real vs Virtual Object; ii) Direct vs Non-direct viewing; iii) Real vs Virtual Image." and
    "[W]e do in fact agree that AR and VR are related and that it is quite valid to consider the two concepts together. The commonly held view of a VR environment is one in which the participant observer is totally immersed in a completely synthetic world, which may or may not mimic the properties of a real-world environment, either existing or fictional, but which may also exceed the bounds of physical reality by creating a world in which the physical laws governing gravity, time and material properties no longer hold. In contrast, a strictly real-world environment clearly must be constrained by the laws of physics. Rather than regarding the two concepts simply as antitheses, however, it is more convenient to view them as lying at opposite ends of a continuum, which we refer to as the Reality-Virtuality (RV) continuum. This concept is illustrated in Fig. 1 below.", which we call the first interpretation of others, because our interpretation of the RV continuum also includes Simulated Reality (SR or SimR) and Synthetic Reality (SR or SynR).
  • Some experts confuse real and physical things with virtual and digital things, like for example physical theories and artistical and science-fictional expressions of ideas, because of our Caliber/Calibre, New Reality (NR), and Ontoverse.
  • Some experts confuse gameplay and story with the layout, configuration, and operation of the underlying network of servers.
  • Several experts make contradictory explanations, statements, and definitions in relation to their own works and in relation to the works of others.
  • Several non-experts simply use all buzzwords they can get without understanding them, and even define them for even more stupid non-experts.
  • Several experts and non-experts use the terms and publicate manipulated materials to confuse the public and the competitors.

    Obviously, there is a difference between metaverse, multiverse, CVE, and so on, which suggests an orthogonal multidimensional taxonomic framework for classification.

    Our Caliber/Calibre connects, integrates, or fusions physical theories with cyberspace as part of the realization of the reality-virtuality continuum called New Reality (NR). The emphasis is laid on accepted theories. But in case of the physical metaverse and multiverse theories there is a lack of supporting scientifical evidence. Therefore, we said some kind of a multiverse in relation to our Ontologic Collaborative Ontologic Virtual Environment (OntoCOVE) component and Ontoverse.
    Howsoever, this connection was made by C.S. before the manipulation of the webpage about the original Metaverse began on the 15th of January 2007 with the reference to a scientist and his multiversal Theory of Everything (ToE) at first, which was removed quietly and quickly again due to the usual plagiarism and other fraud in societies, governments, sciences, and industries. This fraud is still held up in relation to him in the field of cybernetics, despite the fact that this connection between multiverse theories, ToE, and cybernetics was also created by C.S. with the Caliber/Calibre, New Reality (NR), and Ontoverse.

    As is known, we publicated in the year 2007 something like "Integration of some kind of a Multiverse into [OntoScope]."
    See the OntoLinux Website update of the 23rd of April 2007 in relation to the outdated (publication of the) project status of the Ontologic Collaborative Ontologic Virtual Environment (OntoCOVE) component of the OntoScope component.
    Also note that we referenced a scientist and his multiversal Theory of Everything (ToE) at first, but removed it quietly and quickly again due to the usual plagiarism and other fraud in societies, governments, sciences, and industries.

    Also note that "[the term] Metaverse is a compound conjunction of "meta" and "verse" and has been used by [a scientist and not so genius plagiarist of our OS] as an extension of Multiverse. The Metaverse contains the Multiverses and all universes past and present."
    But the scientist used the term metaverse in relation to his specific holistic Theory of Everything (ToE): in his book publicated in the year 2004, it denotes the fundamental energy and information-carrying field that informs not just the current universe, but all universes past and present (collectively, the "Metaverse"). This means it is not used in relation to parallel universes and many worlds existing at the same time and really happening simultaneously, and hence not in relation to a multiverse theory.

    "Early recorded examples of the idea of infinite worlds existed in the philosophy of Ancient Greek Atomism, which proposed that infinite parallel worlds arose from the collision of atoms. In the third century BCE, the philosopher Chrysippus suggested that the world eternally expired and regenerated, effectively suggesting the existence of multiple universes across time.[1] The concept of multiple universes became more defined in the Middle Ages.
    [...]
    The American philosopher and psychologist William James used the term "multiverse" in 1895, but in a different context.[3] The term was first used in fiction and in its current physics context by Michael Moorcock in his 1963 SF Adventures novella The Sundered Worlds (part of his Eternal Champion series)."

    The term metaverse was only used in relation to the original Metaverse in the novel titled "Snow Crash", but not in relation to the multiverse and the functionality of computing and multimedia systems, including video games, or in relation to both metaverse and multiverse. {misleading claim due to Collaborative Virtual Environment (CVE or CoVE)} The latter came from us through our Caliber/Calibre, which includes such physical theories and interpretations to form our dynamic Theory of Everything (ToE), as proven above and explained yesterday.

    Also note that the author of an online encyclopedia did not know what she or he was writing, because by definition there is only one multiverse and multiple multiverse theories or hypotheses, but not multiple multiverses.

    Also obviously, there exists prior art, which is free to use and not covered by the technological and legal scope of our Ontologic System with its Ontoverse. But it is either already obsolete or irrelevant or both, or becoming more and more obsolete or irrelevant or both, specifically in relation to our integrations of the fields of

  • SoftBionics (SB),
  • Space-Based technologies (SBx),
  • Service-Oriented technologies (SOx) and as a Service models (aaSx),
  • Fault-Tolerant, Reliable, and Trustworthy Distributed System (FTRTDS) (e.g. smart contract transaction protocol, blockchain technique, Byzantine Fault Tolerance (BFT) protocols, Byzantine-Resilient Replication (BRR) method),
  • and so on.

    Of course, everybody is allowed to use the Virtual Object System (VOS), which

  • was inspired by the original shared Virtual Reality Environment (VRE) Metaverse,
  • is a Distributed Virtual Environment (DVE) or Networked Virtual Environment (NVE), and Collaborative Virtual Environment (CVE) with AR and VR and persistent Virtual Worlds (VWs), and
  • is used for Virtual Reality (VR), defined as "a persistent three dimensional space [or [virtual] world... in which] users are represented by [] avatars and interact with each other in real time for social, entertainment [e.g. gaming], education and economic purposes [... are] creating, modifying, deleting and otherwise interacting with any feature of the [virtual space or] world".

    "For a VR system to be robust, it must not contain a single point of failure."
    "There is also an important philosophical point: the technical values of the system will influence the social values of the system. Do we want a dictatorial central server, or a democratic distributed system? The [World Wide W]eb democratically allows anyone with a bit of connectivity to host his or her own web site, and VR should be the same way."
    "Security, privacy and trust are absolutely crucial elements of VR, yet are often overlooked. [... I]t must be possible to encrypt communications. [...] Encryption gives users a sense of privacy as personal information can be protected. By using [...] cryptography and digital signatures, users can verify the identity of another user [...]."
    By creating, publicating, discussing, protecting, and unclosing our OS we have given the societal, social, philosophical, artistical, scientifical, technological, and economical answers and why our SOPR and our other Societies do exist. There is just no other entity entitled and qualified. *<:o)
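    The identity-verification idea quoted above can be illustrated with a minimal sketch. HMAC with a shared secret stands in here for a full public-key digital signature scheme; the function names and the secret are illustrative assumptions, not part of the VOS design.

```python
import hashlib
import hmac

# Minimal sketch: message authentication as a stand-in for the
# digital-signature identity checks described for a VR system.
# A real system would use public-key signatures instead of a
# shared secret.

def sign_message(secret: bytes, message: bytes) -> str:
    """Produce an authentication tag binding the message to the key."""
    return hmac.new(secret, message, hashlib.sha256).hexdigest()

def verify_message(secret: bytes, message: bytes, tag: str) -> bool:
    """Check the tag in constant time to resist timing attacks."""
    return hmac.compare_digest(sign_message(secret, message), tag)
```

    With such a primitive, a user can verify that a message really came from the holder of the corresponding key, which is the property the quoted passage asks for.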

    But our Caliber/Calibre and New Reality (NR) make this 3D space the at least 4D real universe or multiverse, for example by also

  • realizing a reality-virtuality continuum, including both ends, and the transformation or bridge along this reality-virtuality continuum, and
  • including Simulated Reality (SR or SimR) and Synthetic Reality (SR or SynR),

    which is a totally and fundamentally different expression of idea on the foundational operational level than

  • Mixed Reality (MR), including Augmented Reality (AR) and Augmented Virtuality (AV), and also
  • precursors of our other expression of idea called the Cyber-Physical System (CPS 2.0)

    on the application layer

    because our New Reality (NR) fusions the various (information) spaces, environments, worlds, and universes of the Reality-Virtuality-Continuum (RVC) and makes no difference and (seamless) division between them anymore in contrast to MR and CPS, so that there are neither "the integration of the physical with the digital [...] uncovering and supporting the variety of possible relationships between physical and digital worlds [...] combine the physical and digital [...] new technologies that merge [or mix] the physical and the digital" and "new inter-relationships between the physical and digital" nor "cross[ings of] the border between physical and digital experience without a thought, thereby enabling each [real and virtual] environment to complement the other, as equals in a kind of dynamic balance [as part of] a seamless divide".

    In relation to

  • connection of original Metaverse (VRE and other VE) and original Metaverse (physical theory; temporal),
  • connection of original Metaverse (VRE and other VE) and original Multiverse (physical theories; spatial),
  • original Multiverse (Virtual Environment (VE)),
  • connection of original Multiverse (physical theories) and original Multiverse (VE), and due to the later Multiverse (Cyber-Physical Environment (CPE)),
  • and so on

    this is not the case, because legal scope of ... the OntoLand.

    So there is no evidence about a multiverse VE other than our Ontoverse and eXtended Mixed Reality (XMR) or simply eXtended Reality (XR) included in our New Reality (NR) anyway.
    Therefore, a VE can have and share multiple

  • VWs or computer-simulated environments, and
  • subgames

    between which the users can move or travel, and communicate, but not multiple

  • computer graphics packages,
  • video game engines, and
  • VEs

    respectively cannot be a Multiverse (VE and CPE).

    But this discussion has already been outdated for some few years, as we have made clear several times over more than the last 3 years, because all major operating systems are Ontologic System variants in whole or in part,

  • cloud computing never really existed, but has always been a marketing term for the related part of our OS and is called Space and Time Computing and Networking (STCN) or simply Space Computing, or better Ontologic Computing and Networking (OCN) or simply Ontologic Computing (OC),
  • 5G is already our 5G NG and 6G and following generations are based on our OS anyway, and
  • there is virtually no smartphone, tablet computer, smart watch, smart glasses, smart car, and so on anymore, but only our Ontoscope in various variants and form factors.

    Honestly, we do not think that there exists a modern VE, including VW, including MMOG and CVE, which is not in the legal scope of ... the Ontoverse, also known as the OntoLand.

    "Technological convergence, [is] the tendency for different technological systems to evolve toward performing similar tasks".
    The metaverse is described as convergence of all virtual worlds, Mixed Reality (MR) (Augmented Reality (AR) and Augmented Virtuality (AV)), and the old Internet.
    Indeed, all single elements existed before the presentation of our OS with its Ontoverse:

  • immersive 3D Virtual World (VW) or Immersive Virtual Environment (IVE or ImVE) respectively Virtual Reality Environment (VRE) of the original Metaverse as a (pure) VR-based Wide Area Network (WAN) similar to an immersive 3D World Wide Web on the Internet, but not an MR-based Wide Area Network (WAN),
  • shared 3D Virtual World (VW) or Virtual Environment (VE), specifically Collaborative Virtual Environment (CVE) for VR and MR (AR and AV) on the Internet, and
  • Internet,

    But the successor to the Internet and also the convergence of all of them did not exist; they were created with our Ontoverse, which again includes the convergence or sum of all

  • virtual worlds (non-persistent and persistent virtual worlds),
  • VR and MR (AR and AV), and
  • Internet, and also
  • (sharded) Metaverse-like Massively Multiplayer Online Games (MMOGs) with persistent virtual worlds,

    and hence a metaverse and also a multiverse according to our definition and common sense, which suggests ...

    XR mirror world, including MR mirror world, including AR mirror world respectively XR 4D globe, including MR 3D globe, including AR 3D globe

  • mirror world - bridges map worlds and the real world as a simulation
  • XR mirror world - bridges mirror worlds, which bridge map worlds and the real world as simulations, and the real-virtual world, all of which were created with our New Reality (NR) and Ontoverse.

    A home world of a perceived virtual universe of virtual worlds or a home universe of virtual worlds therefore has to be viewed in the sense of a home page in our Ontoverse.

    There is only one metaverse multiverse, which is our Ontoverse.
    Like in the case of

  • Operating system as a Service (OpsaaS) and
  • Games as a Service (GaaS), including
    • MMOG and
    • cloud space gaming respectively gaming on demand,

    there are

  • Computer Graphics as a Service (CGaaS), including
    • cloud space rendering respectively rendering on demand,

    and

  • Game Engine as a Service (GEaaS)

    capability and operational models (aaSx), which have to be liquid, organic, molecular (respectively modular, compatible, and so on) in relation to the exclusive infrastructures of our SOPR, specifically those parts of CG engines and GEs that are common to all members and licensees of our SOPR.
    Prominent examples in relation to CG engines are network, operating system, and event services.
    Prominent examples in relation to GEs are network, operating system, CG engine, and event services, but also user services regarding trust, identity, consent, Electronic Commerce (EC), and so on.
    A prominent example in relation to MMOGs, VEs, and VWs is our home universe or home world with its centralized managed and orchestrated, and distributively operated hub of our Ontoverse with its collection of universes or worlds and therefore also called the original and unique, one and only metaverse multiverse.
    The VEs and VWs of our Ontoverse will be put on top of these SBx, SOx, and aaSx.

    As we said around 2 years ago already, the tide has turned and there was never a loophole based on simulating ordinary technological progress and taking single pieces of prior art integrated by our Ontologic System Architecture (OSA), by taking our OS as a blueprint.

    Therefore the clause in the AoA and the ToS with the LM of our SOPR simply says: All or nothing at all.

    20:38 UTC+2
    SOPR #32x or #33y

    Topics
    We focus on one thing in this issue, which some find important and others do not, and this is money:

  • Ontologic Financial System (OFinS) [Digital money supply]
  • Ontologic Bank [Ontologic Payment System (OPS)]
  • Further steps

    Ontologic Financial System (OFinS) [Digital money supply]
    For the realization, management, and utilization of digital money various technologies based on cryptography are utilized, including

  • smart contract transaction protocol,
  • blockchain technique,
  • digital and virtual ledger technologies,
  • Non-Fungible Token (NFT),
  • validated and verified computing, and
  • digital wallet,

    as also utilized for and with the

  • Trust Management System (TMS or TrustMS),
  • IDentity and Access Management System (IDAMS), and
  • Consent Management System (CMS or ConsMS), and also
  • Ontologic Financial System (OFinS)
    • Ontologic Bank (OntoBank)
      • Ontologic Payment System (OPS or OntoPay) (see the related section below),
      • Ontologic Payment Processing System (OPPS),
      • Ontologic Exchange (OEx, OntoEx, or OntoExchange),
      • Ontologic Bank Financial Information and Communications (OBFIC or OntoBankFinIC), and
      • other subsystems and platforms

    of the infrastructures of our SOPR.

    Generally, the money supply of a country is usually defined to consist of

  • currency in circulation (value of currency or cash (notes and coins) issued) and
  • demand deposits (non-confidential money, deposit money, (central bank) book money, giro money, or commercial bank money==Buchgeld oder Giralgeld).

    In short, there are two types of money in a fractional-reserve banking system:

  • MB = central bank money - obligations of a central bank, including currency and central bank depository accounts
  • M1, M2, and M3 = commercial bank money - obligations of a commercial bank, including transaction accounts (checking accounts, demand accounts, demand deposit accounts, current accounts, or giro accounts) and savings accounts.

    In general, the types of commercial bank money that tend to

  • be valued at lower amounts, are classified in the narrow category of M1, and
  • exist in larger amounts, are classified in the categories of M2 and M3.

  • M1 = coin and currency in circulation plus checking accounts
    • coin
    • currency
    • demand deposits
    • traveler's checks
  • M2 = M1 plus short-term liquid assets
    • M1
    • saving deposits
    • time deposits
    • certain Certificates of Deposit (CDs)
    • money market deposit accounts
    • money market mutual funds
  • M3 = M2 plus large and long-term deposits

    For example, the Federal Reserve Banks (FRBs) of the U.S.American Federal Reserve System (Fed) use M1 and M2 as a standardized way of defining money in the economy.
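    The aggregation of the monetary categories above can be sketched as nested sums; the component names and all figures used below are illustrative placeholders, not official Fed definitions or data.

```python
# Monetary aggregates as nested sums, mirroring the M1/M2/M3
# classification above. Component names are illustrative.

M1_COMPONENTS = ("coin", "currency", "demand_deposits", "travelers_checks")
M2_EXTRA = ("savings_deposits", "time_deposits", "certain_cds",
            "money_market_deposit_accounts", "money_market_mutual_funds")
M3_EXTRA = ("large_long_term_deposits",)

def aggregate(balances: dict, components: tuple) -> float:
    """Sum the balances of the given components (missing -> 0)."""
    return sum(balances.get(name, 0.0) for name in components)

def money_supply(balances: dict) -> dict:
    """Compute M1, M2, and M3, where each aggregate contains the previous one."""
    m1 = aggregate(balances, M1_COMPONENTS)
    m2 = m1 + aggregate(balances, M2_EXTRA)
    m3 = m2 + aggregate(balances, M3_EXTRA)
    return {"M1": m1, "M2": m2, "M3": m3}
```

    The nesting makes the "M2 = M1 plus ..." and "M3 = M2 plus ..." relationships above explicit.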

    Correspondingly, the digital money supply is defined to consist of

  • digital currency (tokens) in circulation and
  • digital demand deposits (tokens).

    We do not call a digital currency a supplement to cash, because

  • the definition as a digital object already says that a digital currency is a virtual representation of a corresponding real currency and
  • a supplement to real cash would be a virtual currency without a real counterpart.

    Therefore, a digital currency is a supplemental form of cash, or simply put cash, used for a supplemental way respectively digital way of monetary transaction.

    The digital currencies are issued and controlled by the central banks, just like cash. They are created by the money creation of the central banks and the commercial banks, which means in the same way as

  • real coins and banknotes, which are stored in a savings account, and
  • demand deposits, which are stored in a transaction account.

    In this way, the

  • states retain their monetary sovereignty and
  • central banks retain the control over the digital currencies,

    and thus they are able to

  • guarantee the stability and security of the national and international financial markets and also
  • protect the assets and properties of the population.

    A digital currency is booked in a separate digital transaction account, which is separated from deposits in a normal transaction account.
    In fact, this digital transaction account is held by a mandatory joint venture between a central bank and our OntoBank of our SOPR, but it will be managed by the commercial banks registered at our OntoBank.

    A digital currency is stored in a digital wallet of a private end user or consumer as part of the TMS, IDAMS, and CMS of our SOPR.

    More and more central banks, reserve banks, or monetary authorities of sovereign territories (unions of states, countries, and states of a union of states) are

  • developing and
  • planning to issue

    their official digital currencies, which are not virtual currencies, including cryptocurrencies.
    Unsurprisingly, the count of these Central Bank Digital Currencies (CBDCs), which are accepted as legal money and tender in the Ontologic Financial System (OFinS) of our Society for Ontological Performance and Reproduction (SOPR), is steadily rising.

    In our Ontoverse, which is already becoming the successor of the old and outdated Internet and the old and outdated World Wide Web (WWW), only the

  • OntoCoin and OntoTaler, as well as
  • official digital (and virtual) currencies issued by central banks of sovereign territories,

    based on the Urcoin, which is our Quantum Coin (Qoin), will be accepted.

    Sovereign territories that cannot or do not want to introduce and afford their own digital currency simply take our

  • digital currency and stable coin OntoCoin or
  • virtual currency OntoTaler

    by default.

    A payment (transaction) is done with the OntoBank OntoPay, which is an omnichannel payment system, platform, and service. The payment (transaction) can be done both online and offline through

  • transmission technologies, including
    • Personal Area Network (PAN) (e.g. Bluetooth Low Energy) for low-bandwidth, short-range, and electric power efficient wireless communication,
    • LoRa for low-bandwidth, long-range, and low-power wireless communication, and
    • Frequency-shift keying at 900 MHz for interaction and communication with legacy appliances,
  • biometrics, including recognition by scan of
    • face,
    • eye,
    • hand,
    • vein,
    • etc.,

    and

  • any other technical solutions provided with the
    • common backbone, core network, or fabric, and
    • subsystems and platforms

    of the infrastructures of our SOPR.

    The payment procedure is therefore very simple and also much faster than for example a Single Euro Payments Area (SEPA) transfer, which has already rapidly accelerated payment transactions in the 36 member states of the SEPA payment-integration initiative.
    Even better, the payment procedure works worldwide.

    Digital demand deposits are part of the (digital) money supply, as they can be used via digital check (transaction) and digital draft (transaction) as a means of

  • payment (transaction) for goods and services, and
  • digital settlement of debts or debt repayment (transaction).

    Digital demand deposits are funds held in transaction accounts in commercial banks.

    Digital book money or digital ledger money is, as a demand for cash, a means of payment (transaction) that can be used in the banking system by transferring it from transaction account to transaction account by means of cashless (accounting) transactions on the basis of the smart contract transaction protocol, which results in data storage entries in a digital ledger.
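    A cashless account-to-account transfer that results in hash-chained entries in a digital ledger (the blockchain technique mentioned above) can be sketched as follows; the class, account names, and record format are illustrative assumptions, not the OFinS specification.

```python
import hashlib
import json

# Minimal sketch: book-money transfers between transaction accounts,
# each recorded as an entry linked to the previous one by its hash.

class DigitalLedger:
    def __init__(self):
        self.accounts = {}          # account -> balance
        self.entries = []           # hash-chained transaction records
        self._prev_hash = "0" * 64  # genesis link

    def transfer(self, src: str, dst: str, amount: int) -> None:
        """Move book money from src to dst and append a ledger entry."""
        if amount <= 0 or self.accounts.get(src, 0) < amount:
            raise ValueError("invalid or uncovered transfer")
        self.accounts[src] -= amount
        self.accounts[dst] = self.accounts.get(dst, 0) + amount
        record = {"src": src, "dst": dst, "amount": amount,
                  "prev": self._prev_hash}
        self._prev_hash = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()).hexdigest()
        self.entries.append(record)
```

    Chaining each record to the hash of its predecessor makes later tampering with stored entries detectable, which is the point of using a digital ledger here.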

    Also important for a financial system in general and our Ontologic Financial System (OFinS) in particular are the

  • money market,
  • capital market, and
  • loan market.

    The money market is a financial market in which short-term funds (up to 1 year) are provided and short-term loans are dealt in.

    The capital market is a financial market in which long-term debts (1 year or longer) or equity-backed securities are bought and sold.

    The capital market is divided in

  • non-organized markets
    • free capital market and
    • interbank market,

    and

  • organized markets
    • OFinS interbank market for digital currencies,
    • bond market, and
    • stock market
      • primary market for new digital stocks and debts (e.g. bonds, notes, bills, etc.), and
      • secondary market for existing digital securities.

    The interbank market is

  • unregulated and decentralized in relation to real currencies, but
  • organized and centralized in relation to digital currencies.
    Specifically, foreign digital currency options are regulated in a number of countries.

    The central banks in many countries and of many economies

  • publish closing spot prices on a daily basis and
  • implement their monetary policy by manipulating instruments that allow them to achieve a certain value for an operational objective.

    Instruments are defined as the variables directly controlled by a central bank, such as the

  • real and digital cash ratios,
  • interest rate paid on funds borrowed from the central bank, and
  • structure of the balance sheet.

    The three main constituents of the OFinS interbank market are the

  • spot market,
  • forward market, and
  • Ontologic Bank Financial Information and Communications (OBFIC), so to say the Society for Worldwide Interbank Financial Telecommunications of the Next Generation (SWIFT NG), which provides by design of the Ontologic System (OS) a network, that enables financial institutions worldwide to send and receive information about financial transactions in a standardized, secure, and resilient (fault-tolerant and trustworthy (reliable)) environment.

    The OFinS interbank market (interbank dealings or interbank transactions==Interbankenhandel oder Interbankenverkehr) is an important segment of the overall foreign exchange market, because it is the

  • wholesale market through which all digital currency transactions are channeled and
  • top-level foreign exchange market, where banks exchange different digital currencies.

    All these digital currency transactions correspondingly take place at the specific central exchange of our OntoBank called OntoExchange (OEx or OntoEx).
    The OFinS interbank market is mainly used for trading among bankers. The banks deal with one another through the electronic brokering platform(s) of our OFinS, which belong(s) to the subsystems and platforms of our SOPR.

    Correspondingly, the following 3 tokens and related digital ledgers are suggested and discussed by central banks and provided, managed and orchestrated, and also operated by the joint ventures between central banks and our OntoBank:

  • digital currency token or digital wallet coin for the daily use by the private end users or consumers,
  • digital demand deposit or digital ledger money token for the use by the industries, and
  • special money token for the use by the digital capital markets, including the OFinS interbank market with the OBFIC.

    These 3 basic tokens are compatible with the real money supplies of the

  • U.S.America and the real U.S. Dollar,
  • European Union (EU) and the real Euro,
  • P.R.China and the real Renminbi and its basic unit Yuan, and
  • other unions of states, countries, and states of union of states and their real currencies.
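    The 3 basic tokens suggested above can be modeled as a simple enumeration carrying the intended user group; the names below are illustrative assumptions, not an official token specification.

```python
from enum import Enum

# The 3 basic token types described above, each with the user group
# it is intended for. Names are illustrative.

class Token(Enum):
    DIGITAL_CURRENCY = "private end users or consumers"
    DIGITAL_DEMAND_DEPOSIT = "industries"
    SPECIAL_MONEY = "digital capital markets"

def intended_users(token: Token) -> str:
    """Return the user group a token type is intended for."""
    return token.value
```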

    In correspondence with the matter described, discussed, and defined above and the Articles of Association (AoA) and the Terms of Services (ToS) with the License Model (LM) of our Society for Ontological Performance and Reproduction (SOPR), the assignment of responsibilities, tasks, and works is quite straightforward:

  • Joint ventures between central banks, reserve banks, or monetary authorities, and our OntoBank
    • digital money supply (digital currencies, and digital demand deposits or digital ledger money) and
    • management and orchestration, and also operation of digital transaction accounts

    Joint ventures between central banks, reserve banks, or monetary authorities, and our OntoBank provide no financial services for private end users or consumers; in other words, private end users or consumers will not have direct access to such a joint venture and a central bank, as with real cash, and demand deposits or book money.

  • OFinS with its OntoBank
    • digital capital markets, including the
      • interbank market for digital currencies,
      • bond market for digital debts and securities (e.g. digital bonds, notes, bills, etc.), and
      • stock market for digital stocks, commodities, and other tradable digital assets
  • OntoBank
    • OntoPay System (OPS)
    • OntoPay Processing System (OPPS),
    • OntoExchange (OEx),
    • OntoBankFinIC (OBFIC), and
    • International Bank of Settlement for digital and virtual currencies, which will work together with the Bank for International Settlements
  • Trust Management System (TMS)
    • Smart Contract as a Service (SCaaS) for the execution of financial transactions,
    • Blockchain as a Service (BaaS or BlaaS) for the management and orchestration, and also operation of digital ledgers, and
    • digital wallet inherently included in the Ontologic System by design
  • Commercial banks and other financial companies registered at our OntoBank
    • provision of financial services for private end users or consumers, and other customers, and
    • utilization of digital transaction account through TrustMS, IDAMS, and ConsMS, and also OFinS subsystems, platforms, and services (e.g. OPS, OPPS, OEx, and OBFIC)

    Among other implications, the outside capital (debt capital or borrowed capital) of the commercial banks is not reduced by around 25% in this way.

    Data protection or privacy, and data security, as well as data governance regarding digital wallet, digital transaction account, and digital ledger (hereafter data structure) are established and guaranteed through the following regulations:

  • Any access to any data structure of the OFinS is only permitted for real persons (private end user or consumer, and official agent).
  • If an official agent has to access a
    • data structure of the OFinS or
    • data stored in a data structure of the OFinS

    held by another person, then she, he, or they either

    • need the consent of said holder of said data structure or said data stored in a data structure, or
    • request an access warrant in relation to said data structure or said data stored in a data structure, which has to be issued by at least one real person (judge) at a public court.
  • Any access to any data structure of the OFinS is logged in another digital ledger.
  • Depending on the importance of a token or a digital ledger used for the storage of tokens, the tokens of a digital ledger have to be stored for a sufficiently long period of time before they can be removed.
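    The access regulations listed above can be sketched as a small check: a requester other than the holder needs either the holder's consent or a court-issued warrant, and every access attempt is logged in a separate digital ledger. All class, parameter, and field names below are illustrative assumptions.

```python
# Sketch of the OFinS access regulations: consent or warrant required
# for foreign access, and every attempt logged in another ledger.

class OFinSAccessControl:
    def __init__(self):
        self.access_log = []  # the separate ledger of access attempts

    def access(self, requester, holder, consents, warrants):
        """Return the data structure if access is permitted, else raise.

        consents maps a holder to the set of requesters she, he, or
        they consented to; warrants is a set of (requester, holder)
        pairs issued by a court.
        """
        permitted = (
            requester == holder
            or requester in consents.get(holder, set())
            or (requester, holder) in warrants
        )
        self.access_log.append((requester, holder, permitted))
        if not permitted:
            raise PermissionError("neither consent nor warrant given")
        return {"holder": holder}  # placeholder for the stored data
```

    Note that the log entry is appended before the permission decision is enforced, so denied attempts are recorded as well.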

    Ontologic Bank [Ontologic Payment System (OPS)]
    The Ontologic Financial System (OFinS) Ontologic Bank (OntoBank) Ontologic Payment System (OPS or OntoPay) is an omnichannel payment (transaction) system, which

  • provides a foundational platform and services, and
  • is integrated with the other subsystems and platforms of the infrastructures of our SOPR and utilizes their features and functionalities in this way.
    The integration of all offline channels and online channels offers end users or consumers, and other customers a unified, seamless and effortless, high-quality experience within and between contact channels.

    Like the Ontologic Exchange (OEx, OntoEx, or OntoExchange) and Ontologic Bank Financial Information and Communications (OBFIC or OntoBankFinIC), the OntoPay treats digital transaction accounts and digital ledgers of the OFinS digital money supply (see the related section above) as fully available to all legitimated and authorized channels (retail and E-Commerce (EC), financial services (commercial banks and Financial Technology service providers (FinTechs)), marketing, media and entertainment, etc.) from one service platform.
    While the internal payment transaction process does not diverge, so as to optimize the operations, the outbound payment service process diverges only at the point of contact, that is the Point of Sale (POS) (e.g. stationary POS and mobile POS of commercial banks, retailers, and online shops).
    Through this integration it is possible to update

  • stock levels,
  • financial journals, and
  • data of end users or consumers, and other customers

    directly in the related central (centralized and decentralized) database, when an action is executed, like for example at a POS.

    Further steps
    We would also like to recall two general points once again.

    The times of

  • illegal cryptocurrencies,
  • illegal platforms based on the smart contract protocol and the blockchain technique,
  • illegal cryptocurrency exchanges, and
  • illegal exchange-traded funds

    are over.

    The Ontologic Financial System (OFinS) with its Ontologic Bank (OntoBank) and digital and virtual currencies is also part of the societal compromise and the related Big New Build Back Better Green World Plan and Deal, including our Green Deal, which in this context we simply call the Big Deal and which was closed with C.S. and our corporation, including our SOPR.


    18.July.2021

    Clarification

    *** Revision - fields of cybernetics, Intelligent Environment, immobile robot, sensor network, Computational Intelligence, etc. ***

    We looked at the so-called field of Cyber-Physical System (CPS) once again and found more and more evidence that CPS is based on our original and unique works of art titled

  • Analyse und Entwurf eines Betriebssystems nach evolutionären und genetischen Aspekten==Analysis and Design of an Operating System according to evolutionary and genetic Aspects, also called Evolutionary operating system (Evoos), which was published in December 1999, and
  • Ontologic System, also called OS, which is the successor of our Evoos and was published at the end of October 2006.

    First, we quote an online encyclopedia to provide brief introductions to the relevant subject matters and short overviews of the fields of

  • attention,
  • Situational Awareness or Situation Awareness (SA),
  • self-awareness, and
  • Cognitive-Affective Personality or Processing System (CAPS),

    and also the subjects

  • Computational Physics (CP, ComP, or CPhy), and
  • Physical Computing System (PCS or PhyCS),

    then we quote documents about the fields of

  • Model-Based Autonomous System (MBAS) or immobile robot (immobot),
  • Sensor Network (SN),
  • SN Sensor Web,
  • Multi-Agent System (MAS) and SN, and
  • CPS,

    and at the end, we quote again an online encyclopedia about

  • CPS

    to provide additional information and the concretization, definition, and delimitation of these fields.

    We quote an online encyclopedia about the subject attention: "Attention is the behavioral and cognitive process of selectively concentrating on a discrete aspect of information, whether considered subjective or objective, while ignoring other perceivable information.
    [...]

    Modelling
    In the domain of computer vision, efforts have been made to model the mechanism of human attention, especially the bottom-up intentional mechanism[84] and its semantic significance in classification of video contents.[85][86] Both spatial attention and temporal attention have been incorporated in such classification efforts."

    Comment
    The chapter 3.2 Funktionsweise eines Gehirns==Functioning of a Brain of The Proposal, which is describing our Evoos, references the work Mental Models of Philip Johnson-Laird, which is based on the mental model theory of reasoning, which in turn has been applied to the main domains of deductive inference, including

  • relational inferences, such as spatial and temporal deductions,
  • propositional inferences, such as conditional, disjunctive, and negation deductions,
  • quantified inferences, such as syllogisms, and
  • meta-deductive inferences.

    The chapter 5 Zusammenfassung==Summary of The Proposal lists the point "das Sehen - die Videokamera und der Scanner"==the Vision or Seeing - the video camera and the scanner in relation to a behavioral and Cognitive Agent System (CAS).
    Please note that a camera and a scanner are based on an image sensor or imager, and a scanner is also based on an actuator.
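    The selective concentration described in the quoted Modelling paragraph can be sketched as a softmax weighting over per-frame relevance scores, as is commonly done for spatial and temporal attention. All values below are toy numbers, not the cited video-classification models.

```python
import math

def softmax(scores):
    """Turn raw relevance scores into a probability distribution."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(features, scores):
    """Summarize feature vectors as their softmax-weighted sum."""
    weights = softmax(scores)
    dim = len(features[0])
    return [sum(w * f[i] for w, f in zip(weights, features)) for i in range(dim)]

# Three "frames" of a toy video, each a 2-d feature; the middle frame
# gets the highest relevance score and therefore dominates the summary.
frames = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
summary = attend(frames, scores=[0.1, 3.0, 0.1])
```

    The weights sum to one, so the summary stays on the same scale as the inputs while concentrating on the highest-scored frame and largely ignoring the rest.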

    We quote an online encyclopedia about the subject Situational Awareness or Situation Awareness (SA): "Situational awareness or situation awareness (SA) is the perception of environmental elements and events with respect to time or space, the comprehension of their meaning, and the projection of their future status.[1 [Endsley, 1995b]]
    [...]

    Mental models
    Accurate mental models are one of the prerequisites for achieving SA.[30][31] A mental model can be described as a set of well-defined, highly organized yet dynamic knowledge structures developed over time from experience.[32][33] [...]"

    Comment
    See once again the chapter 3.2 Funktionsweise eines Gehirns==Functioning of a Brain of The Proposal and the comment before.

    We also quote an online encyclopedia about the subject self-awareness: "In philosophy of self, self-awareness is the experience of one's own personality or individuality.[1][2] It is not to be confused with consciousness in the sense of qualia. While consciousness is being aware of one's environment and body and lifestyle, self-awareness is the recognition of that awareness.[3] [...]

    Body
    [...]

    Human development
    Bodily self-awareness in human development refers to one's awareness of their body as a physical object, with physical properties, that can interact with other objects. Tests have shown that at the age of only a few months old, toddlers are already aware of the relationship between the proprioceptive and visual information they receive.[9] This is called first-person self-awareness.
    At around 18 months old and later, children begin to develop reflective self-awareness, which is the next stage of bodily awareness and involves children recognizing themselves in reflections, mirrors, and pictures.[10] Children who have not obtained this stage of bodily self-awareness yet will tend to view reflections of themselves as other children and respond accordingly, as if they were looking at someone else face to face. In contrast, those who have reached this level of awareness will recognize that they see themselves, for instance seeing dirt on their face in the reflection and then touching their own face to wipe it off.
    Slightly after toddlers become reflectively self-aware, they begin to develop the ability to recognize their bodies as physical objects in space and time that interact and impact other objects. For instance, a toddler placed on a blanket, when asked to hand someone the blanket, will recognize that they need to get off it to be able to lift it.[9] This is the final stage of body self-awareness and is called objective self-awareness.

    Psychology
    [...]

    Developmental stages
    [...]

    Infancy and early childhood
    [...] [F]or Piaget, the objectification of the bodily self occurs as the infant becomes able to represent the body's spatial and causal relationship with the external world (Piaget, 1954). [...]

    Piaget
    [...]"

    Comment
    See specifically the chapters 3 Entwicklung und Funktionsweise eines Gehirns==Development and Functioning of a Brain and 5 Zusammenfassung==Summary, and also the related comments above and below.

    We also quote an online encyclopedia about the subject Cognitive-Affective Personality System (CAPS): "The cognitive-affective personality system or cognitive-affective processing system (CAPS) is a contribution to the psychology of personality proposed [...] in 1995. According to the cognitive-affective model, behavior is best predicted from a comprehensive understanding of the person, the situation, and the interaction between person and situation.[1 [A cognitive-affective system theory of personality: Reconceptualizing situations, dispositions, dynamics, and invariance in personality structure[. 1995]]]"

    Comment
    The chapter 6 Ausblick==Outlook of The Proposal, which is describing our Evoos, mentions prototypical Multimedia Systems (MSs or MultiSs) of the fields of affective computing and emotional intelligence, and also integrates them with a Cognitive Agent System (CAS) into a Cognitive-Affective Processing System (CAPS) in relation to

  • metaphysics,
  • logics,
  • semantics,
  • cybernetics,
  • SoftBionics (SB) (e.g. Artificial Intelligence (AI), Machine Learning (ML), Computational Intelligence (CI), Artificial Neural Network (ANN), Evolutionary Computing (EC), Computer Vision (CV), Simultaneous Localization And Mapping (SLAM), Soft Computing (SC), Autonomic Computing (AC), Natural Language Processing (NLP), Cognitive Computing (CogC), Cognitive Agent System (CAS), Cognitive-Affective Personality or Processing System (CAPS), Swarm Intelligence (SI) or Swarm Computing (SC), etc.),
  • HardBionics (HB),
  • Physical Computing System (PCS or PhyCS), and
  • Cyber-Physical System (CPS), as well as
  • the first step of C.S.' cybernetic
    • self-reflection, self-image, or self-portrait, and
    • reflection, augmentation, and extension.

    Also important to note in relation to these first four quotes is the direct connection between ontology, developmental biology, psychology, Piaget, reflection and reflexion, (reflective) self-awareness, and situational awareness in the scope of our Evoos, because it shows the relation of our Evoos to temporal semantics and timed computational semantics.

    We quote an online encyclopedia about the subject computational physics: "Computational physics is the study and implementation of numerical analysis to solve problems in physics for which a quantitative theory already exists.[1] Historically, computational physics was the first application of modern computers in science, and is now a subset of computational science.
    [...]

  • 5. [...] (1982). Simulating physics with computers

    [...]

  • [...] Computational Physics (1986)"

    Comment
    A related diagram shows that computational physics is the intersection of the fields of

  • mathematics, which provides the techniques,
  • computer science, which provides the HardWare (HW) and SoftWare (SW), and
  • physics, which provides the applications.

    But missing are the fields of

  • metaphysics,
  • logics,
  • semantics,
  • cybernetics,
  • SoftBionics (SB),
  • HardBionics (HB),
  • Autonomous System (AS), and
  • Robotic System (RS),

    which are the more relevant fields in relation to the topic of this clarification.

    We quote an introduction of a book about physical computing, which was published in 2004: "[...] We need to think about computers that sense more of your body, serve you in more places, and convey physical expression in addition to information. [...]
    [...] we have found people from very diverse backgrounds looking to bridge this gap between the physical and the virtual.
    [...] this book is designed to help you make a more interesting connection between the physical world and the computer world.
    [...] Robotics is the physical equivalent to AI. The technology you will learn in this book is very similar to what you'd learn in a book on robotics, but our typical applications are different. [...] Our approach comes out of a different area of computing called Intelligence Amplification (IA)."

    Comment
    See the next comment.

    We also quote an online encyclopedia about the subject physical computing, which was published on the 19th of October 2005: "Physical (or embedded) computing, in the broadest sense, means building interactive physical systems by the use of software and hardware that can sense and respond to the analog world.
    [...]

  • [...] (2004). Physical Computing: Sensing and Controlling the Physical World with Computers."

    We also quote an online encyclopedia about the subject physical computing, which was published on the 26th of May 2021: "Physical computing involves interactive systems that can sense and respond to the world around them. [...]
    Physical Computing intersects the range of activities often referred to in academia and industry as electrical engineering, mechatronics, robotics, computer science, and especially embedded development."

    Comment
    The field of physical computing has been taught since at least the year 1994. But obviously, the field of physical computing was only about the fields of

  • interactive telecommunications and
  • Multimedia System (MS or MultiS).

    {The Physical Computing System (PCS or PhyCS) came before the Model-Based Autonomous System (MBAS) or Immobile Robotic System (ImRS or Immobot) and seems to be included in it; see also Cybernetical Physics (CP or CybP).} But missing are the fields of

  • metaphysics,
  • logics,
  • semantics,
  • cybernetics,
  • SoftBionics (SB),
  • HardBionics (HB),
  • Autonomous System (AS), and
  • Robotic System (RS)

    as in the case of computational physics (see above).
    {Our Evoos is also a model-based AS or immobot, and hence already also a PCS or PhyCS.} These missing fields were added to or are integrated with the field of Physical Computing System (PCS or PhyCS) by our Evoos before the publication of the quoted book.
    Correspondingly, the authors took a different strategy by approaching these fields, specifically metaphysics, semantics, and cybernetics, from the real side, discussing AI and RS at first, and then explaining that Physical Computing is different and about Intelligence Amplification (IA), which somehow suggests that AI and RS are already included in Physical Computing, but also cybernetics, as well as human enhancement and cyborgs respectively cybernetical enhancements.

    In relation to the field of MS it should also be noted that an RS controlled by an Augmented Reality (AR) system already existed, for example on the basis of the Virtual Object System (VOS) (2002), which, like further missing fields such as computational physics, was added to or integrated with our Evoos by our OS and is discussed on the website of OntoLinux.

    See also the discussion of the subsumption architecture for RS in relation to CPS below.
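    The sense-and-respond loop at the core of the quoted definitions of physical computing can be sketched as a simulated bang-bang controller with hysteresis; the sensor readings and the actuator below are mocked, and all names are illustrative, not taken from the quoted book.

```python
class Thermostat:
    """Bang-bang controller with hysteresis around a setpoint (simulated)."""
    def __init__(self, setpoint, band=0.5):
        self.setpoint = setpoint
        self.band = band
        self.heater_on = False  # actuator state

    def step(self, temperature):
        """One sense-decide-respond cycle on an analog sensor reading."""
        if temperature < self.setpoint - self.band:
            self.heater_on = True
        elif temperature > self.setpoint + self.band:
            self.heater_on = False
        # Inside the dead band the actuator keeps its previous state,
        # which avoids rapid on/off switching near the setpoint.
        return self.heater_on

t = Thermostat(setpoint=20.0)
readings = [18.0, 19.8, 20.2, 21.0, 20.4]
states = [t.step(r) for r in readings]
```

    On real hardware, the reading would come from an analog sensor and the returned state would drive a relay; the control logic stays the same.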

    We quote a first document, which is about the field of Model-Based Autonomous System (MBAS) or Immobile Robotic System (ImRS or Immobot), comes from the National Aeronautics and Space Administration (NASA), and was published in the year 1996: "Immobile Robots [] AI in the New Millennium
    A new generation of sensor-rich, massively distributed, autonomous systems are being developed that have the potential for profound social, environmental, and economic change. These systems include networked building energy systems, autonomous space probes, chemical plant control systems, satellite constellations for remote ecosystem monitoring, power grids, biospherelike life-support systems, and reconfigurable traffic systems, to highlight but a few.
    The limelight has shifted dramatically in the last few years from an AI tradition of developing mobile robots to that of developing software agents (that is, softbots) [...].
    [...]
    [...] AI's central goal of developing agent architectures and a theory of machine intelligence [...] software environments, such as a UNIX shell and the World Wide Web, provide softbots with a set of ready-made sensors (for example, LS and GOPHER) and end effectors (for example, FTP and TELNET) that are easy to maintain but still provide a test bed for exploring issues of mobility and real-time constraints. [...]
    [...] Can such a software environment adequately drive research on agent kernels? Second, given that much of the information on the Internet is textual, will oft-envisioned softbot applications, such as information gathering and synthesis, be viable before the hard nut of language understanding has been cracked?
    [...] the information-gathering capabilities of the Internet, corporate intranets, and smaller networked computational systems supply additional test beds for autonomous agents of a different sort. These test beds, which we call immobile robots (or immobots), have the richness that comes from interacting with physical environments yet promise the ready availability associated with the networked software environment of softbots. [...] Conversion of these and other realtime systems to immobile robots will be a driving force for profound social, environmental, and economic change.
    [...] the focus of attention of immobile robots is directed inward, toward maintaining their internal structure, in contrast to the focus of traditional robots, which is toward exploring and manipulating their external environment. This inward direction focuses the immobot on the control of its complex internal functions, such as sensor monitoring and goal tracking; parameter estimation and learning; failure detection and isolation; fault diagnosis and avoidance; and recovery, or moving to a safe state. Metaphorically speaking, the main functions of an immobot correspond to the human nervous, regulatory, and immune systems rather than the navigation and perceptual systems being mimicked in mobile robots.
    [...] these immobots give rise to a new family of autonomous agent architectures, called model-based autonomous systems. Three properties of such systems are central: First, to achieve high performance, immobots will need to ["develop sophisticated regulatory and immune systems that accurately and robustly control their complex internal functions" and] exploit a vast nervous system of sensors to model themselves and their environment on a grand scale. They will use these models to dramatically reconfigure themselves to survive decades of autonomous operations. Hence, self-modeling and self-configuration make up an essential executive function of an immobot architecture. Second, to achieve these large-scale modeling and configuration functions, an immobot architecture will require a tight coupling between the higher-level coordination function provided by symbolic reasoning and the lower-level autonomic processes of adaptive estimation and control. Third, to be economically viable, immobots will have to be programmable purely from high-level compositional models, supporting a "plug and play" approach to software and hardware development.
    [...] Our work on these systems fuses research from such diverse areas of AI as model-based reasoning, qualitative reasoning, planning and scheduling, execution, propositional satisfiability, concurrent reactive languages, Markov processes, model-based learning, and adaptive systems. [...] Moriarty and Livingstone are grounded in two immobot test beds. Moriarty was part of the Responsive Environment [...], an intelligent building control system developed within the Ubiquitous Computing Project at Xerox Parc. Livingstone is part of the Remote Agent, a goal-directed, fully autonomous control architecture, which will fly the National Aeronautics and Space Administration (NASA) Deep Space One space probe [...]. [...]
    [...] a new category of autonomous system that is sensor rich, massively distributed, and largely immobile. This technology is being quickly embedded in almost every form of real-time system, from networked building energy systems to spacecraft constellations. The current generation of these hybrid hardware-software systems [...]
    [...]

    HAL9000
    [...] "brain and central nervous system" of a spacecraft [...].
    [...] HAL's major responsibility is to look after the health of the spacecraft and crew, including monitoring the spacecraft's health, performing fault diagnosis and repair, operating the life-support systems, and continuously monitoring the medical status of crew members in hibernation. Finally, the movie highlights the connection between HAL's higher-level symbolic reasoning and the complex, low-level, autonomic processes distributed throughout the spacecraft. Hence, HAL can be thought of as the spacecraft's immune system.

    The New Millennium Program
    [...] establishing a virtual presence in space through an armada of intelligent space probes that autonomously explore the nooks and crannies of the solar system: [...]
    [...]
    [...] Cassini's "nervous system" includes a sophisticated networked, multiprocessor system, consisting of two flight computers that communicate over a bus to more than two dozen control units and drivers. This nervous system establishes complex sensing and control paths to an array of fixed sensors and actuators [...]. [...]

    "Spock's Brain"
    [...] a little-known episode of the original Star Trek series called "Spock's Brain." In this episode, Spock's body is found robbed of its brain by an unknown alien race, and the crew of the Enterprise embarks on a search of the galaxy to reunite Spock's brain and body. Spock detects that he has a new body that stretches into infinity and that appears to be breathing, pumping blood, and maintaining physiological temperature. When discovered, Spock's brain is found to be within a black box, tied in by light rays to a complex control panel. Instead of breathing, maintaining temperature, and pumping blood, he is recirculating air, running heating plants, and recirculating water. That is, the function that requires this supreme intelligence is the regulation of a planetwide heating and ventilation system. As with HAL, Spock's body in this case is extremely immobile. The episode portrays an immobile robot as a massively distributed behemoth, sufficient to encircle a globe. Again, the crucial link between high-level reasoning (that is, Spock's brain) and autonomic processes is highlighted. Finally, although 2001 highlights HAL's function as an immune system that maintains the health of the immobot, this episode highlights Spock's function as a regulatory system, performing high-fidelity control of the immobot's internal organs. [...]
    [...] networked building management systems [...]
    [...] NASA's Earth-observing system is moving toward a level of sensing that will enable full Earth ecosystem modeling, providing insight into problems of pollution, global warming, and ozone depletion. Vast networked control systems are generating a revolution in chemical processing, factory automation, drug manufacturing, and semiconductor fabrication, producing immobile robots that enable substantial improvements in quality, efficiency, and safety.
    [...] The two immobot test beds discussed in this article represent first steps toward these future visions.

    Immobot Characteristics
    [...] properties that distinguish immobots from their mobile robot and softbot siblings, both in terms of their physical structure and their most salient functions. Structurally, immobots are massively distributed, physically embedded, autonomous systems with a large array of simple, fixed-location sensors and actuators. Functionally, an immobot's primary task is to control its massive regulatory, immune, and nervous systems through a coupling of high-level reasoning and adaptive autonomic processes. More specifically, an immobot has the following distinctive features:
    Physically embedded: [...]
    Immobile: [...] In contrast, a traditional robot typically has a set of complex mobile sensors and actuators, such as three-dimensional vision systems and articulated hands, arms, and legs.
    Massively distributed and tightly coupled: [...]
    Self-absorbed: Mobile robots and softbots focus largely on what is occurring in the world outside the "bot," including navigating around obstacles, moving through networks to databases, and changing navigation paths because of external failures. However, an immobile robot's attention is largely directed inward toward monitoring the internal health of its network (immunology) and reconfiguring its components or control policies to achieve robust performance (regulation). Although some reasoning is directed outward, the external world is not fluid in the same sense that it is for classical robots.
    One of a kind: Ironically, what binds together immobile robots is that no two are alike. [...] The dilemma is how to cost effectively build these one of a kinds yet provide high performance and reliability.

    Controlling Immobots
    [...] one-of-a-kind nature of immobots means that the cost of reasoning through systemwide interactions cannot be amortized and must be paid over again for each new immobot.
    [...]
    [...] autonomic processes of an immobot involve a broad range [...] This range of behaviors needed by an immobot makes it difficult and expensive to both manually synthesize high-fidelity autonomic processes and couple these autonomic processes to high-level symbolic reasoning.

    Model-Based Autonomous Systems
    [...] The kernel that we are driving toward is defined by three desiderata: (1) model-based programming, (2) model-based reactive execution, and (3) model-based hybrid systems.
    [...] In contrast, model-based autonomy explores the coordination of hardware and software interactions using a digital controller.

    Model-Based Programming A model-based autonomous system addresses the difficulty of reasoning about systemwide interactions using model-based programming. Model-based programming is based on the idea that the most effective way to amortize software development cost is to make the software plug and play. To support plug and play, immobots are programmed by specifying component models of hardware and software behaviors. A model-based autonomous system combines component models to automate all reasoning about systemwide interactions necessary to synthesize real-time behaviors [...]. [...]
    [...]
    [...] Model-based programming on a large scale is supported by developing languages, compilers, debuggers, and visualization tools that incorporate classical concepts of object-oriented, procedural, and hierarchical abstractions into the modeling language.

    Model-Based Reactive Execution
    The difficulty of precomputing all responses means that a model-based autonomous system must use its models to synthesize timely responses to anomalous and unexpected situations at execution time. Furthermore, the need to respond correctly in time-critical and novel situations means that it must perform deliberative reasoning about the model within the reactive control loop. Although the list of tasks that the model-based execution system must support is seemingly diverse (figure 2), they divide into two basic functions: (1) self-modeling and (2) self-configuring.
    Self-modeling: [...] Although parts of its model are provided a priori using model-based programming, other parts need to be adapted or elaborated using sensor information. [...]
    Self-configuring: [...] To provide immune and regulatory systems, an immobot must be selfconfiguring; it must dynamically engage and disengage component operating modes and adaptive control policies in response to changes in goals, the immobot's internal structure, and the external environment. [...]

    Model-Based Hybrid Systems
    Given the wide range of digital, analog, and software behaviors exhibited by an immobot, developing a model-based approach for coordinating the immobot's autonomic processes requires a rich modeling language and reasoning methods that go well beyond those traditionally used in qualitative and model-based diagnosis.
    Concurrent software: Coordinating, invoking, and monitoring real-time software requires formal specifications of their behavior. These behaviors are modeled by incorporating formal specifications of concurrent transition systems into the model-based programming language. Concurrent transition systems provide an adequate formal semantics for most concurrent real-time languages [...].
    Continuous adaptive processes: Achieving high fidelity requires the merging of symbolic model-based methods with novel adaptive estimation and control techniques (for example, neural nets). [...]
    Stochastic processes: [...]
    [...] The essential property that makes these requirements manageable is the relative immobility of our robots. The system interactions are relatively fixed and known a priori through the component models and their interconnection. The flexibility within the system is largely limited to changes in component modes and control policies and adjustments to parameter values. [...] two implemented systems that exploit immobility to achieve these desiderata. They form two major components of a kernel we envision for maintaining the regulatory and immune systems of immobots.
    [...]
    [...] networked control system for the complete building plus 15 model offices, each of which has been enhanced with a networked microprocessor that controls an array of sensors and actuators [...]
    [...] global model to adaptively predict where the optimum lies [...]
    [...] An immobot can estimate its parameters by adjusting their values until the model best fits the sensor data. [...]
    [...] Moriarty automates key aspects of how a community of modelers decompose, simplify, plan, and coordinate large-scale model-estimation tasks through a technique called decompositional, model-based learning (DML).
    [...] Moriarty decomposes a model into a set of simplest estimators that minimize the dimensionality of the search space and the number of local minima, hence improving learning rate and accuracy. Each estimator, together with the appropriate subset of sensor data, forms a primitive estimation action. Moriarty then plans the ordering and the coordination of information flow between them.
    [...]
    [...] The decomposition of a diagnostic problem is based on the concept of a conflict - a minimal subset of a model (typically in propositional or first-order logic) that is inconsistent with the set of observations [...]. Moriarty's decompositional learning method is based on the analogous concept of a dissent - a minimal subset of an algebraic model that is overdetermined given a set of sensed variables (that is, a dissent is just sufficient to induce an error function). [...] Moriarty uses a dissent-generation algorithm [...] that parallels the conflict-recognition phase of model-based diagnosis.
    Moriarty's simplification step (currently under development) is based on an order-of-magnitude simplification method called caricatural modeling [...].
    [...]
    To summarize, a model-based approach is essential for regulating systems of the size of
    [...]
    [...] Livingstone is a fast, reactive, model-based configuration manager. Using a hierarchical control metaphor, Livingstone sits at the nexus between the high-level feed-forward reasoning of classical planning-scheduling systems and the low-level feedback response of continuous adaptive control methods, providing a kernel for model-based autonomy. Livingstone is distinguished from more traditional robotic executives through the use of deliberative reasoning in the reactive feedback loop. This deliberative reasoning is compositional and model based, can entertain an enormous search space of feasible solutions, yet is extremely efficient because of the ability to quickly focus on the few solutions that are near optimal.
    Three technical features of Livingstone are particularly worth highlighting: First, the approach unifies the dichotomy within AI between deduction and reactivity (Brooks 1991 [Intelligence without Reason, which is also a key paper like the paper Intelligence without representation in relation to the subsumption architecture and the associated fields of Weak Artificial Intelligence (WAI) and Behavior-Based Robotics (BBR) discussed in the comment to the quote of the first document about the field of CPS below] [...]). We achieve a reactive system that performs significant deduction in the sense-response loop by drawing on our past experience at building fast propositional conflict-based algorithms for model-based diagnosis and framing a model-based configuration manager as a propositional feedback controller that generates focused, optimal responses. Second, Livingstone's representation formalism achieves broad coverage of hybrid discrete-continuous, software-hardware systems by coupling the concurrent transition-system models underlying concurrent reactive languages [...] with the qualitative representations developed in model-based reasoning. Third, the long-held vision of model-based reasoning has been to use a single central model to support a diversity of engineering tasks. For model-based autonomous systems, it means using a single model to support a variety of execution tasks, including tracking planner goals, confirming hardware modes, reconfiguring hardware, detecting anomalies, isolating faults, diagnosing, recovering from faults, and safing. Livingstone automates all these tasks using a single model and a single-core algorithm [...].
    [...]
    [...] In this sense, a configuration manager such as Livingstone is a discrete control system that is strategically situated between high-level planning and low-level control [...].

    Model-Based Configuration Management
    Livingstone is a reactive configuration manager that uses a compositional, component-based model of the spacecraft to determine configuration actions [...]. Each component is modeled as a transition system that specifies the behaviors of operating and failure modes of the component, nominal and failure transitions between modes, and the costs and likelihoods [(probabilities)] of transitions [...]. Mode behaviors are specified using formulas in propositional logic, but transitions between modes are specified using formulas in a restricted temporal, propositional logic. [...] The spacecraft transition-system model is a composition of its component transition systems in which the set of configurations of the spacecraft is the cross-product of the sets of component modes. We assume that the component transition systems operate synchronously; that is, for each spacecraft transition, every component performs a transition.
    A model-based configuration manager uses its transition-system model to both identify the current configuration of the spacecraft, called mode identification (MI), and move the spacecraft into a new configuration that achieves the desired configuration goals, called mode reconfiguration (MR). [...]
    In practice, MI and MR need not generate all transitions and control commands, respectively. Rather, just the most likely transitions and an optimal control command are required. We efficiently generate these by recasting MI and MR as combinatorial optimization problems. [...] We efficiently solve these combinatorial optimization problems using a conflict-directed best-first search algorithm.
    [...]
    [...] an autonomous agent architecture that integrates Livingstone with the HSTS planning and scheduling system [...] and a multithreaded smart executive [...] based on RAPS [...] have been selected to fly Deep Space One, forming the core autonomy architecture of NASA's New Millennium Program.
    [...]
    We are only now becoming aware of the rapid construction of a ubiquitous, immobile robot infrastructure that rivals the construction of the World Wide Web and has the potential for profound social, economic, and environmental change."

    Comment
    First of all, we note that nothing happened on a wider scale until we created our Evolutionary operating system (Evoos) and published it in 1999, and then again nothing happened on a wider scale until we created our Ontologic System (OS) with its Ontoverse (Ov) and handheld, immobile robot Ontoscope (Os) and published it in 2006.
    Our OS is also the breakthrough of SoftBionics (SB), our Ov is also the breakthrough of Mixed Reality (MR), and our Os is also the breakthrough of Autonomous System (AS) and Robotic System (RS) in the mass market, which means they are no one-of-a-kind systems anymore.
    Also note that our Ontologic System Architecture (OSA) integrates all in one respectively the cross-product of all basic properties and all sets of included, referenced, and connected works of prior art by being liquid and microservice-oriented, and synchronous and asynchronous.
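
    The mode identification (MI) task described in the quoted Livingstone document can be illustrated as a best-first search over component mode assignments ordered by likelihood. This is a minimal sketch only: the component models, probabilities, and consistency predicate are hypothetical, and Livingstone's actual conflict-directed, propositional algorithm is not reproduced here.

```python
import heapq
import math

def mode_identification(components, consistent):
    """Sketch of MI as best-first search (not Livingstone's algorithm).

    components: {name: {mode: probability}};
    consistent: predicate on a full {name: mode} assignment.
    Returns the most likely assignment consistent with the observations.
    """
    names = list(components)
    # Order each component's modes from most to least likely.
    ranked = {n: sorted(components[n].items(), key=lambda kv: -kv[1])
              for n in names}

    # Search states are index vectors into the ranked mode lists; the cost is
    # the negative log-probability, so the heap pops the most likely first.
    def cost(idx):
        return -sum(math.log(ranked[n][i][1]) for n, i in zip(names, idx))

    start = (0,) * len(names)
    heap = [(cost(start), start)]
    seen = {start}
    while heap:
        _, idx = heapq.heappop(heap)
        assignment = {n: ranked[n][i][0] for n, i in zip(names, idx)}
        if consistent(assignment):
            return assignment
        # Expand successors: demote one component to its next-likely mode.
        for k, n in enumerate(names):
            if idx[k] + 1 < len(ranked[n]):
                nxt = idx[:k] + (idx[k] + 1,) + idx[k + 1:]
                if nxt not in seen:
                    seen.add(nxt)
                    heapq.heappush(heap, (cost(nxt), nxt))
    return None
```

    For example, a valve that is commanded open but produces no flow would be identified as most likely stuck closed, because demoting the valve to its failure mode is cheaper in probability than demoting the pump.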

    The field of Agent-Based System (ABS) includes the robot types

  • software robot (software bot or softbot) (focus directed outward), and
  • model-based Autonomous System (AS) or Immobile Robotic System (ImRS or Immobot) (focus directed inward),

    but also further robot types, including the

  • cognitive robot (focus directed inward and outward),
  • migrating robot, and
  • robot, which integrates two or more or all of them.

    A metaphor is a figure of speech that, for rhetorical effect, directly refers to one thing by mentioning another. "Metaphorically speaking, the main functions of an immobot correspond to the human nervous, regulatory, and immune systems rather than the navigation and perceptual systems being mimicked in mobile robots", and therefore the mobile robot and the immobile robot do not belong to the field of Cognitive Agent System (CAS), including Cognitive Robotic System (CRS).
    We also note that the document deals only with the field of Embedded System (ES), including the field of Networked Embedded System (NES), but it is not about the fields of computing and networking system, operating system (os), Agent-Based System (ABS), and CPS itself and therefore it is also not about the fields of Agent-Based operating system (ABos), Autonomic Computing (AC), and Natural Language Processing (NLP), etc., which implies once again that an immobot is not a Cognitive Robotic System (CRS).

    We quote a first presentation about the field of cognitive robots by one of the authors of the immobot document, which was published in the year 2002: "Introduction in Cognitive Robots
    [...]

    Course Objective 1
    To understand the main types of cognitive robots and their driving requirements:

    • "Immobile" Robots and Engineering Operations
      - Robust space probes, ubiquitous computing
    • Robots That Navigate
      - Hallway robots, Field robots, Underwater explorers, stunt air vehicles
    • Cooperating Robots
      - Cooperative Space/Air/Land/Underwater vehicles, distributed traffic networks, smart dust.

    Portable Satellite Assistant [(PSA)]

    Robonaut: Robotic Assistance For Orbital Assembly and Repair

    Course Objective 2
    To understand advanced methods for creating highly capable cognitive robots.
    [Graphic: (Boxes shown in the graphic are quoted as list points.)]

  • Localize in World
  • Interpret Scenes
  • Plan Activities
  • [...]
  • Execute & Adapt
  • Manage Dialogue
  • Map and Explore
  • Navigation & Manipulation
  • Manipulation

    Accomplished by:
    Lectures on advanced core methods
    [...]

    Lectures: Planning and Acting Robustly
    [...]

    Lectures: Interacting With The World
    Simultaneous Localization and Mapping [(SLAM)]

  • Basic SLAM
  • Vision-based SLAM

    Cognitive Vision (CogV)

  • Visual Interpretation using Probabilistic Grammars
  • Context-based Vision

    Navigation & Manipulation

  • Probabilistic Path Planning
  • Exploring Unknown Environments

    Human - Robot Interaction

  • Discourse Management & Nursebot
  • Social Robotics

    Lectures: Fast, Large-scale Reasoning
    [...]

    Topics On Cognitive Robot Capabilities

  • Robots that Plan and Act in the World
    - Robots that Deftly Navigate
    - Planning and Executing Complex Missions
  • Robots that Are State-Aware
    - Robots that Find Their Way In The World
    - Robots that Deduce Their Internal State
  • Robots that Preplan For An Uncertain Future
    - Theoretic Planning in a Hidden World
    - State and Fault Aware Systems

    Course Objective 3

  • To dive into the recent literature, and collectively synthesize, clearly explain and evaluate the state of the art in cognitive robotics.

    Course Objective 4

  • To apply one or more core reasoning methods to create a simple agent that is driven by goals or rewards"
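
    Course Objective 4 asks for a simple agent driven by goals or rewards. As an illustration only (the corridor environment, reward, and all parameters below are our own assumptions, not taken from the course), a minimal tabular Q-learning agent can be sketched as:

```python
import random

def q_learning(n_states=6, goal=5, episodes=500, alpha=0.5, gamma=0.9,
               eps=0.2, seed=0):
    """Sketch of a reward-driven agent in a toy corridor world.

    The agent starts at state 0 and earns a reward of 1 for reaching the
    goal state; actions are -1 (left) and +1 (right), clamped to the corridor.
    Returns the greedy policy learned for the non-goal states.
    """
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(n_states) for a in (-1, +1)}
    for _ in range(episodes):
        s = 0
        while s != goal:
            # Epsilon-greedy action selection.
            if rng.random() < eps:
                a = rng.choice((-1, +1))
            else:
                a = max((-1, +1), key=lambda act: q[(s, act)])
            s2 = min(max(s + a, 0), n_states - 1)
            r = 1.0 if s2 == goal else 0.0
            # Standard Q-learning update toward the bootstrapped target.
            q[(s, a)] += alpha * (r + gamma * max(q[(s2, b)] for b in (-1, +1))
                                  - q[(s, a)])
            s = s2
    return [max((-1, +1), key=lambda act: q[(s, act)])
            for s in range(n_states - 1)]
```

    After training, the greedy policy moves right toward the goal from every non-goal state.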

    Comment
    See the next comment.

    We quote a second presentation about the field of cognitive robots by one of the authors of the immobot document, which was published in the year 2003: "Introduction in Cognitive Robots
    [...]

    Course Objective 1
    To understand the main types of cognitive robots and their driving requirements:

    • Robots That Navigate
      - Hallway robots, Field robots, Underwater explorers, stunt air vehicles
    • Engineering and "Immobile" Robots
      - Intelligent spaces
      - Robust space probes
    • Cooperating Robots
      - Cooperative Space/Air/Land/Underwater vehicles, distributed traffic networks, smart dust.

    [...]

    Portable Satellite Assistant [(PSA)]

    Course Objective 2
    To understand fundamental methods for creating the major capabilities of cognitive robots.
    [Graphic: (Boxes shown in the graphic are quoted as list points.)]

  • Locate in World
  • [...]
  • Plan
  • [...]
  • Execute
  • Map
  • Navigate

    Accomplished by:
    Lectures on core methods
    [...]

    Topics On Cognitive Robot Capabilities

  • Robots that Plan and Act in the World
    - Robots that Deftly Navigate
    - Planning and Executing Complex Missions
  • Robots that Are State-Aware
    - Robots that Find Their Way In The World
    - Robots that Deduce Their Internal State
  • Robots that Preplan For An Uncertain Future
    - Theoretic Planning in a Hidden World
    - State and Fault Aware Systems

    Course Objective 3

  • To dive into the recent literature, and collectively synthesize, clearly explain and evaluate the state of the art in intelligent embedded systems.

    Course Objective 4

  • To apply one or more core reasoning methods to create a simple agent that is driven by Goals or Rewards

    [...]

    Model-based Autonomy
    Programmers generate breadth of functions from commonsense models in light of mission goals.

  • Model-based Programming
    • Program by specifying commonsense, compositional declarative models.
  • Model-based Planning, Execution and Monitoring
    • Provide services that reason through each type of system interaction from models.
    • on the fly reasoning requires significant search & deduction within the reactive control loop.

    [...]

    Example of a Model-based Robot:

  • Goal-directed
  • First time correct
    • projective
    • reactive
  • Commonsense models
  • Heavily deductive

    [Graphic: (Boxes shown in the graphic are quoted as list points.)]

  • Remote Agent
  • [...]
  • Scripts
  • Mission-level actions & resources
  • component models

    Conventional Wisdom: Reservations about Intelligent Embedded Systems

  • "[For reactive systems] proving theorems is out of the question" [Agre & Chapman 87]

    How can general deduction achieve reactive time scales?
    [...]
    Developed RISC-like, deductive kernel (OPSAT)

    Can model-based agents perform many different types of reasoning from a common model?
    [Graphics: (One graphics has been shown as Figure 8. Transition-System Model of a Valve. in the document quoted above.)]
    Transition Systems + Constraints + Probabilities

    [...]

    Executing Temporal Plans
    [...]

    Propagating Timing Constraints Can Be Costly
    [Graphic: (Boxes shown in the graphic are quoted as list points.)]

  • Executive
  • Controlled System

    Solution: Compile Temporal Constraints to an Efficient Network
    [...]

    [...]

    Model-based Execution of Activities
    [...]
    Estimating Modes
    [...]
    Reconfiguring Modes

    Model-based Execution as Stochastic Optimal Control
    [Graphic: (Boxes and labels shown in the graphic are quoted as list points.)]

  • Controller
    • Model
    • Goal
    • mode Estimation
    • mode reconfiguration
  • Plant

    Livingstone

    Models

  • modes engage physical processes
    • encoded as finite domain constraints
  • probabilistic automata for dynamics
  • Concurrency to model multiple processes

    Model-based Execution as Stochastic Optimal Control
    [Graphic: (Boxes and labels shown in the graphic are quoted as list points.)]

  • Controller
    • [...]
    • mode Estimation
      • Current Belief State
    • [...]
  • [...]"

    Comment
    The first presentation was updated by the second presentation.
    Note how the author tried in the

  • first presentation to present the field of intelligent Agent-Based System, specifically the field of Model-Based Autonomous System (MBABS) or Immobile Robotic System (ImRS or Immobot), as the field of Cognitive Agent System (CAS), specifically cognitive robotics, and
  • second presentation to present the field of cognitive robotics as the field of intelligent embedded systems and to reduce the advanced methods to fundamental methods

    to classify prior art in a way that it becomes prior art in relation to our Evolutionary operating system and other works. We have observed this unwanted societal, legal, scientific, and economic marketing trick in relation to other intelligent ABS as well, specifically in relation to the

  • Belief-Desire-Intention (BDI) software model for rational or intelligent ABS, and
  • field of blackboard-based system, including for example the Procedural Reasoning System (PRS) (based on BDI) and Cognitive Agent Architecture (Cougaar).

    The model-based planning, execution and monitoring, and also reactive closed loop reminds us of the closed loops of

    • Deming circle or Plan, Do, Check, Act (PDCA) cycle,
    • Quality Management (QM) circle (planning, controlling, executing, monitoring and evaluating),
    • Operations Management (OM) cycle (planning, implementing, evaluating, reporting), and
    • Product Lifecycle Management (PLM) (create, produce, operate, and service).

    A Reduced Instruction Set Computer (RISC)-like, deductive kernel is a kernel that operates between hardware and software applications and hence is an operating system kernel, which points directly to our Evoos once again.
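
    The step "compile temporal constraints to an efficient network" in the quoted slides is commonly realised with a Simple Temporal Network (STN), whose all-pairs shortest-path closure supports fast dispatching and detects inconsistency as a negative cycle. A minimal sketch follows; the events and bounds in the usage are illustrative assumptions, not taken from the slides.

```python
INF = float("inf")

def compile_stn(n, constraints):
    """Compile simple temporal constraints into an all-pairs distance graph.

    constraints: list of (i, j, lo, hi) meaning lo <= t_j - t_i <= hi.
    Returns the minimal distance matrix, or None if the constraints are
    temporally inconsistent.
    """
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j, lo, hi in constraints:
        d[i][j] = min(d[i][j], hi)    # encodes t_j - t_i <= hi
        d[j][i] = min(d[j][i], -lo)   # encodes t_i - t_j <= -lo
    # Floyd-Warshall closure; a negative diagonal entry afterwards means a
    # negative cycle, i.e. the temporal constraints cannot all be satisfied.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    if any(d[i][i] < 0 for i in range(n)):
        return None
    return d
```

    After compilation, each entry d[i][j] is the tightest upper bound on t_j - t_i, so an executive can dispatch events against the network with simple lookups instead of re-propagating constraints on the fly.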

    We quote a document about Holonic Manufacturing Systems (HMS), which was published in 1999: "
    Abstract
    This article presents a new approach to the design of the architecture of a computer-integrated manufacturing (CIM) system. It starts with presenting the basic ideas of novel approaches which are best characterised as fractal or holonic models for the design of flexible manufacturing systems (FMS). The article discusses hierarchical and decentralised concepts for the design of such systems and argues that software agents are the ideal means for their implementation. The agent architecture InteRRaP for agent design is presented and is used to describe a planning and control architecture for a CIM system, [...]
    [...]

    Introduction
    [...] [20 [The Fractal Company - A Revolution in Corporate Culture. 1995]] adopted the metaphor of fractals to describe a model for a flexible manufacturing system (FMS) in which self-contained entities organise themselves without the power of an external force. In a related approach the term holon is used to describe an identifiable entity of an FMS which can itself be decomposed into entities of similar structure. The term holon is a combination of the Greek word holos, meaning whole and the suffix on meaning particle or part. The Hungarian author and philosopher Arthur Koestler [13 [The Ghost in the Machine. 1989]] proposed it to describe a basic unit of organisation in biological and social systems. Koestler observed that in living organisms and in social organisations entirely self-supporting, non-interacting entities did not exist. Every identifiable unit of organisation, such as a single cell in an animal or a family unit in society, comprises more basic units (plasma and nucleus, parents and siblings) while at the same time forming a part of a larger unit of organisation (muscle tissue or a community). Hence, a holon is an identifiable part of a system, which has a unique identity, yet is made up of sub-ordinate parts and in turn is part of a larger whole. Holonic manufacturing (HMS) [3 [Holonic Manufacturing Systems - Initial Architecture and Standard Directions. 1994], 5 [A cooperation framework for holonic interactions in manufacturing. 1994], 11 [Holonic planning and scheduling architecture for manufacturing. 1994]] tries to apply these ideas in a manufacturing context. One of the basic elements of the holonic manufacturing paradigm is the incorporation of hierarchy into distributed systems to combine reactivity to disturbances with high quality predictable performance [1].

    The Agent Architecture InteRRaP
    [...]

    Holonic Design Of A Flexible Manufacturing System
    [...]
    The decentralised model offers robustness and agility with respect to uncertainties in task execution. The major advantages of the introduction of centralised planning and controlling instances and in doing so maintaining hierarchical structures in distributed control are predictability, opportunities for performance optimisation, and an easier migration path from current to distributed systems [1]. There are already well-established layers of abstraction in the control of an FMS [...]: production planning and control [(PPC)], shop floor control [(SFC)], flexible cell control [(FCC)], autonomous system control [(AS)], and machine control. Each of these layers has a clearly defined scope of competence. The idea of HMS is now on the one hand to combine hierarchical and decentralised control and on the other hand to allow flexible self-organisation of the hierarchical system. In Figure 2 we can see holons on each of the five layers: at the lowest layer, the physical body of an autonomous system (i.e. an autonomous robot or a machine tool) together with its controlling agent and, at the higher layers, the flexible cell control system, the shop floor control system, and the production planning and control system each of which represent a whole group of holons to the outside environment.
    Each planning and controlling entity has to handle (1) co-ordination with the entities on the same and the next higher level, (2) the actual local problem solving (i.e. planning), and (3) the execution of the computed solution. Section 2 has already introduced the agent architecture InteRRaP, which supports this approach with its three-layered structure. To not confuse the layers of the agent architecture with the layers of the hierarchical planning and controlling structure, we call the latter FMS layers. However, from this trouble in naming concepts we can see that we actually do find nested structures of similar kind; just as the holonic framework suggests.

    The FMS Layer: Production Planning and Control
    [...] The more deterministic the execution of the manufacturing orders, the more it is possible for the PPC system to predict when the execution of client orders will be finished.

    The FMS Layer: Production Planning and Control
    [...]

    The FMS Layer: Autonomous Systems
    This section illustrates in more detail how the agent architecture InteRRaP is used to design the autonomous systems (AS) in a flexible manufacturing system. Note that although this article outlines only the InteRRaP agents for the AS FMS layer, the PPC system and SFC system are also designed according to this basic architecture. [...] In the co-operative planning layer the ASs communicate with the superior SFC or FCC system and with other ASs. Task planning, i.e., the actual problem solving process is done in the local planning layer. Finally, the behaviour-based layer controls the actual task execution process.

    The Co-operative Planning Layer
    [...]

    The Local Planning Layer
    [...]

    The Behaviour-Based Layer
    [...] The solution to such a task may be specified with the help of sensor/actor networks [4 [Minimalist Mobile Robotics. 1990], 2 [A Robust Layered Control System for a Mobile Robot. 1986]]. [...] In the future it is likely that the controlling units of such systems will offer the computational power of today's workstations and, therefore, the AS FMS layer and the machine control FMS layer will grow together.

    Model for a Flexible Manufacturing Plant
    [...] identified by a barcode decoder [...]

    Production Line
    [...]
    [...] We call a setting with a fixed number of production units a topology [...]. A topology specifies a concrete geometrical position and orientation for each production unit and defines the paths which workpieces can take to pass from one production unit to the next one. [...] Therefore, we designed the control architecture in a generic manner so that it can control any desired topology. We can now evaluate the performance of different topologies in a simulation environment. In an optimisation cycle a new topology is generated, a schedule for this topology is generated, the execution of the schedule is simulated [...], a multi-agent system controls the simulation and reacts to machine faults with online optimisation strategies, and the result of the simulation is evaluated [...].

    Experimental Results
    [...] we need intelligent execution strategies in the autonomous systems when we want to describe the task to be solved in the SFC as a pure scheduling problem, [...]. If we actually had an optimal schedule with respect to the mathematical model used in the SFC, we would have to verify that the execution of this schedule using the intelligent execution strategies would maintain the optimality of this schedule. [...]

    Conclusion
    [...]
    [...] we are investigating constraint-based planning techniques to describe the agent's reasoning, which give us a unique framework to describe and solve scheduling problems and task planning. [...]"

    Comment
    The document is based on a work about Balanced Automation SyStems (BASYS) respectively Agent-Based Design of Holonic Manufacturing Systems, which was published in September 1998.
    It should be very easy to recognize the holon in our Evoos.
    See also the Investigations::Multimedia, AI and Knowledge management of the 22nd of February 2017 about Basic System Industry 4.0 (BaSys 4.0), which is a related attempt to continue the stealing of our Evoos and OS, which already began in the year 1999 or even in the year 1989.
    Note in this relation the repetition of the attempt to steal the middleware functionality of our Evoos and OS, which obviously does not work for artistical, technological, and legal reasons.
    Also note that we have the foundation of microService-Oriented Architecture (mSOA) and other Service-Oriented Technologies (SOx) included in our Evoos and the Space-Based Architecture (SBA) included in our OS as basis. See also Java Jini and Agent-Based System (ABS), such as the actor + agent = ActorAgent system and Multi-Agent System (MAS) (see for example the OntoLix and OntoLinux Website update of the 10th of March 2019 and the Further steps of the 27th of March 2019, and also the Clarification of the 18th of January 2020).

    Especially interesting for us is the fact that an agent architecture layered across all FMS layers is virtually rejected, despite the acknowledgement of the "nested structures of similar kind" of the agents of the different FMS layers and the layered agent architecture.
    "Each of these layers has a clearly defined scope of competence. The idea of HMS is now on the one hand to combine hierarchical and decentralised control and on the other hand to allow flexible self-organisation of the hierarchical system."
    But an overall agent system is therefore excluded.
    With Evoos with its foundation of mSOA and OS with its liquid property of its Ontologic System Architecture (OSA) we generalized this. We also added an ontological, logical, and mathematical model with our new Zero Ontology, and the basic properties of (mostly) being validated and verified for optimal intelligent strategies and functional operations.
    The combination of hierarchical and decentralised control and the allowance of flexible self-organization of the hierarchical system are also the reason why we described our OSA as a layered architecture, which nevertheless integrates all in one.

    We also note that the second example of the presented Holonic Manufacturing System (HMS) describes an optimization cycle of a production line, which is related to a Deming cycle or Plan, Do, Check, Act (PDCA) cycle.
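
    The quoted optimisation cycle (generate a topology, generate a schedule for it, simulate its execution, evaluate the result, repeat) can be sketched as a loop. Every function body and the makespan model below are hypothetical stand-ins of our own, not the HMS implementation.

```python
import random

def optimise(n_units, jobs, iterations=50, seed=0):
    """Sketch of a generate-schedule-simulate-evaluate optimisation cycle.

    n_units: number of production units; jobs: list of job durations.
    Returns the best topology found and its evaluated makespan.
    """
    rng = random.Random(seed)
    best_topology, best_makespan = None, float("inf")
    for _ in range(iterations):
        # Generate: a random assignment of production units to positions.
        topology = rng.sample(range(n_units), n_units)
        # Schedule: greedy - each job goes to the currently least-loaded unit.
        load = [0.0] * n_units
        for duration in jobs:
            u = min(range(n_units), key=lambda i: load[i])
            load[u] += duration
        # Simulate/evaluate: makespan plus a toy transport penalty that
        # depends on the unit ordering (a stand-in for workpiece paths).
        transport = sum(abs(topology[i] - i) for i in range(n_units))
        makespan = max(load) + 0.1 * transport
        if makespan < best_makespan:
            best_topology, best_makespan = topology, makespan
    return best_topology, best_makespan
```

    In the quoted system the "simulate" step is a multi-agent simulation that reacts to machine faults; here it is collapsed into a closed-form cost purely for illustration.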

    In relation to the fields of Holonic Manufacturing System (HMS) and Holonic Multi-Agent System (HMAS) see also the first document about the field of CPS quoted and commented below.

    We quote an online encyclopedia about the subject Wireless Sensor Network (WSN): "Wireless sensor networks (WSNs) refer to networks of spatially dispersed and dedicated sensors that monitor and record the physical conditions of the environment and forward the collected data to a central location. [...]
    These are similar to wireless ad hoc networks in the sense that they rely on wireless connectivity and spontaneous formation of networks so that sensor data can be transported wirelessly. [...] Modern networks are bi-directional, both collecting data[2] and enabling control of sensor activity.[3] The development of these networks was motivated by military applications such as battlefield surveillance.[4 [Wireless sensor networks for battlefield surveillance. [2006]]] Such networks are used in industrial and consumer applications, such as industrial process monitoring and control and machine health monitoring.
    A WSN is built of "nodes" - from a few to hundreds or thousands, where each node is connected to other sensors. [...]
    [...]

    Application
    [...]

    Industrial monitoring
    Machine health monitoring
    Wireless sensor networks have been developed for machinery condition-based maintenance (CBM) as they offer significant cost savings and enable new functionality.[18]
    Wireless sensors can be placed in locations difficult or impossible to reach with a wired system, such as rotating machinery and untethered vehicles.
    [...]

    Threat detection
    The Wide Area Tracking System (WATS) is a prototype network for detecting a ground-based nuclear device[21 [A national strategy against terrorism using weapons of mass destruction. [latency January/February 1998. 1997]]] such as a nuclear "briefcase bomb." WATS is being developed at the Lawrence Livermore National Laboratory (LLNL). WATS would be made up of wireless gamma and neutron sensors connected through a communications network. Data picked up by the sensors undergoes "data fusion", which converts the information into easily interpreted forms; this data fusion is the most important aspect of the system.[22][obsolete source]
    The data fusion process occurs within the sensor network rather than at a centralized computer and is performed by a specially developed algorithm based on Bayesian statistics.[23 [Sensing for Danger. [Issue July/August 2001. 2001]]] WATS would not use a centralized computer for analysis because researchers found that factors such as latency and available bandwidth tended to create significant bottlenecks. Data processed in the field by the network itself (by transferring small amounts of data between neighboring sensors) is faster and makes the network more scalable.[23]
    An important factor in WATS development is ease of deployment, since more sensors both improves the detection rate and reduces false alarms.[23] WATS sensors could be deployed in permanent positions or mounted in vehicles for mobile protection of specific locations. One barrier to the implementation of WATS is the size, weight, energy requirements and cost of currently available wireless sensors.[23] The development of improved sensors is a major component of current research at the Nonproliferation, Arms Control, and International Security (NAI) Directorate at LLNL.
    WATS was profiled to the U.S. House of Representatives' Military Research and Development Subcommittee on October 1, 1997 during a hearing on nuclear terrorism and countermeasures.[22] On August 4, 1998 in a subsequent meeting of that subcommittee, Chairman Curt Weldon stated that research funding for WATS had been cut by the Clinton administration to a subsistence level and that the program had been poorly re-organized.[24]
    [...]

    Platforms
    [...]

    Online collaborative sensor data management platforms
    Online collaborative sensor data management platforms are on-line database services that allow sensor owners to register and connect their devices to feed data into an online database for storage and also allow developers to connect to the database and build their own applications based on that data. [...] Such platforms simplify online collaboration between users over diverse data sets ranging from energy and environment data to that collected from transport services. Other services include allowing developers to embed real-time graphs & widgets in websites; analyse and process historical data pulled from the data feeds; send real-time alerts from any datastream to control scripts, devices and environments.
    The architecture of the [...] system [...] describes the key components of such systems to include APIs and interfaces for online collaborators, a middleware containing the business logic needed for the sensor data management and processing and a storage model suitable for the efficient storage and retrieval of large volumes of data.

    Simulation
    At present, agent-based modeling and simulation is the only paradigm which allows the simulation of complex behavior in the environments of wireless sensors (such as flocking).[44] Agent-based simulation of wireless sensor and ad hoc networks is a relatively new paradigm. Agent-based modelling was originally based on social simulation.
    [...]

    Other concepts
    [...]

    Data integration and sensor web
    The data gathered from wireless sensor networks is usually saved in the form of numerical data in a central base station. [...] standards for interoperability interfaces and metadata encodings that enable real time integration of heterogeneous sensor webs into the Internet, allowing any individual to monitor or control wireless sensor networks through a web browser."

    Comment
    One successor of WATS seems to be the Sensor Web of the NASA (see the next quote).

    Indeed, Sensor Networks (SNs), which

  • have the capability of in-network processing utilized for data fusion and aggregation were known before the presentation of our Evoos, and
  • utilize mobile phones as traffic probes, which are
    • equipped with a Global Positioning System (GPS) receiver or
    • tracked through triangulation from nearby cell base station towers,

    were known before the presentation of our OS.
    But calling an SN a CPS requires that it has the capabilities to react to, reason about, and manipulate spatial, temporal, or spatio-temporal sensory information, which leads us back to the start, because with spatial and temporal awareness respectively situational awareness and self-awareness of a brain, and sensoric and motoric primitives this is (at least) our Evoos.
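
    The in-network processing for data fusion mentioned above can be illustrated with the simplest Bayesian case: fusing independent Gaussian estimates by inverse-variance weighting, so that a node forwards a single summary instead of raw readings. This is our own minimal stand-in, not the algorithm developed for WATS.

```python
def fuse(estimates):
    """Fuse independent Gaussian estimates by inverse-variance weighting.

    estimates: list of (mean, variance) pairs from a node and its
    neighbours. Returns the fused (mean, variance).
    """
    # The fused precision is the sum of the individual precisions, and the
    # fused mean weights each estimate by its precision.
    precision = sum(1.0 / v for _, v in estimates)
    mean = sum(m / v for m, v in estimates) / precision
    return mean, 1.0 / precision
```

    Because the fused variance is always smaller than any input variance, each hop both compresses the data and sharpens the estimate, which matches the scalability argument made for performing fusion inside the network.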

    Also note that Sensor Networks (SNs) had limitations due to energy and bandwidth, which was also noted in relation to CPS in the year 2006. Our OS includes CPS 1.0, but was already created and designed for our vision of a mobile communications standard, which is a foundation of our

  • 5G Next Generation (5G NG), 6G, and so on based on Software-Defined Networking (SDN), Network Function Virtualization (NFV), Virtualized Network Function (VNF), Cloud-native Network Function (CNF), Space-native Network Function (SNF), and Software-Defined Infrastructure (SDI), and also MANagement and Orchestration (MANO), and
  • CPS 2.0.

    The field of sensor network or sensor web is also important, because such a sensor network is distributed and tightly coupled (an individual node is inept in monitoring and regulating, all nodes constitute a single instrument). But there is no model-based planning and scheduling, reasoning, etc.
    On the basis of the Distributed operating system (Dos) TUNES OS our Evoos viewed

  • as a cybernetic reflection, extension, and augmentation of a brain, a Central Nervous System (CNS), and an Autonomic Nervous System (ANS) of a living being, and also
  • in relation to Cybernetical Intelligence (CI)

    allows the massively distributed and tightly coupled sensors and actuators of an Autonomous System (AS) and Robotic System (RS) to be massively distributed and loosely coupled as well.

    We quote a website about the wireless sensor network Sensor Web of the National Aeronautics and Space Administration (NASA): "JPL Sensor Webs Project
    A New Class of Instruments for Monitoring and Exploring Environments.
    Technology is continuously changing and improving at JPL. We encourage discovery of breakthroughs by identifying promising ideas and supporting their development. The JPL Sensor Webs Project is an example of a great idea growing into an innovative technology.
    The Sensor Web is an independent network of wireless, intra-communicating sensor pods, deployed to monitor and explore a limitless range of environments. This adaptable instrument can be tailored to whatever conditions it is sent to observe.
    Explore the JPL Sensor Webs Project site to discover more about this emerging technology. [...]

    Sensor Webs
    Goal: Knowledge from Information
    A New Instrument Paradigm: A Macro-Instrument of Distributed Transducers Communicating Via Wireless for Information Sharing
    Operating Principle: Sensors on individual pods collect data and this information is hopped along and shared with other pods until an uplink point is reached.
    [...]

    Sensor Web Advantages
    Recursively Based Instrument (a node can be another web)
    Flexible Concept (land / aqueous / atmosphere / space / crafts)
    Open Architecture (scaleable)
    Local Decision Making and Analysis
    Distributed Control
    Macroscopic Web Intelligence (information synthesis /emergent behavior)
    Cheap / Economy of Scale
    Fault Tolerant / Built-In Redundancy

    The Interweb [] A Global Virtual Presence
    A Node Can Be A Web
    In Situ Sensing
    Remote sensing
    The Internet"

    Comment
    Sensor Web
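
    The operating principle quoted above, where sensor readings are hopped along and shared with other pods until an uplink point is reached, can be sketched as a breadth-first relay. The Pod class, the pod names, and the reading value are purely illustrative assumptions and not NASA's implementation:

```python
from collections import deque

class Pod:
    """A hypothetical sensor pod: it measures locally and relays data."""
    def __init__(self, name, is_uplink=False):
        self.name = name
        self.is_uplink = is_uplink
        self.neighbors = []
        self.shared = {}          # readings seen by this pod

def hop_to_uplink(origin, reading):
    """Flood a reading pod-to-pod (breadth-first) until an uplink pod is
    reached. Returns the name of the uplink that received it, or None."""
    queue = deque([origin])
    visited = {origin.name}
    while queue:
        pod = queue.popleft()
        pod.shared[origin.name] = reading      # information sharing along the way
        if pod.is_uplink:
            return pod.name
        for n in pod.neighbors:
            if n.name not in visited:
                visited.add(n.name)
                queue.append(n)
    return None

# A tiny chain: a -- b -- c, where c is the uplink point.
a, b, c = Pod("a"), Pod("b"), Pod("c", is_uplink=True)
a.neighbors, b.neighbors, c.neighbors = [b], [a, c], [b]
print(hop_to_uplink(a, 21.5))   # the reading hops a -> b -> c
```

    Note that, exactly as criticized in the comments below, such a web only collects, shares, and transports data to an uplink; nothing in this sketch plans, schedules, or reasons.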

    We quote a webpage, which is about the MicroEdition of the Agent-Based System (ABS) Cougaar (for Cognitive Agent Architecture) of the DARPA and the wireless sensor network Sensor Web of the NASA: "The Cougaar MicroEdition project has developed a JAVA based distributed agent software suite derived from the Cougaar architecture that executes on very small wireless computing devices integrated with sensor and robotic packages for sensor-web and cooperative robotic applications. This project provides the necessary programming environment and distributed application tools to facilitate the development of sophisticated, self-organizing networks of wireless connected agent systems for military and space applications. The project culminated in the demonstration of the utility of such systems by connecting a simple robotic sensor-web with a fully enabled Cougaar society, providing distributed data and information management down to the simple sensor packages.
    The project is focused on the extension to the Cougaar infrastructure to support operation on resource-limited embedded devices. These sensing and acting agents operate as a wireless web to perform data fusion operations that are not possible with single-sensor systems. They are interoperable with Cougaar agents, which can perform higher-level aggregation, additional data fusion and problem solving using the Cougaar cognitive model. The project is being carried out in cooperation with the NASA Jet Propulsion Laboratory's SensorWebs project ([see the quoted website before]).

    [...]

    Version 1.0 Release:
    A number of people have been interested in exploring the CougaarME capabilities. Here is the current release for community review and comment. This release and the example code includes the demonstration from the December 2000 workshop as well as the robotics demonstration described above.
    [...]

    Objective:
    Extend the Cougaar distributed agent technology to support embedded devices in a way that is:

  • Multi-layered: One size does not fit all, but it must be interoperable with current Cougaar societies
  • Randomly-distributed: Embedded agents need spatial awareness. Many must sense their position, surroundings
  • Self-organizing: Embedded agents must discover their neighbors. They must advertise their services and coordinate the use of other agents services.

    [...]

    Architecture Concept:
    The concept is that one CougaarSE (Standard Edition) would manager several CougaarME modules. These modules provide access to sensors and actuators in the real world, effectively serving as the eyes and hands for the agent in the real world. [...]
    When you extend this concept to a robotics platform - the most obvious place to explore this kind of real world interface capability, we would have a physical architecture [....]"

    Comment
    In fact, Cougaar was developed as part of the Advanced Logistics Project (ALP) of the Defense Advanced Research Projects Agency (DARPA) and is only a blackboard-based middleware for "large-scale distributed agent-based applications", which has

  • blackboards, which are local to the agents,
  • pluggable capabilities such as the Peer-to-Peer (P2P) protocol specification Juxtapose (JXTA) (P2P protocols defined as messages in eXtensible Markup Language (XML)), and
  • a connection to the Control of Agent-Based Systems (CoABS) Grid of the DARPA, which is based on Jini, which is based on JavaSpaces, which in turn is based on the tuple space model, or being more precise, the Linda tuple space system.
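
    The chain mentioned in the last point, Jini on JavaSpaces on the Linda tuple space model, can be illustrated with a minimal tuple space offering the classic out (write), rd (read by pattern), and in (read and remove) operations. The class and the example tuples are illustrative sketches, not code of any of those systems:

```python
class TupleSpace:
    """Minimal Linda-style tuple space: out() writes a tuple, rd() reads by
    pattern, in_() reads and removes (in is a Python keyword, hence in_).
    None in a pattern acts as a wildcard."""
    def __init__(self):
        self.tuples = []

    def out(self, tup):
        self.tuples.append(tup)

    def _match(self, pattern, tup):
        return len(pattern) == len(tup) and all(
            p is None or p == t for p, t in zip(pattern, tup))

    def rd(self, pattern):
        for tup in self.tuples:
            if self._match(pattern, tup):
                return tup
        return None

    def in_(self, pattern):
        tup = self.rd(pattern)
        if tup is not None:
            self.tuples.remove(tup)
        return tup

space = TupleSpace()
space.out(("temperature", "node7", 21.5))
space.out(("status", "node7", "ok"))
print(space.rd(("temperature", None, None)))   # matches by wildcard pattern
print(space.in_(("status", "node7", None)))    # matches and removes
```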
    The integration of such a Multi-Agent System (MAS) with a Sensor Network (SN) took place in 2001.
    In 2000, a movement began to designate an Intelligent Agent-Based System (IABS) as a Cognitive Agent System (CAS) or Cognitive Agent Architecture as a reaction to our work on our Evoos.
    Therefore, we do not call Cougaar a CAS, but an ABS and MAS.
    But Cougaar is also related to our Evolutionary operating system (Evoos) and even based on it in part, so that our transformation from an Intelligent Agent-Based System (IABS) to a Cognitive Agent System (CAS) and a Cognitive Robotic System (CRS) with Evoos is becoming more and more decisive once again, specifically in this case of CPS.
    Our Evoos is based on Artificial Life (AL) and self-organization, specifically self-organizing neural networks.

    We quote a first document about the field of CPS, which was published in September 2006: "In this [position] paper we discuss some of the technical challenges that need to be addressed in order to interface and manipulate the physical wor[l]d. We also make some considerations regarding the requirements of cyber-physical systems.

    I. General Considerations
    Cyber-physical systems (CPS) will soon redefine how we perceive and interact with the physical world. Using commercially available hand-held devices we will be able to observe, change and even customize certain aspects of the physical environment [or substratum] that traditionally were beyond reach.
    [...] "programmable car" [...]
    This example attempts to illustrate one of the desirable characteristics of the next generation of CPS: the possibility to decouple, within certain limits, the physical environment or substratum from what is perceived by the end user.
    In the remaining paper we will speculate about the necessary abstractions for locally physical but globally virtual CPS. We will pay especial attention to the extent that the physical [environment or] substratum will shape the abstractions as well as the physical network of embedded sensors, actuators, [and] computing and communicating devices.

    II. Topological Abstractions Of The Physical World
    CPS usually comprise a network of physically distributed embedded sensors and actuators equipped with computing and communicating capabilities. Although each individual device is fairly inept at monitoring or regulating the physical [environment or] substratum, the coordinated action of the individual network nodes has the potential for unprecedented capabilities.
    [...]

    III. Self-Organizing, Distributed, In-Network Computation
    Current research on CPS emphasizes the use of networked embedded systems as distributed sensing and data gathering devices [(for example as part of a Sensor Network (SN))]. The natural next step is to consider actuation, thus moving from a passive framework, where information is extracted from the physical world, to an active framework, where information is sensed, processed and used within the network.
    [...]"

    Comment
    We can keep it simple and short. See the fields of Holonic Manufacturing System (HMS) (see the quoted document above) and Holonic Multi-Agent System (HMAS), which have the features of all 3 points, and note once again that our Evoos is reflective, fractal or holonic, and holologic.

    But we would also like to discuss some details in the rest of this comment.

    As shown with the comment related to Cognitive-Affective Processing System (CAPS) above and a more complete comparison given in the summarizing comment after the quoted materials below, our Evoos is a CAPS with a Human-Machine Interface (HMI) for interaction, which has these basic features and functionalities.

    Also important to note are the facts that

  • on the one hand networks of sensors have the advantages of local decision making and analysis, distributed control, and macroscopic web intelligence (information synthesis and emergent behavior), but they are not self-regulating, self-organizing, and self-managing autonomous and autonomic systems, because the individual sensor nodes collect data and share and transport them until an uplink point is reached, and
  • on the other hand networks of actuators comparable to networks of sensors were not known before the presentation of our Evoos, since the required energies for actuators and the protection against malfunctions place considerably higher demands on the nodes and the networking, and the number of actuators is much smaller than the number of sensors.

    Indeed, at least one related concept based on sensors and actuators in the field of Robotic System (RS) existed at that time with the reactive robotic control architecture called subsumption architecture (A robust layered control system for a mobile robot. 1986), which is heavily associated with the fields of Weak Artificial Intelligence (WAI) and Behavior-Based Robotics (BBR) (Intelligence without representation. 1991). BBR does not use preset calculations to tackle a situation, but relies on adaptability, and has been developed further into the field of Biology, Electronics, Aesthetics, and Mechanics (BEAM) robotics (see also the webpage Philosophy of BEAM Robotics of the website of OntoLinux).
    Consequently, this basic RS concept or architecture has no concept or abstraction of a locally physical (e.g. sensors, actuators, and network wires, and also body), but globally virtual (e.g. smart, intelligent, cognitive, mental overall control and operating systems; mind) system, or even a CPS, and therefore

  • lacks the notions of in-network computation and of coordinated or orchestrated action by an overall system, and
  • is not utilized for monitoring or regulating the physical environment or substratum, but merely for acting within it

    besides its many other deficits.
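
    The subsumption architecture mentioned above arbitrates among layered reactive behaviors, with a higher layer suppressing the layers below it. A minimal sketch with two hypothetical behaviors, making no claim to reproduce the original 1986 design:

```python
class Behavior:
    """A reactive layer: no world model, no planning, just stimulus-response."""
    def applicable(self, percept):
        raise NotImplementedError
    def act(self, percept):
        raise NotImplementedError

class Avoid(Behavior):
    """Higher layer: takes over whenever an obstacle is sensed."""
    def applicable(self, percept):
        return percept.get("obstacle", False)
    def act(self, percept):
        return "turn-away"

class Wander(Behavior):
    """Lowest layer: always-applicable default behavior."""
    def applicable(self, percept):
        return True
    def act(self, percept):
        return "move-forward"

def subsume(layers, percept):
    """The highest applicable layer subsumes (suppresses) the ones below it."""
    for layer in layers:           # ordered highest priority first
        if layer.applicable(percept):
            return layer.act(percept)

layers = [Avoid(), Wander()]
print(subsume(layers, {"obstacle": True}))    # Avoid suppresses Wander
print(subsume(layers, {"obstacle": False}))   # falls through to Wander
```

    The sketch makes the criticism above concrete: there is no overall system coordinating the layers, only local suppression, and the robot merely acts within the environment rather than monitoring or regulating it.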

    The same holds for other related concepts, which already existed in 2006 or only exist since 2006.
    One of these concepts is the Belief-Desire-Intention (BDI) software (agent) model (or system) (1995), which is more related to planning and provides a mechanism for separating the activity of selecting a plan or schema (synonym outline, model, recipe, and instruction) from executing currently active plans. But creating a plan respectively planning in the first place is not within the scope of the model, as is the case with Learning in BDI (2004), which seems to be based on our Evoos as well and was published in detail only later.
    The integration of the BDI software (agent) model (or system) and the field of Multi-Agent System (MAS) (BDI MAS) (1999) was published in the same year, in which we started to discuss and present our Evoos.
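
    The separation described above, selecting a plan from a library of preset recipes versus executing the currently active plan, can be sketched as follows; the plan library, the beliefs, and the step names are hypothetical:

```python
# Minimal BDI-style deliberation: plans are preset recipes in a library; the
# agent only *selects* among them against its beliefs and desires. It never
# *creates* a plan, which is exactly the limitation noted for the 1995 model.
plan_library = {
    "reach-charger": {"precondition": lambda b: b["battery"] < 20,
                      "steps": ["locate-charger", "dock"]},
    "patrol":        {"precondition": lambda b: b["battery"] >= 20,
                      "steps": ["move", "scan"]},
}

def select_plan(beliefs, desires):
    """Plan selection, kept separate from plan execution."""
    for desire in desires:
        plan = plan_library.get(desire)
        if plan and plan["precondition"](beliefs):
            return desire, plan["steps"]
    return None, []

def execute(steps):
    """Execution of the currently active plan."""
    return [f"done:{s}" for s in steps]

beliefs = {"battery": 10}
intention, steps = select_plan(beliefs, ["patrol", "reach-charger"])
print(intention, execute(steps))
```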

    Another one of these concepts is the field of Autonomic Computing (AC) and Autonomy-oriented computation (2001), and the Autonomic System (AS), which were also created with our Evoos, because their foundation is named after, inspired by, and patterned on the Autonomic Nervous System (ANS) of the human body, which controls and regulates important bodily functions or body systems (e.g. respiration, heart rate, and blood pressure) without any conscious input or intervention.
    But both concepts are related to software.

    Interestingly, an autonomous middleware for AC (2006), which is based on the integration of BDI MAS and AC, has a (virtual) execution engine for classical logic, non-classical logic (e.g. Many-Valued Logic (MVL) (e.g. Three-Valued Logic (3VL) and Fuzzy Logic (FL)), Linear Logic (LL), Modal Logic (ML) (e.g. Linear Temporal Logic (LTL)), Computation Tree Logic (CTL), and BDICTL), and cybernetical logic ((circular) logic of (causal action of) feedback). It also supports Business Process Model and Notation (BPMN), semantic and logic models, and ontologies, and also system, goal, plan, and task specifications, and semantic agent communication, which again addresses for example the point d. of the second quoted position paper and many other requirements of various systems.
    But obviously, this was also already created with Evoos, as described in The Proposal, specifically its chapter 5 Zusammenfassung==Summary and the questions Wie sieht die Ontologie des Software-Systems aus?==What does the ontology of the software system look like? and Wie wird Wissen gespeichert und wie erfolgt der Zugriff auf Wissen?==How is knowledge stored and how is knowledge accessed?, and its chapter 8 Lösungsansatz==Solution Approach and the subchapters 8.1 Recherche==Research and 8.3 Wachstum des Betriebssystems==Growth of the Operating System.
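
    A minimal illustration of one of the logics listed above, Three-Valued Logic (3VL): the connectives of Kleene's strong three-valued logic, with None standing for the third value "unknown". This is a sketch of the standard truth tables, not of the quoted middleware's execution engine:

```python
# Kleene's strong three-valued logic: truth values are True, False, and None
# (None standing for "unknown"). Standard truth tables only.
def not3(a):
    return None if a is None else not a

def and3(a, b):
    if a is False or b is False:   # one false conjunct decides the result
        return False
    if a is True and b is True:
        return True
    return None                    # otherwise the result stays unknown

def or3(a, b):
    return not3(and3(not3(a), not3(b)))   # via De Morgan duality

print(and3(True, None))    # stays unknown
print(or3(True, None))     # one true disjunct suffices
print(or3(False, None))    # stays unknown
```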

    The next relevant concept is An evolvable Network of Tiny Sensors-An Evolvable Operating System for Wireless Sensor Networks (ANTS-EOS), which cites the System Support For MultimodAl NeTworks of In-situ Sensors (MANTIS) (2003) and is referenced in the section Exotic Operating System of the webpage Links to Software of the website of OntoLinux.

    Also related is the field of Cognitive Grid Computing (CGC) (2006), which is based on the fields of CAS, Distributed System (DS), and Ontology-Based Agent-Based System (OBABS).
    But these three concepts are related to and even based on our Evoos and were presented only after our Evoos as well, obviously.
    Also note that the discussion about the (eventually not existing) difference between an operating system and a middleware is irrelevant in the context of CPS.

    We quote a second document about programming models and methods for spatio-temporal actions and reasoning in CPSs, which was published in September 2006 and presented at the NSF Workshop On Cyber-Physical Systems in October 2006: "Motivation and Context:
    The proposed Cyber-Physical Systems (CPS) initiative presents a compelling vision of enormous opportunities for societal-scale improvements in the quality of life and our infrastructure for transportation, telecommunications, healthcare etc [...]. Working at the intersection of logical and physical worlds with the advances in processing, communications, localization technologies, these hold an enormous potential as a catalyst for cross-fertilization of ideas from diverse science and engineering disciplines including embedded real-time/distributed systems to structural, biomedical engineering. The marriage of physics and computational sciences is full of promises. [...] And yet, the grander vision of deeply coupled pervasive sensors and ubiquitous computing and communications driving fundamental changes in our social infrastructure remains in futuristic predictions than simply a matter of at-scale implementation and application of known ideas and techniques.
    Indeed, there are fundamental challenges ranging from how computing systems and software are architected, implemented, composed and programmed; and how these systems are validated. In this position statement, I will focus our attention on "location or space" as a theme to point out one of the glaring CPS limitations and fundamental technical problems that must be solved for meaningful advances in the design and deployment of CPS elements to societal infrastructure and at-scale applications. While "location" is well-studied at the application or technology level respectively by the ubiquitous computing (as contextual or meta-information) and engineering (as localization technologies) communities, our focus specifically is on programming support for location and time information, and its use in assertions and validations. We believe that location and time are first class[/excellent] entities[/separate units, quantities, or information objects] that can provide a rich source of new capabilities for CPS applications. Our presentation at the workshop will specifically address requirements, technical challenges and abstractions necessary for successful design of cyber-physical systems. Due to time limitations, we focus on location; it is closely tied to the treatment of time. Clearly, the two quantities are related (and it is this relation that comes readymade with a well-developed calculus from classical physics).

    Problems Addressed:
    Current CPS systems lack the ability [respectively models and methods] to capture spatial information - information related to the location of actions as well as the use of location information in defining actions. While geographical location information can be 'stored' in various forms and at various levels, semantic support to use this information at various levels of the system implementation is severely lacking. [...] When building CPS, this limitation manifests in many ways: from inadequately specified CPS functionalities and their validation to a lack of any guarantees related to availability or unavailability of computational resources as a function of location. Cyber-physical systems are quintessential embedded sensor systems: they react to and manipulate spatiotemporal sensory information. Yet these lack models and methods to capture such information, validate their behavior performance against timing [or temporal] and spatial requirements. More importantly, this presents a fundamental barrier to the scaling and use of cyber-physical systems to societal-scale applications because a whole host of constraints, from energy, power, bandwidth, processing to resource availability, simply rule out the use of all the sensors at all times. Therefore, methods must be devised to 'duty-cycle' sensors and system resources for scalability and operational efficiencies. While information theory guides us when sampling time, there is no such support for determining the "focus of attention" (FOA) of a sensor network to achieve overall system-level goals.
    [...] Most interesting CPS applications require dynamic determination of the FOA. To do this, we need a computational infrastructure [that] can exploit the time-space relationships (based on the physical phenomenon being observed), in other words, under a locally physical control of the computational processes.

    Important Research Challenges:
    To achieve our goal of semantic support for location and time at all levels, we need to address the following technical problems:
    a. How do we capture location (and timing) information into CPS models that allows for validation of the logical properties of a program against the constraints imposed by its physical (sensor) interactions?
    b. What are useful models for capturing faults and disconnections within the coupled physical-computational systems? How can we reason with these models to define the notion of system availability?
    c. What kind of properties that can be verified, and assertions that can be ensured in applications that make use of both physical (real) time as well as location information? Do these propositions require direct algebraic support for location? How best these location and timing aware assertions can be validated?
    d. What programming model is best suited for CPS applications utilizing dynamic FOA behaviors? Are there any specific operating system or 'middleware' services that can ease the task of building such applications, and doing so reliably? [Bingo!!!]
    e. What are the metrics to measure effectiveness of physically-coupled embedded systems? How do we characterize operational efficiency with measures that take into account spatial information?
    [...] Promising projects to address these challenges will span programming, formal methods, distributed and embedded real-time systems, and whole host of disciplines that come together in building sensors and sensor network applications. Among the promising pieces of work are significant advances in programming models for sensor networks (a rich area of investigation with diverse approaches in the choice of language and formalisms); formalism that model hybrid systems and those that seek to capture location information such as Pi and Ambient Calculus, Reactive Mobile Concurrent Processes, and advances in name/directory based discussion of resources in sensor network applications.

    Possible Milestones:

  • [...]
  • New models and methods that capture and reason with physical interactions for time and location for determination of FOA; efficient methods to implement FOA;
  • New fault/failure models for validation of CPS applications;
  • [...]
  • [...]
  • 'Virtualization' technologies for the cyber-physical systems."

    Comment
    At first, we note that mobile devices are utilized as traffic probes and their locations are used for temporally forecasting or predicting the traffic flow as part of an Intelligent Transportation System (ITS) (2001). But this is at the application or technology level in relation to the fields of ubiquitous computing (as contextual or meta-information) and engineering (as localization technologies).

    Indeed, Sensor Networks (SNs), which have the capability of in-network processing utilized for data fusion and aggregation, were known before the presentation of our Evoos, and SNs, which utilize mobile phones as traffic probes that are either

  • equipped with a Global Positioning System (GPS) receiver or
  • tracked through triangulation from nearby cell base station towers,

    were known before the presentation of our OS.

    But calling an SN a CPS requires that it has the capabilities to react to, reason about, and manipulate spatial, temporal, or spatio-temporal sensory information, which leads us back to the start, because with spatial and temporal awareness respectively situational awareness and self-awareness of a brain, and sensoric and motoric primitives this is (at least) our Evoos.

    With an immobot, "[m]ode behaviors are specified using formulas in propositional logic, but transitions between modes are specified using formulas in a restricted temporal, propositional logic". But this is not the kind of semantics, which is discussed in the work quoted above.

    As shown above, our Evoos is a Cognitive Agent System (CAS) and even a Cognitive-Affective Processing System (CAPS), which has functions and functionalities related to

  • attention,
  • Situational Awareness or Situation Awareness (SA),
  • self-awareness,
  • perception of environmental elements and events with respect to space and time,
  • recognition of bodies as physical objects in space and time,
  • classification of elements and events incorporating spatial and temporal attention, and
  • understanding of space and time

    in addition to the functions and functionalities of the fields listed in the quotes before.
    But calling a technology, like for example an Embedded Sensor System (ESS) and a Sensor Network (SN), a CPS requires that it has the capabilities to react to, reason about, and manipulate spatial, temporal, or spatio-temporal sensory information, and that it has or provides semantic support for location and time, which leads us back to the start, because with spatial and temporal awareness respectively situational awareness and self-awareness of a brain, and sensoric and motoric primitives this is (at least) our Evoos.
    See once again the chapters 3 Entwicklung und Funktionsweise eines Gehirns==Development and Functioning of a Brain and 5 Zusammenfassung==Summary, and the related comments above and below.
    See also the comments to the field of Sensor Network (SN).

    Our Evoos also includes virtualization.

    See also the discussion of the Belief-Desire-Intention (BDI) software (agent) model (or system) in relation to CPS in the previous comment above, and also the comment related to the quoted excerpt of a book about embedded systems and CPSs in relation to modelling, virtualization, and other features and functionalities of our Evoos and the summarizing comment after the quoted materials below.
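
    The duty-cycling of sensors and the dynamic determination of the Focus Of Attention (FOA) demanded in the quoted position paper can be illustrated with a minimal sketch: only the sensors within the current FOA region are woken up. The sensor positions, the names, and the circular FOA are illustrative assumptions:

```python
import math

# Hypothetical duty-cycling sketch: only sensors within the current focus of
# attention (FOA) are awake; all others sleep to save energy and bandwidth,
# since using all sensors at all times is ruled out by resource constraints.
sensors = {"s1": (0.0, 0.0), "s2": (3.0, 4.0), "s3": (10.0, 0.0)}

def duty_cycle(sensors, foa_center, foa_radius):
    """Return the set of sensor ids to wake for the current FOA region."""
    cx, cy = foa_center
    return {sid for sid, (x, y) in sensors.items()
            if math.hypot(x - cx, y - cy) <= foa_radius}

# The observed physical phenomenon moves, so the FOA, and with it the set of
# awake sensors, must be determined dynamically.
print(duty_cycle(sensors, (0.0, 0.0), 6.0))   # s1 and s2 awake
print(duty_cycle(sensors, (9.0, 0.0), 2.0))   # only s3 awake
```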

    We quote slides of an online presentation about the modeling and simulation framework Ptolemy, which is related to CPS and the actor-oriented paradigm; the presentation was held at a summit in 2009: "Ptolemy Project Vision [...]

    2 Cyber-Physical Systems (CPS)
    Where it is going
    CPS: Orchestrating networked computational resources with physical systems.

    3 CPS is Multidisciplinary
    Computer Science: Carefully abstracts the physical world
    System Theory: Deals directly with physical quantities
    Cyber Physical Systems: Computational + Physical

    4 Ptolemy Project Research
    Foundations: Timed computational semantics.
    Bottom up: Embedded processors (PRET).
    Top down: Distributed real-time systems (PTIDES).
    Holistic: Scalable model-based design.

    5 Object Oriented vs. Actor Oriented Software Components
    [...]

    6 Timed Software Semantics
    A unique least fixed point, tuples of signals s element of S^N such that F(s)=s, exists and can be constructively found if S^N is [Complete Partial Order (]CPO[)] and F is (Scott) continuous.
    Causal systems operating on signals are usually naturally (Scott) continuous.

    7 Results
    Software: Ptolemy II realizes a number of timed concurrent models of computation (MoCs) with well-founded rigorous semantics.
    [...]

    [...]

    13 PTIDES: Programming Temporally Integrated Distributed Embedded Systems
    Distributed execution under DE semantics, with "model time" and "real time" bound at sensors and actuators.
    [...]

    [...]

    16 Hierarchical Multimodeling
    Hierarchical compositions of models of computation. Maintaining temporal semantics across MoCs is a main challenge.

    17 Multi-View Modeling:
    Distinct and separate models of the same system are constructed to model different aspects of the system.
    Functional model in Statecharts
    Functional model in Ptolemy II
    Deployment model in Ptolemy II
    Verification model in SMV
    Reliability model in Excel

    [...]

    19 Addressing the Design Challenges for Cyber Physical Systems
    Foundations: Timed computational semantics. Abstract semantics on super-dense time
    Bottom up: Make timing repeatable. Precision-timed (PRET) machines
    Top down: Timed, concurrent components. Distributed real-time discrete-events (PTIDES)
    Holistic: Model engineering. Multimodeling, ontologies, property system, ..." [Bingo!!!]

    Comment
    See the OntoBot software component, specifically Smodels for stable model semantics for logic programs, the section Pure Rationality, and note that "[t]he well-founded semantics can be viewed as a three-valued version of the stable model semantics".
    Also note that the actor-oriented paradigm is a foundation of the reflective and distributed operating systems TUNES OS and Apertos (Muse), and any actor network can be treated as a feedback system.
    A brain, Central Nervous System (CNS), and Autonomic Nervous System (ANS) work in real time. Our OS is also for the fields of Embedded System (ES), Networked Embedded System (NES), Distributed Embedded System (DES), Autonomous System (AS), and Robotic System (RS). If we say operating systems, then we mean all operating systems known at that time, including Real-Time operating systems (RToss), RT-Linux, and so on.
    We would also like to give the reminder that the TUNES Project, already referenced in The Proposal describing Evoos, is based on the actor-oriented paradigm and that the around 45 operating systems examined in the course of the research for Evoos for sure include all reflective operating systems known at that time, including the reflective, object-oriented, actor-based (concurrent), (resilient) (survivable) fault-tolerant, and distributed operating system TUNES OS and the reflective, object-oriented, actor-based (concurrent), (resilient) (survivable) fault-tolerant and (trustworthy) reliable, and distributed operating system Aperion (Apertos (Muse)) and Cognac based on Apertos.
    So, we already have another teaching of timed computational semantics, which is the first of the four relevant points of the project vision of the modeling and simulation framework Ptolemy II in 2009. About (real-time) operating systems, which is the third relevant point, and holistic modeling, multimodeling, and ontologies, which is the fourth relevant point, we do not need to talk anymore.
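
    Slide 6 of the quoted presentation states that a unique least fixed point exists and can be constructively found if the domain is a CPO and F is (Scott) continuous. On a finite powerset this construction is just Kleene iteration from the bottom element; the tiny dependency graph and the function F below are illustrative assumptions, not taken from Ptolemy II:

```python
# Kleene iteration: for a monotone function F on a finite powerset (a CPO
# ordered by set inclusion), the least fixed point is reached by iterating
# F from the bottom element (the empty set) until the value stabilizes.
edges = {"a": {"b"}, "b": {"c"}, "c": set()}

def F(s):
    """Monotone: the seed plus everything reachable in one more step."""
    return {"a"} | {m for n in s for m in edges[n]}

def least_fixed_point(F):
    s = set()                 # bottom of the CPO
    while True:
        t = F(s)
        if t == s:            # F(s) = s: a fixed point is reached
            return s
        s = t

print(least_fixed_point(F))   # the least s with F(s) = s
```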

    We quote a third document about the field of CPS and the Impact of Control Technology, which was published in 2011 at the Institute of Electrical and Electronics Engineers (IEEE) Control Systems Society (CSS): "The term cyber-physical systems (CPS) refers to a new generation of systems with integrated computational and physical capabilities that can interact with humans through many new modalities. [Bingo!!!]
    [...]

    Need for CPS Research
    CPS research is still in its infancy. Professional and institutional barrier[s] have resulted in narrowly defined, discipline-specific research and education venues in academia for the science and engineering disciplines. Research is partitioned into isolated subdisciplines such as sensors, communications and networking, control theory, mathematics, software engineering, and computer science.
    [...]

    Abstraction and Architectures
    [...]

    Distributed Computations and Networked Control
    [...]

    Verification and Validation
    [...]

    Challenges and Opportunities: Industry-Academia
    [...]

    Biomedical and Healthcare Systems
    [...]

    Next-Generation Air Transportation Systems (NextGen)
    [...]

    Smart Grid and Renewable Energy
    [...]

    Conclusions
    Cyber-physical systems are expected to play a major role in the design and development of future engineering systems with new capabilities that far exceed today's levels of autonomy, functionality, usability, reliability, and cyber security.
    [...]

    Selected recommendations for research in cyber-physical systems:

  • Standardized abstractions and architectures that permit modular design and development of cyber-physical systems are urgently needed.
  • CPS applications involve components that interact through a complex, coupled physical environment. Reliability and security pose particular challenges in this context - new frameworks, algorithms, and tools are required.
  • Future cyber-physical systems will require hardware and software components that are highly dependable, reconfigurable, and in many applications, certifiable ... and trustworthiness must also extend to the system level."

    Comment
    See the other comments and the summarizing comment after the quoted materials below.

    We also quote an excerpt of a book about embedded systems and Cyber-Physical Systems (CPSs), which was published in 2013: "A cyber-physical system (CPS) is an integration of computation with physical processes. Embedded computers and networks monitor and control the physical processes, usually with feedback loops where physical processes affect computations and vice versa.
    As an intellectual challenge, CPS is about the intersection, not the union, of the physical and the cyber. It is not sufficient to separately understand the physical components and the computational components. We must instead understand their interaction."

    Comment
    We looked at this book in more detail, or better said, at its reference list, which is related to the activities of the U.S.American National Institute of Standards and Technology (NIST) and the modeling and simulation framework Ptolemy (see the quote above), only to get the confirmation that it is a plagiarism, indeed, as usual, and as was to be expected.
    The list includes 190 references, which are about the topics

  • Opportunities and Obligations for Physical Computing Systems 2005
  • Security-aware mapping for CAN-based real-time distributed automotive systems 2013
  • Operational Semantics of Hybrid Systems 2005
  • The Design of the Spring Kernel 1987
  • A structural approach to operational semantics 2004
  • [Complete Partial Order (]CPO[)] semantics of timed interactive actor networks 2008
  • The Semantics of a Simple Language for Parallel Programming 1974
  • Software Verification with BLAST 2003
  • The software model checker Blast 2007
  • On Computable Numbers, with an Application to the Entscheidungsproblem 1937 Turing
  • Symbolic Model Checking without BDDs 1999
  • Über das Relativitätsprinzip und die aus demselben gezogenen Folgerungen 1990
  • Modal Models in Ptolemy 2010
  • Graph-Based Algorithms for Boolean Function Manipulation 2012
  • The SPIN Model Checker : Primer and Reference Manual 2004
  • Theory of Modeling and Simulation 2000
  • Theory of Modeling and Simulation 1976
  • Cybernetics, or Control and Communication in the Animal and the Machine 1963

    and also about the fields of

  • systems architecture,
  • modal logics,
  • Petri net,
  • concurrent system,
  • real-time system,
  • verification,
  • cryptography,
  • visual Computer-Aided Software Engineering (CASE),
  • and so on.

    This already provides us with sufficient evidence to show the causal link to our original and unique, copyrighted works of art titled Ontologic System and Evolutionary operating system, and to prove our claim that they have been taken as blueprints without allowance and referencing.

    Also very interesting are the facts that the plagiarists

  • referenced only 3 documents, which include the term Cyber-Physical in their titles, but all of them were published in 2009, 2010, and 2015,
  • used the terms Physical Computing System and Hybrid System in 2005, but not the term Cyber-Physical System [Gotcha!!! Physical Computing: Sensing and Controlling the Physical World with Computers. (2004)],
  • tried to confuse the public with the way the book is structured and the references were selected,
  • tried to continue on the basis of the frauds of the 2 position papers quoted in the Clarification of the 18th of July 2021, specifically timed computational semantics, though they are also convicted as plagiarisms and the result of espionage, and
  • took the image Evidence also shown on the webpage Caliber - Revolutionary Thing of the website of OntoLinux as blueprint for the design of the cover page,

    which proves our claim that the designation Cyber-Physical System (CPS) was created for stealing our original and unique ArtWorks (AWs) and further Intellectual Properties (IPs) by going over the fields of Model-Based Autonomous System (MBAS) or Immobile Robotic System (ImRS or Immobot), Sensor Network (SN), and semantics for temporal data.
    In addition, all works published after October 2006 show that it is common sense that CPS is included in our Ontologic System (OS) respectively that we also created CPS with our Evoos and OS, which are based on

  • cybernetics,
  • feedback (system) (actor (network)),
  • self-adaptive,
  • self-regulating,
  • self-organizing,
  • management (planning, controlling or orchestrating, executing, controlling or monitoring, closed-loop or -circuit, or circular Total Quality Management (TQM)),
  • multimodal,
  • semantics (well-structured and -formed, and well-founded, actor and least fixed point semantics F(s)=s),
  • model-based (e.g. ontology, digital twin),
  • Object-Oriented (OO 1),
  • Ontology-Oriented (OO 2),
  • actor-oriented (concurrent constraint logic),
  • Agent-Oriented (AO),
  • reflective (self-adaptive, self-awareness, actor and least fixed point semantics, digital twin),
  • logics (deductive, inductive, abductive, reactive, active, predictive, reasoning, order-sorted, etc.),
  • intelligent (reasoning, cognitive, self-awareness),
  • simulation,
  • virtualized,
  • resilient (fault-tolerant),
  • reliable,
  • real-time,
  • networked,
  • distributed,
  • etc.
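
    The least fixed point semantics F(s)=s listed above can be illustrated with Kleene iteration: a monotone function is applied repeatedly, starting from a bottom element, until F(s) = s holds. The following Python sketch is purely illustrative (the graph, the function names, and the example are our assumptions, not taken from the quoted works):

```python
# Kleene iteration: compute the least fixed point of a monotone
# function f over the powerset lattice, starting from the bottom
# element and iterating until f(s) = s holds.
def least_fixed_point(f, bottom=frozenset()):
    current = bottom
    while True:
        nxt = f(current)
        if nxt == current:  # the fixed point F(s) = s is reached
            return current
        current = nxt

# Illustrative example: reachability in a small directed graph.
edges = {1: {2}, 2: {3}, 3: set(), 4: {1}}

def step(s):
    # f(s) = {1} united with the successors of s; monotone,
    # so a least fixed point exists by the Knaster-Tarski theorem.
    out = {1}
    for node in s:
        out |= edges[node]
    return frozenset(out)

print(sorted(least_fixed_point(step)))  # [1, 2, 3]
```

    The iteration stabilizes at the set of nodes reachable from node 1, which is exactly the least solution of the equation F(s) = s for this function.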

    We would also like to give the reminder that The Proposal, which describes our Evoos, already references the TUNES Project, and that the around 45 operating systems examined in the course of the research for Evoos include all reflective operating systems known at that time, for sure, such as the

  • reflective, object-oriented, actor-based, fault-tolerant, reliable, and distributed operating system Apertos (Muse), which is also the basis for the Cognac system based on Apertos and the Aperios (Apertos) Real-Time operating system (RTos) used for at least one Robotic System (RS) (e.g. robotic dog, and companion, pal, or partner Sony Artificial Intelligence roBOt (AIBO)), and
  • model-level reflective, object-oriented, migratable actor-based, logic-based (untyped lambda calculus (e.g. Lisp), modal logic (e.g. Arrow system), order-sorted equational logic and rewriting logic (e.g. Maude)), self-managing, orthogonally persistent, cybernetic, fault-tolerant, security-based (formal proof from axioms), reliable, and distributed operating system TUNES OS.

    (Note the similar basic properties of Apertos (Muse) and also TUNES OS and that the listed basic properties of Apertos (Muse) are all included in TUNES OS.)

    In fact, essential for CPS is the feedback system, which is realized respectively modelled and implemented as an actor-oriented system or network on the basis that "[a]ny actor network can be treated as a feedback system". This again equals {properties or capabilities of immobot} the integration of Apertos and TUNES OS, which leads us back to the start, because together with the spatial and temporal awareness respectively self-awareness and situational awareness of a brain, and sensoric and motoric primitives, this is (at least) our Evoos.
    At this point, we already have another teaching of temporal semantics and timed computational semantics in our Evoos besides its capabilities of a CAPS, which is the first of the four relevant points of the project vision of the modelling and simulation framework Ptolemy II in 2009. About (real-time) operating system, which is the third relevant point, and holistic modeling, multimodeling, and ontologies, which is the fourth relevant point, we do not need to talk anymore.
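
    The quoted statement that any actor network can be treated as a feedback system can be sketched minimally as a single actor whose output arc is wired back to its own input arc. The following Python sketch is a hypothetical illustration (the actor name and firing rule are assumptions):

```python
# Minimal actor-network sketch: a single integrator actor whose
# output arc is wired back to its own input arc, forming a
# feedback loop (hypothetical names and firing rule).
class IntegratorActor:
    def __init__(self):
        self.state = 0

    def fire(self, token):
        # Consume one input token, update the state, emit one token.
        self.state += token
        return self.state

actor = IntegratorActor()
token = 1  # initial token on the feedback arc
trace = []
for _ in range(4):
    token = actor.fire(token)  # the output becomes the next input
    trace.append(token)

print(trace)  # [1, 2, 4, 8]
```

    Each fired token depends on all previously emitted tokens, which is the defining property of a feedback system.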

    Also note that Apertos (Muse) and TUNES OS are also referenced in the section Exotic Operating System of the webpage Links to Software of the website of OntoLinux.

    We also quote the description of a concept map of CPS, which was created by the U.S.American National Institute of Standards and Technology (NIST) and one of the authors of the quoted book and is related to Ptolemy: "Cyber-Physical Systems (CPS) are integrations of computation, networking, and physical processes. Embedded computers and networks monitor and control the physical processes, with feedback loops where physical processes affect computations and vice versa. The economic and societal potential of such systems is vastly greater than what has been realized, and major investments are being made worldwide to develop the technology. The technology builds on the older (but still very young) discipline of embedded systems, computers and software embedded in devices whose principle mission is not computation, such as cars, toys, medical devices, and scientific instruments. CPS integrates the dynamics of the physical processes with those of the software and networking, providing abstractions and modeling, design, and analysis techniques for the integrated whole.
    As a discipline, CPS is an engineering discipline, focused on technology, with a strong foundation in mathematical abstractions. The key technical challenge is to conjoin abstractions that have evolved over centuries for modeling physical processes (differential equations, stochastic processes, etc.) with abstractions that have evolved over decades in computer science (algorithms and programs, which provide a "procedural epistemology" [Abelson and Sussman[. Structure and Interpretation of Computer Programs (SICP). 1985 [(programming language LISP→Scheme)]]]). The former abstractions focus on dynamics (evolution of system state over time), whereas the latter focus on processes of transforming data. Computer science, as rooted in the Turing-Church notion of computability, abstracts away core physical properties, particularly the passage of time, that are required to include the dynamics of the physical world in the domain of discourse."

    Comment
    The facts are that the system properties shown in the concept map of CPS are virtually all included in the basic properties of our OS, and that the description is about The Proposal, which describes our Evoos, and about the Caliber/Calibre of our OS.
    See also the note about the TUNES Project and the reflective operating systems Apertos (Muse) and TUNES OS in the related comments above.

    We also quote the abstract of a fourth document about CPS and also Industry 4.0, which is classified in the subjects logic in computer science, databases, and information theory, and was published in 2018: "Cyber-Physical Systems (CPS) are systems composed by a physical component that is controlled or monitored by a cyber-component, a computer-based algorithm. Advances in CPS technologies and science are enabling capability, adaptability, scalability, resiliency, safety, security, and usability that will far exceed the simple embedded systems of today. CPS technologies are transforming the way people interact with engineered systems. New smart CPS are driving innovation in various sectors such as agriculture, energy, transportation, healthcare, and manufacturing. They are leading the 4-th Industrial Revolution (Industry 4.0) that is having benefits thanks to the high flexibility of production.
    The Industry 4.0 production paradigm is characterized by high intercommunicating properties of its production elements in all the manufacturing processes. This is the reason it is a core concept how the systems should be structurally optimized to have the adequate level of redundancy to be satisfactorily resilient. This goal can benefit from formal methods well known in various scientific domains such as artificial intelligence. So, the current research concerns the proposal of a CPS meta-model and its instantiation. In this way it lists all kind of relationships that may occur between the CPSs themselves and between their (cyber- and physical-) components. Using the CPS meta-model formalization, with an adaptation of the Formal Concept Analysis (FCA) formal approach, this paper presents a way to optimize the modelling of CPS systems emphasizing their redundancy and their resiliency."

    Comment
    Obviously, that document is also about our OS and its basic properties and Ontologic System Architecture (OSA), which integrates all in one.
    There is no doubt that the field of Industry 4.0 is part of our field of Industry 5.0, which was also created with our OS.
    See also the comment to the quote about the concept map and the comment to the quote of the first document about the field of CPS above, and the summarizing comment after the quoted materials below.

    We also quote an online encyclopedia about the subject Cyber-Physical System (CPS): "A cyber-physical system (CPS) or intelligent system is a computer system in which a mechanism is controlled or monitored by computer-based algorithms. In cyber-physical systems, physical and software components are deeply intertwined, able to operate on different spatial and temporal scales, exhibit multiple and distinct behavioral modalities, and interact with each other in ways that change with context.[1 [US National Science Foundation [(NSF)], Cyber-Physical Systems (CPS)[. 2010]]] CPS involves transdisciplinary approaches, merging theory of cybernetics, mechatronics, design and process science.[2][3 [Applied Cyber-Physical Systems. [...] 2014]] The process control is often referred to as embedded systems. In embedded systems, the emphasis tends to be more on the computational elements, and less on an intense link between the computational and physical elements. CPS is also similar to the Internet of Things (IoT), sharing the same basic architecture; nevertheless, CPS presents a higher combination and coordination between physical and computational elements.[4 [Smart Monitoring of Potato Crop: A Cyber-Physical System Architecture Model in the Field of Precision Agriculture. [2015]]]
    Examples of CPS include smart grid, autonomous automobile systems, medical monitoring, industrial control systems, robotics systems, and automatic pilot avionics.[5 [Design Techniques and Applications of Cyber Physical Systems: A Survey[. 2014]]] Precursors of cyber-physical systems can be found in areas as diverse as aerospace, automotive, chemical processes, civil infrastructure, energy, healthcare, manufacturing, transportation, entertainment, and consumer appliances.[5]

    Overview
    Unlike more traditional embedded systems, a full-fledged CPS is typically designed as a network of interacting elements with physical input and output instead of as standalone devices. The notion is closely tied to concepts of robotics and sensor networks with intelligence mechanisms proper of computational intelligence [or soft computing] leading the pathway. Ongoing advances in science and engineering improve the link between computational and physical elements by means of intelligent mechanisms, increasing the adaptability, autonomy, efficiency, functionality, reliability, safety, and usability of cyber-physical systems.[6 [Intelligence for Embedded Systems. [2014]]] This will broaden the potential of cyber-physical systems in several directions [...].[7 [Cyber-physical systems. Program Announcements & Information. The National Science Foundation. [30th of September 2008]]]

    [...]

    Importance
    The US National Science Foundation (NSF) has identified cyber-physical systems as a key area of research.[25 [The Good News and the Bad News (Embedded Computing Column). [2007]]] Starting in late 2006, the NSF and other United States federal agencies sponsored several workshops on cyber-physical systems [(line breaks added for better readability)].
    [26 [NSF Workshop On Cyber-Physical Systems[. 16th - 17th October 2006]]]
    [27 [Beyond [Supervisory Control And Data Acquisition (]SCADA[) and Distributed Control System (DCS)]: Networked Embedded Control for Cyber Physical Systems[. 8th - 9th of November 2006]]]
    [28 [NSF Cyber-Physical Systems Summit[. 24th - 25th of April 2008]]]
    [29 [National Workshop on High-Confidence Automotive Cyber-Physical Systems[. 3rd - 4th of April 2008]]]
    [30 [National Workshop on Composable and Systems Technologies for High-Confidence Cyber-Physical Systems[. 9th - 10th of July 2007]]]
    [31 [National Workshop on High-Confidence Software Platforms for Cyber-Physical Systems (HCSP-CPS)[. 30th of November - 1st of December 2006]]]
    [32 [New Research Directions for Future Cyber-Physical Energy Systems[. 3rd - 4th of June 2009]]]
    [33 [Bridging the Cyber, Physical, and Social Worlds[. 27th - 28th of May 2009]]]
    [34 [NIST Foundations for Innovation in Cyber-Physical Systems Workshop[. 13th - 14th of March 2012]]]"]

    Comment
    First of all, we would like to note that "[p]osition papers in academia enable discussion on emerging topics without the experimentation and original research normally present in an academic paper."
    In this case, the emerging topic was claimed to be the field of CPS, though our Evolutionary operating system (Evoos) had already existed for nearly 7 years at that time in September 2006.

    Furthermore, we have to note that the authors of these documents, but also many other entities in the field of CPS and related fields, took information about our activities

  • gained on quite dubious ways until October 2006 and
  • found on the website of our Ontologic Systems (OSs) OntoLix and OntoLinux and our other websites since the end of October 2006

    as a blueprint for their own works without allowance and referencing, as seems to be usual in the sciences.

    As already noted in the beginning, when investigating this field, we come to the same conclusion again and again, that the field of Cyber-Physical System (CPS) is a part of our original and unique

  • Evoos and
  • OS.

    In this regard, we at first designated the

  • current and speculative next generations of CPS presented until the end of October 2006 as the Cyber-Physical System of the first generation (CPS 1.0) and
  • part of our OS related to CPS and presented at the end of October 2006 as the Cyber-Physical System of the second generation (CPS 2.0)

    to differentiate CPS 2.0 from the

  • precursors of CPS in the fields of
    • Embedded System (ES)
      • Network(ed) Embedded System (NES)
        • Sensor Network(ing) (SN or SenN) system
          • Wireless Sensor Network (WSN) system,
    • Industrial Control System (ICS)
      • Programmable Logic Controller (PLC),
      • Distributed Control System (DCS), and
      • Supervisory Control And Data Acquisition (SCADA) system,
    • Autonomous System (AS),
    • Robotic System (RS),
    • Physical Computing System (PCS or PhyCS), and
    • Ubiquitous Computing System (UCS or UbiCS) or Pervasive Computing System (PCS or PerCS)
      • Internet of Things (IoT),

    and

  • CPS 1.0.

    Around September 2006, a very few position papers and workshops presented, thematized, discussed, and speculated about something new and future under the designation CPS, as can be seen once again with the following quotes from the first position paper:

  • "[...] technical challenges that need to be addressed [...]"
  • "[...] make some considerations regarding the requirements of cyber-physical systems [...]"
  • "Cyber-physical systems (CPS) will soon redefine [...] we will be able to [...]"
  • "[...] next generation CPS [...]"
  • "[...] will speculate about the necessary abstractions for locally physical but globally virtual CPS."
  • "Current research on CPS emphasizes the use of networked embedded systems as distributed sensing and data gathering devices. The natural next step is to consider actuation, thus moving from a passive framework, where information is extracted from the physical world, to an active framework, where information is sensed, processed and used within the network."
  • "[...] proposed Cyber-Physical Systems (CPS) initiative presents a compelling vision [...]"
  • "And yet, the grander vision of deeply [or tightly] coupled pervasive sensors and ubiquitous computing and communications driving fundamental changes in our social infrastructure remains in futuristic predictions [...]."
  • "[...] glaring CPS limitations and fundamental technical problems that must be solved [...]"
  • "Due to time limitations [...]"

    In addition, the focus and object of interest became the

  • transition from a passive framework to an active framework,
  • self-organization,
  • autonomic computing,
  • distributed, in-network computing,
  • higher combination, communication, and coordination,
  • machine simulation,
  • virtualization,
  • formal analysis and modeling,
  • semantics and semantic support for sensors and actuators, and the related signals and data, as well as ontology,
  • cybernetics,
  • feedback,
  • actor-oriented,
  • intense link between computational and physical elements by means of intelligent mechanisms,
  • intelligent mechanism controlled or monitored by a cyber-component respectively computer-based algorithm, and
  • overall intelligent system.

    But obviously, doubtlessly, and definitely, a closer look shows that these allegedly new and future features and functionalities of CPS respectively the foundations of the next generation of CPS, as described, discussed, and speculated about in the first and second position papers quoted above, are in fact technologies, techniques, features, and functionalities, which are also included in and were created and presented with our Evoos, which again is

  • based on
    • timed systems (e.g. heart and pulse, processor clock generator, and operating system (os)),
    • control and operating systems (e.g. brain, and Basic Input/Output System (BIOS) and operating system (os)),
    • networking systems (e.g. brain, Central Nervous System (CNS), and Autonomic Nervous System (ANS), and computer network, and also Software-Defined Networking (SDN)),
    • virtualization (e.g. mind, and Virtual Machine (VM) and virtual logical machine or execution engine, Virtual Network Function (VNF), and Network Function Virtualization (NFV)),
    • self-regulating systems (e.g. cybernetics and feedback),
    • self-organizing systems (e.g. Artificial Neural Network (ANN) and brain),
    • self-managing systems (e.g. cognitive system and brain),
    • self-adaptive and reflective systems (e.g. cognitive system and operating system, and also SDN),
    • adaptive or learning systems (e.g. cognitive system and brain, ANN, Machine Learning (ML) system, Cognitive Agent System (CAS), and (reflective) operating system),
    • Soft Computing (SC or SoftC) and Computational Intelligence (CI),
    • spatio-temporal actions and reasoning (e.g. cognitive system and brain, non-classical logics (e.g. spatial and temporal), and circular logics of causal action or cybernetic logic of feedback, and CAS),
    • dynamic determination of the Focus of Attention (FoA) (e.g. cognitive system and brain, and CAS),
    • in-network computation (e.g. body with ANS (including brain)),
    • sensors and actuators (e.g. sensoric and motoric primitives, and camera and microphone, network card, display, and also loudspeaker and printer),
    • physical and biological substratum (e.g. body and ANS (including brain), BIOS chip, and Central Processing Unit (CPU), and also other hardware),
    • cyber-physical substratum (e.g. virtual cell, ANN, ANS, and the other points listed above),
    • link between computational and physical elements by means of intelligent mechanisms (related points listed above), and
    • intelligent mechanism controlled or monitored by a cyber-component respectively computer-based algorithm (related points listed above),
  • described in The Proposal, specifically in its chapters

    and

  • shown in the previous comments, specifically in relation to
    • attention,
    • self-awareness or knowledge about all of the own components,
    • situational awareness, and
    • semantic support for sensors and actuators, and the related signals and data.

    This coincidence is exact to such a high degree and extent that a collision of equal and independent interests must be ruled out and therefore is excluded.
    As a further conclusion we also found out that our first classification was not quite right, because Evoos already includes the foundation of CPS 1.0.

    Moreover, our Evoos also integrates features and functionalities, which

  • are
    • passive and
    • active,
  • work with
    • unconscious intervention and
    • conscious intervention,
  • are able to
    • create,
    • plan, and
    • learn,

    and

  • belong to the fields of
    • physical science,
    • biological science, and
    • computational science,
    • software and
    • hardware,
    • Weak Artificial Intelligence (WAI) and
    • Strong Artificial Intelligence (SAI) or symbolic, logical, semantic AI,
    • cybernetics,
    • bionics,
    • Autonomous System (AS or ASys),
    • Robotic System (RS),
    • Physical Computing System (PCS or PhyCS),
    • and so on.

    What is more, our Evoos also includes the foundations of, for example, the fields of

  • Software-Defined Networking (SDN),
  • microService-Oriented Architecture (mSOA),
  • Autonomic Computing (AC), Autonomic System (AS or AutoS) or Autonomic Computing System (ACS), Autonomic Personal Computing (APC), and Autonomic Computing operating system (ACos) (e.g. Artificial Neural Network (ANN), and also brain, Central Nervous System (CNS), and Autonomic Nervous System (ANS)),
  • semantic software agent model or system, Ontology-Based Agent-Based System (OBABS),
  • Semantic Grid Computing (SGC),
  • Cognitive Grid Computing (CGC),
  • and so on.

    Another aspect is Service-Oriented technologies (SOx), including microService-Oriented Architecture (mSOA), virtualization and containerization, Software-Defined Networking (SDN), Network Function Virtualization (NFV), and Virtualized Network Function (VNF), Cloud-native Network Function (CNF), Space-native Network Function (SNF), and Software-Defined Infrastructure (SDI), and Grid, Cloud, Edge, and Fog Computing (GCEFC), Space and Time Computing and Networking (STCN), and Ontologic Computing (OC).

    Another aspect are the fields of Business Intelligence (BI), Visualization, and Analytics (BIVA), and Data Science and Analytics (DSA), including Big Data Fusion (BDF) and Big Data Processing (BDP).

    These basic properties of the Evolutionary operating system Architecture (EosA) were made more explicit

  • in general with its successor, which is our OS with its integrating Ontologic System Architecture (OSA), which again is a special kernel-less reflective, fractal, holonic abstraction of a layered system architecture and integrates all in one, and
  • in particular with the definition of the lower and upper ends of the spectrum of the field of SoftBionics (SB) from null (no representation, Zero Ontology or Null Ontology, and so on) to everything (see also the webpage Philosophy of BEAM Robotics and the section Softbionics and Artificial Intelligence 3 of the webpage 21st Century Terms of the website of OntoLinux),

    which is important to note insofar as it provides more aspects for the concretization, definition, and delimitation of the field of CPS.

    Furthermore, the other new and future features and functionalities of CPS respectively the rest of the foundations of the next generation of CPS were created with our OS, added to Evoos and thus to CPS, and presented officially at the end of October 2006, specifically its

  • OSA with homogeneous, heterogeneous, synchronous, and also asynchronous modules,
  • integration of the features and functionalities of Evoos, and
  • integration of other already existing features and functionalities of the fields of
    • SoftBionics (SB) (e.g. Artificial Intelligence (AI), Machine Learning (ML), Computational Intelligence (CI), Artificial Neural Network (ANN), Evolutionary Computing (EC), Computer Vision (CV), Simultaneous Localization And Mapping (SLAM), Soft Computing (SC), Autonomic Computing (AC), Natural Language Processing (NLP), Cognitive Computing (CogC), Cognitive Agent System (CAS), Cognitive-Affective Personality or Processing System (CAPS), Swarm Intelligence (SI) or Swarm Computing (SC), etc.),
    • Resilient Distributed System (RDS) respectively Challenge-Tolerant and Trustworthy Distributed System (CTTDS), including
      • Fault-Tolerant, Reliable, and Trustworthy Distributed System (FTRTDS),
    • Computer-Aided technologies (CAx), including
      • Computer-Aided Software Engineering (CASE) for the
        • formal analysis,
        • formal modeling,
        • formal verification, and
        • formal validation,
  • and much more

    for the modular design, development, and implementation of building blocks and homogeneous, heterogeneous, synchronous, and also asynchronous modules for all

  • Ontologic System Components (OSC) of these features and functionalities,
  • abstract OSA layers of the fields of
    • operating system,
    • database management system,
    • middleware,
    • application,
    • service,
    • etc.,

    and

  • systems of the fields of
    • Distributed System (DS)
      • Distributed Real-Time Embedded System (DRES),
    • CPS,
    • etc.,

    that enable and provide

  • high-performance,
  • high-capacity,
  • high-redundancy,
  • high-confidence,
  • and so much more.

    Interesting in this context are the cybernetic, holologic, and holonic, as well as reflective properties of our Evoos and OS.
    The fields of cybernetics, holonic systems, and reflection deal with the aspects of both worlds, the real and physical world and the virtual and metaphysical world, but have different focuses.

  • The field of cybernetics has as foundational principle or core concept the
    • circular causality or feedback ((control) loop) respectively
    • circular logics of causal action in the core concept of feedback,

    which in the case of CPS would mean feedback across the border between these realities: the integration of computation, computing, or computational processes, and networking processes, and physical processes, and the dynamics of both, (usually) with feedback loops.

  • A holon is simultaneously a
    • whole in and of itself, as well as
    • part of a larger whole.

    The field of Holonic System (HS or HoS) is also connected to the fields of

    • philosophical psychology (see for example The Ghost in the Machine (1967)),
    • Self-Organizing Holarchic Open System (SOHOS),
    • ecosystem and physics (see for example Ecosystem as Self-Organizing Holarchic Open Systems: Narratives and the Second Law of Thermodynamics (2000)), and
    • Multi-Agent System (MAS).
  • The field of reflection, with or without a learning process, allows the supporting software algorithms of a reflective system to be changed or reprogrammed, including its own configuring software, and not only to self-configure machines.
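
    The reflective change or reprogramming of a system's own supporting software algorithms can be sketched minimally as a program that replaces one of its own methods at run time. The following Python sketch is a hypothetical illustration (all names are assumptions):

```python
# Minimal reflection sketch: a system inspects and reprograms one
# of its own methods at run time (all names are illustrative).
class ReflectiveSystem:
    def behave(self):
        return "default behaviour"

    def reconfigure(self):
        # The running system replaces its own 'behave' algorithm,
        # i.e. it changes its own supporting software.
        def adapted(self):
            return "adapted behaviour"
        type(self).behave = adapted

system = ReflectiveSystem()
print(system.behave())  # default behaviour
system.reconfigure()
print(system.behave())  # adapted behaviour
```

    The point of the sketch is only that the change is initiated by the system itself, not by an external reconfiguration of the machine.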

    When these 3 fields are viewed together with semantics, planning, modelling, digital twin or cyber-twin consisting of a virtual or cyber(spatial) machine of a physical machine, and simulation, and also reflection, adaptation, self-awareness, SoftBionics (SB), and so on, then CPS can be understood much more easily.
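
    The circular causality or feedback loop across the border between the physical and the cyber realities can be sketched as a closed sense-compute-actuate loop. The following Python sketch is purely illustrative, assuming a trivial first-order plant and a proportional controller (all values and names are assumptions):

```python
# Circular causality across the cyber-physical border: a cyber
# controller senses a physical state, computes, and actuates, and
# the physical process feeds back into the next computation.
# All values are illustrative.
setpoint = 20.0     # desired temperature
temperature = 5.0   # physical state
gain = 0.5          # proportional controller gain

for _ in range(50):
    error = setpoint - temperature  # sense: physical -> cyber
    actuation = gain * error        # compute: cyber
    temperature += 0.2 * actuation  # actuate: cyber -> physical

print(round(temperature, 2))  # converges towards 20.0
```

    Each iteration crosses the border twice, once from the physical to the cyber side (sensing) and once back (actuation), which is the feedback loop in its most reduced form.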

    As explained above, among the foundations of CPS are semantics and computational semantics, specifically

  • spatial and temporal semantics, and
  • located and timed computational semantics.
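
    Timed computational semantics can be sketched in the discrete-event style, in which every event carries a timestamp and computation proceeds in global time order. The following Python sketch is purely illustrative (event names and times are assumptions):

```python
import heapq

# Minimal timed-semantics sketch in the discrete-event style:
# every event carries a timestamp, and the run processes events
# in global time order (event names and times are illustrative).
events = []  # priority queue ordered by timestamp

def schedule(time, name):
    heapq.heappush(events, (time, name))

schedule(2.0, "actuate")
schedule(0.5, "sense")
schedule(1.0, "compute")

log = []
while events:
    time, name = heapq.heappop(events)  # earliest timestamp first
    log.append((time, name))

print(log)  # [(0.5, 'sense'), (1.0, 'compute'), (2.0, 'actuate')]
```

    The essential point is that the order of execution is determined by the timestamps, not by the order of scheduling, which is what distinguishes a timed semantics from an untimed one.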

    But the whole field of Cyber-Physical System (CPS) is the integration of the fields of

  • cybernetics,
  • bionics, and
  • Ubiquitous Computing (UbiC),

    which was realized with the integration of the reflective, actor-oriented, fault-tolerant, and distributed operating systems

  • Aperios (Apertos (Muse)) (rudimentary Artificial Intelligence (AI), Autonomous System (AS), and Robotic System (RS) (e.g. AIBO)) and
  • TUNES OS (cybernetic, logic, and automatic system; "TUNES does not pretend or strive to be AI technology, and does not depend on the existence of such technology.")

    through our Evoos (cybernetic, logic, SoftBionic (SB), ontology-based, networked Cognitive-Affective Personality or Processing System (CAPS)).

    Our OS

  • integrates or extends our Evoos with more basic properties,
  • makes the features of our Evoos more explicit, specifically in relation to the fields of
    • logics,
    • graph-based,
    • ontology-oriented,
    • self-representation,
    • digital twin or cybernetical twin (cyber-twin),
    • specification,
    • modeling, analysis, and simulation,
    • smart contract transaction protocol,
    • blockchain technique,
    • realities,
    • etc.,

    and

  • solves all the hard problems with for example
    • formal modeling, formal analysis, and formal proving,
    • Distributed Computing (DC), specifically
      • Peer-to-Peer (P2P) Computing (P2PC) and
      • Massively Multiplayer Online Game (MMOG),
    • Service-Oriented technologies (SOx),
    • Space-Based technologies (SBx),
    • and so on,

    that prevented the breakthroughs of its precursors.

    Correspondingly, one could also refer to CPS as

  • Cybernetics 2.0 (as we did herewith),
  • Cybernetical Intelligence (CI or CybI) (as we also did herewith), or
  • Ubiquitous Computing 2.0 (UbiC 2.0) (as we did before)
    • Internet of Things 2.0 (IoT 2.0) (as we also did before).

    But at least with

  • ontology or Semantic (World Wide) Web (SWWW),
  • smart contract transaction protocol,
  • blockchain technique and Non-Fungible Token (NFT),
  • Byzantine resilience protocol, or
  • multiple realities respectively New Reality (NR)

    it is our OS with its Caliber/Calibre and Ontoverse anyway.

    Around the year 2006, there was a certain difference between the field of CPS and the other related fields, such as

  • Industrial Control System (ICS),
  • Distributed Control System (DCS),
  • Network(ed) Embedded System (NES),
  • Autonomous System (AS),
  • Robotic System (RS),
  • Physical Computing System (PCS or PhyCS), and
  • Ubiquitous Computing System (UCS or UbiCS)
    • Internet of Things (IoT),

    and also

  • Computational Physics (CP, ComP, or CPhy).

    But these other related fields still have their names, definitions, and scopes, and related systems are classified accordingly.
    For example, the

  • ICS with its subfields and
  • "first official definition of Smart Grid was provided by the Energy Independence and Security Act of 2007 (EISA-2007), which was approved by the US Congress in January 2007"

    were connected with CPS and even classified as such several years after the presentation of our Evoos.

    The same holds for other prominent examples of CPSs, such as the fields of

  • autonomous automotive system, autonomous automobile, or self-driving car,
  • distributed robotics,
  • medical monitoring system, and
  • automatic pilot avionics,

    which were only classified as CPS later.

    This retrospective view applies not only to the classifications of systems as CPS, but also to the connections of fields with CPS, such as for example

  • cybernetics,
  • bionics and biomimetics,
  • Artificial Intelligence (AI),
  • Machine Learning (ML),
  • Artificial Neural Network (ANN),
  • Soft Computing (SC) and Computational Intelligence (CI) (guess where the idea for the designation SoftBionics (SB) came from), and
  • Evolutionary Computing (EC), and also
  • reflective programming and reflective system,

    and even with the

  • autonomous robotic automobile with artificially intelligent and self-aware cybernetic logic module (named AutoBrain™ by us) or supercomputer on wheels (named Ontoscope on Wheels by us) called Knight Industries Two Thousand (K.I.T.T.) of the Knight Rider saga (guess who with what revived the television series and why such an SB system is called a brain),

    all of which were considered as esoteric and weird fields and items, and hence taboos for most scientists and engineers, because talking about them or even taking them seriously was considered a risk of damaging and wrecking one's own reputation and career at that time.

    In this regard, we always mention that we referenced them all at that time, despite said taboos and risks, to set another indisputable and unforgeable mark of originality and uniqueness in relation to our Evoos and OS.

    Specifically very strange and striking in this context are the facts that the

  • first workshops in 2006 began only a few days before the presentation of our OS, which could be anticipated by the registration of our new World Wide Web (WWW) domains, but was anticipated by ordinary espionage in the 7 years before, specifically by monitoring our traffic on the Internet and in the World Wide Web (WWW) since at least 1998,
  • term cyber-physical was only used since 2006, instead of the term physical computing still used in 2005 by the same prominent scientists in these fields,
  • first publications related to the new term CPS only introduced the change or transition to this allegedly new field of CPS without becoming concrete, or defining and delimiting it,
  • first quoted position paper discusses an actuator network system in a way that equals matter already existing with either the subsumption architecture or our Evoos applied to the field of Multi-Agent System (MAS), while the second quoted position paper does not mention actuators at all, which is supported by other sources related to Sensor Network (SN) systems, which even state clearly that networks of actuators comparable to networks of sensors are not known at all due to considerably higher demands,
  • involved entities did not know what to do at all around September 2006 and tried to find out what to do in workshops and summits conducted in the months September to December of the year 2006 and in the following years in some kind of a state of panic ("Due to time limitations [...]") and needed several years for conclusions and directions, as can be seen by the dates, in total contrast to us with the original and unique works of art titled Evolutionary operating system and Ontologic System, and created by C.S.,
  • following workshops and more activities added the missing things and concretized, defined, and delimited the subject, matter, content, scope, etc. of the field of CPS only in the following months and years after October 2006, with for example a large portion of our production paradigm, called Industry 4.0 by plagiarists in 2011, by copying the related parts of our OS with its Industry 5.0 (Industry 4.0 and Mixed Reality (MR) as part of our Ontoverse), and
  • research and development in the field of CPS was still in its infancy in 2011 and is still going on.

    Eventually, the U.S.American National Science Foundation (NSF), other research institutes, scientists, and many other involved entities around the world, including the authors of the quoted documents

  • merely used other words to describe the same expression of idea created with our Evoos, which is generally and legally viewed as an editing of an original and unique expression of idea, but created no new and own expression of idea,
  • provided no sufficient and substantial evidences for claiming prior art in relation to CPS and also related fields, but provided us many significant and substantial evidences underpinning our claim, as we also showed in the Clarification of the ... and ... in relation to the activities and materials of the NSF,
  • only presented and stole the topic, but not the work with its expressions of ideas itself, because
    • one of our masterstrokes was to make our Evoos (the initial step of a totally new kind of) a self-portrait of C.S.,
    • a further masterstroke was to make our OS the successor of our Evoos to get back at the front of the line, and
    • another masterstroke was to extend the cybernetical world and replace the intersection of logical and physical worlds with its link between computational and physical elements with the fusion of these 2 worlds and all other worlds by our Caliber/Calibre of our New Reality (NR) as part of our Ontologic System (OS) with its Ontoverse, while
    • the rest is our history and legacy anyway,
  • only discussed the designation Cyber-Physical System about which we even wondered around the year 2011, when it was used by the plagiarists in relation to Industry 4.0,
  • introduced an allegedly new field with CPS for matter already researched and developed for and created with our Evoos and OS in order to classify other older fields in it and then to continue the fraud based on matter related to these older fields,
  • only classified the exemplary CPSs in the retrospective after the presentation of our Evoos and OS, and
  • none of the exemplary CPSs of the fields of
    • Industrial Control System (ICS), including
      • Wireless Sensor Network (WSN) system,
    • Networked Embedded System (NES),
    • Autonomous System (AS),
    • Robotic System (RS),
    • Physical Computing System (PCS or PhyCS), and
    • Ubiquitous Computing (UbiC) or Pervasive Computing (PerC), including
      • Internet of Things (IoT)

    was designated as a CPS at that time.

    But obviously, doubtlessly, and definitely, CPS {2.0} already has its roots in our Evoos and is not only a part of our OS and therefore the simple conclusions are that the related basic properties, features, and functionalities of our

  • Evoos created and defined CPS {2.0} and also AC respectively the true CPS 1.0, and
  • OS created and defined the true CPS 2.0 and also UbiC 2.0, IoT 2.0, NES 2.0, ICS 2.0, DCS 2.0, SCADA 2.0, and SN 2.0, including Industrial Internet of Things (IIoT) and Industry 4.0 and 5.0, and the transitions from their first generations to their second generations.

    We think it is fair to claim the

  • whole field of CPS {2.0}, including our production paradigm, which is based on ontology and designated as Industry 4.0 by plagiarists and Industry 5.0 (Industry 4.0 and Ontoverse) by us, and
  • designation Cyber-Physical System {2.0} and its acronym CPS {2.0} for the related part of our OS, including the related part of our Evoos,

    without further thinking and discussing about the matter anymore.

    Because of this differentiation, we are also considering whether to

  • {?} separate CPS from UbiC, IoT, and NES, and
  • use the term Ontologic Net of Things (e.g. NES, DES, UbiC, IoT, SN, CPS), or another term

    when listing and discussing different fields.

    At the end, we would like to recall once again that our New Reality (NR) with its foundational Caliber/Calibre

  • includes the
    • integration of Wireless Sensor Network (WSN) with semantic support for space and time, and
    • intersection of logical and physical worlds,

    but is different to them, because it

  • fuses all kinds of reality, including
    • eXtended Reality (XR),
    • Mixed Reality (MR), including
      • Augmented Reality (AR),
      • Augmented Virtuality (AV), and
      • Virtual Reality (VR),
    • Simulated Reality (SR or SimR), and
    • Synthetic Reality (SR or SynR),

    and in this way

  • does not differentiate between the locally physical but globally virtual aspects of a system in the sense that a
    • system can also have locally virtual and globally physical aspects and
    • virtual environment or substratum will also shape the abstractions as well as the virtual network of embedded sensors, actuators, computing and communicating devices,
  • extends the notions of
    • reality-virtuality continuum of MR by including both ends of the spectrum, which is also wrongly called eXtended Reality (XR),
    • MR, specifically AR, by integrating MR and CPS as well, and
    • CPS by integrating CPS and the physical theories of the observable universe,

    and eventually

  • does not differentiate between physical and virtual worlds anymore, but
  • generalizes the field of physics beyond the observable universe by integrating physics and ontonics respectively physics and the integrations listed above.

    See also for example the webpage Ontologic System Calibre Ontoscope of the website of OntoLinux.

    Ontonics Further steps

    We got at least 3 takeover offers for commercial banks, which would become our OntoBank with at least 130 of the promised branches worldwide.
    Please note that our OntoBank does not provide financial services for private end users or consumers, and other customers of commercial banks and Financial Technology service providers (FinTechs), but for our Societies and their members, which are also artwork and technology licensing partners.

    In addition, we got a takeover offer for a commercial building in Genf==Geneva, Schweiz==Switzerland, which would become another headquarters of either our

  • OntoBank, Ontologic Exchange (OEx, OntoEx, or OntoExchange), or Ontologic Bank Financial Information and Communications (OBFIC or OntoBankFinIC), or
  • SOPR or another one of our Societies.

    Welcome to the Ontoverse, including the original and unique, one and only metaverse multiverse. :)


    19.July.2021

    Comment of the Day

    "Software is harder than hardware.", [Old saying in the Information and Communications Technology (ICT) industrial sector]

    Accordingly:
    "A wall of bits is harder than a wall of steel.", [C.S., Today]

    Erecting a wall never was, is, or will be a win-win.
    Correspondingly:
    Erecting a wall of bits would have effects on worldwide manufacture places, supply chains, and sales markets, as well as global and local infrastructure and anti-corruption initiatives and also initiatives to reduce social inequality and discontent, and promote collective prosperity.


    20.July.2021

    Clarification

    We would like to clarify that the field of Process Mining (PM) and the accompanying Process Query Language (PQL) were also created as part of our Ontologic System (OS).

    We quote an online encyclopedia about the subject process mining: "Process mining is a family of techniques relating the fields of data science and process management to support the analysis of operational processes based on event logs [or records of events]. The goal of process mining is to turn event data into insights and actions. Process mining is an integral part of data science, fueled by the availability of data and the desire to improve processes.[1 [Process Mining: Data Science in Action. [2016]]] Process mining techniques use event data to show what people, machines, and organizations are really doing. Process mining provides novel insights that can be used to identify and address performance and compliance problems.
    [...] Process mining uses these event data to answer a variety of process-related questions. Process mining techniques such as process discovery, conformance checking, model enhancement, and operational support can be used to improve performance and compliance.[2 [Process Mining: Data Science in Action. [2011 2016]]]
    [...]
    There are three main classes of process mining techniques: process discovery, conformance checking, and performance analysis (also called extension, see below). In the past terms like Workflow Mining and Automated Business Process Discovery (ABPD) [3] were used.
    [...]
    Process mining techniques are often used when no formal description of the process can be obtained by other approaches, or when the quality of existing documentation is questionable. Per [a fraudulent consulting company, that is refusing to reference our work of art titled Ontologic System and created by C.S.], Process Mining is a subset of hyperautomation.[4]
    [...] Process mining is different from mainstream machine learning, data mining, and artificial intelligence techniques. For example, process discovery techniques in the field of process mining try to discover end-to-end process models that are also able to describe concurrent behavior. Conformance checking techniques are closer to optimization than to traditional learning approaches. However, process mining can be used to generate machine learning, data mining, and artificial intelligence problems."
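    The process-discovery step mentioned in the quote can be sketched generically as the construction of a directly-follows graph from an event log. The function name and the toy event log below are our own illustration, not part of any specific process-mining product:

```python
# Minimal sketch of process discovery: count how often activity a is
# directly followed by activity b across all cases of an event log.
from collections import Counter

def directly_follows(event_log):
    """Return a Counter of (a, b) pairs where b directly follows a in a trace."""
    df = Counter()
    for trace in event_log:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

# Toy event log: each inner list is one case (its activities in order).
log = [
    ["register", "check", "approve", "pay"],
    ["register", "check", "reject"],
    ["register", "check", "approve", "pay"],
]

dfg = directly_follows(log)
# ("register", "check") occurs in all 3 cases, ("approve", "pay") in 2.
```

    Such a directly-follows graph is the usual starting point for discovery algorithms; conformance checking then compares the observed relations against a reference model.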

    We also quote a website about the Process Query Language (PQL): "Process Query Language (PQL) is a special-purpose programming language for managing process models based on information about process instances that these models describe. PQL programs are called queries.
    PQL is a declarative language that is based upon temporal logic. Temporal logic is an extension of traditional propositional logic with operators that refer to the behavior of systems over time. These behavioral operators, called predicates in PQL, provide PQL with a mathematically precise means for expressing properties about the relation between activities and events in process instances.
    The concrete syntax of PQL is inspired by [Structured Query Language (]SQL[)] - a programming language for managing data stored in relational database management systems. The rationale behind this design decision is threefold:
    1. PQL and SQL serve the same overarching purpose - retrieval of information.
    2. SQL is a widely used standard that is well-recognized by technical specialists.
    3. The concrete syntax of SQL was often recommended for PQL by the interviewed process analysts.

    [...]

    The PQL Bot
    The PQL bot is a standalone utility that can be used to systematically index models stored by the PQL tool. Once a model is indexed, it can be matched to a query using the PQL tool. One can start multiple PQL bot instances simultaneously to index several models in parallel.
    A call to the PQL indexing routine takes a workflow system described in the Petri Net Markup Language (PNML) format as input. The PNML format is an XML-based syntax for high-level Petri nets, which has been designed as a standard interchange format aimed at enabling Petri net tools to exchange Petri net models. For many high-level process modeling languages, such as WS-BPEL, EPC, and BPMN, there exist mappings to the Petri net formalism. As a result, the PQL environment can work with models developed using a wide range of modeling tools captured using many main stream notations."
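    The temporal-logic flavour behind such behavioural predicates can be sketched as follows. The predicate names ("eventually", "precedes") and the sample trace are our own hypothetical illustration, not actual PQL syntax:

```python
# Sketch of two temporal predicates over a single process trace.
def eventually(trace, activity):
    """True if the activity occurs somewhere in the trace."""
    return activity in trace

def precedes(trace, a, b):
    """True if every occurrence of b has at least one earlier occurrence of a."""
    seen_a = False
    for x in trace:
        if x == a:
            seen_a = True
        elif x == b and not seen_a:
            return False
    return True

trace = ["register", "check", "approve", "pay"]
# eventually(trace, "pay") holds; precedes(trace, "check", "approve") holds;
# precedes(["approve", "check"], "check", "approve") does not hold.
```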

    Comment
    Of course, others integrated the Entity-Relationship Model (ERM) with the Petri-Net Entity-Relationship Model (PNERM) before.
    But we also integrated

  • log-based systems
    • Data Base Management Systems (DBMSs) and
    • File Systems (FS),
  • logics
    • Linear Logic (LL),
    • Temporal Logic (TL),
    • Linear Temporal Logic (LTL), and
    • other logics,
  • Ontology (O or Onto),
  • programming paradigms
    • Logic Programming (LP),
    • Concurrent Computing Programming (CCP),
    • Concurrent Logic Programming (CLP or ConcLP),
    • Constraint Programming (CP),
    • Constrained Logic Programming (CLP or ConsLP),
    • Concurrent Constrained Logic Programming (CCLP), and
    • other programming paradigms,
  • Structured Entity-Relationship Model (SERM),
  • ERM-PNM,
  • Model Checking (MC),
  • Data Mining (DM),
  • Robotic Process Automation of the third generation (RPA 3.0) or RPASB (RPA with more SoftBionics (SB)) or hyperautomation
    • Process Mining (PM),
  • Operations Management (OM)
    • Business Process Management (BPM)
      • Workflow Management System (WMS) and
      • other process-aware information management,
  • Quality Management (QM)
    • control loop-based QM and
    • Total Quality Management (TQM) system,
  • Product Lifecycle Management (PLM)
    • Computer-Aided Software Engineering (CASE),
  • and so on.

    Obviously,

  • PM
    • is some kind of data mining and model checking of process logs,
    • was created with our OS, and
    • is classified in the field of RPA 3.0 or hyperautomation, which "is the application of advanced technologies like RPA, Artificial Intelligence [(AI)], [M]achine [L]earning (ML) and Process Mining to augment workers and automate processes in ways that are significantly more impactful than traditional automation capabilities",

    and

  • PQL is inspired by our multiparadigmatic query and programming language included in our Ontologic Programming (OP) paradigm (see the Clarification of the ... and ... of 2007 and 2008(?)), but not by SQL.

    We also have already proven that hyperautomation is a part of our OS (see the Investigations::Multimedia, AI and KM of the 7th of March 2021).
    Therefore, there is absolutely no doubt that this field of PM and the related technologies, goods, and services are in the legal scope of ... the OntoLand respectively parts of our original and unique, copyrighted, and prohibited for fair use and democratization work of art titled Ontologic System, created by C.S., and exclusively managed and exploited by our Society for Ontological Performance and Reproduction (SOPR) with the consent and on the behalf of C.S..

    Due to the relation to the Structured Entity-Relationship Model (SERM), which can be seen in an illegal CASE tool by the arrangement of exemplary PNERMs from left to right according to an existential technique, it is no surprise at all that our original and unique PQL is also related to products and services of the company SAP.

    With the works, which we referenced in the sections Formal Verification, Formal Modeling, Intelligent/Cognitive Agent, and the other related sections of the webpage Links to Software of the website of OntoLinux, and our integrating Ontologic System Architecture (OSA) one can do everything in relation to every hardware, software, and business process.


    21.July.2021

    Roboticle Further steps

    On the 17th of July 2021, we said we had something to remove big goldfishes.

    Today, we are back and would like to reveal for the sake of better water quality worldwide, that our solution is the waterlife terminator, which is a submersible device, which again

  • can be stationary or mobile, such as an Unmanned Underwater Vehicle (UUV) operated with or without cable to the water surface for the supply of energy and ammunition, and also the control and monitoring,
  • is able to work in every water condition,
  • is able to recognize any water animal, including fishes, such as a big goldfish, snakes, frogs (of course no harm to Kermit), snails, and so on, as required for a cleaning mission, and also
  • features a means for the execution of a lethal effect, such as a laser gun, to exterminate and raze exotic wildlife in waters and wetlands.

    Autonomous Systems (ASs) and Robotic Systems (RSs) with laser guns of this type are already utilized to automatically and highly effectively remove parasites from fishes in large fish farm tanks.
    We just increased the power to make it deadly.

    In case of a goldfish it

  • is also called the Goldinator,
  • has the size of a soccer ball and the shape of a large goldfish, and
  • has the ability to move with the same way of silent fish-like locomotion to be even more effective, if required.

    A swarm or flock of Goldinators can be scaled in number as required to accomplish every mission in reasonable time.


    22.July.2021

    Comment of the Day

    "Explain
    why you came to OntoLand,
    why you needed decades to come to OntoLand, and
    why you took paths that all lead to OntoLand.", [C.S., Today]

    If one sneaks after us through unknown territory for years and finally reaches a fence with a gate and a sign saying "Property of Us", then there is no right of way.

    Every entity that has copied our Ontologic System (OS) had, has, or will have the problem sooner or later that it needs our integrating Ontologic System Architecture (OSA). And when this happened, happens, or will happen, then we demand the answers.

    Style of Speed Further steps

    Many years ago, we developed a whole new generation of marine technologies and vessels, which revolutionize the fields of logistics, transport, travel, and mobility as well.

    One type of vessels is designed and optimized for the transport of gaseous substances, such as hydrogen and natural gases, including cow farts.
    But what makes our gas carriers even more interesting is that their

  • emission of operation has been reduced in relation to the liquefaction process of the gas and the propulsion of the carrier, and might be even lower than the emission of operation of a pipeline, and
  • cost of operation is on par with pipelines having a length of several thousands of kilometers or miles.

    For example, a fleet of just 30 Liquified Natural Gas (LNG) carriers with a capacity of 260,000 m³ each would be able to transport around 7,800,000 m³ of LNG per tour, around 15,600,000 m³ of LNG per month (2 tours), and 187,200,000 m³ of LNG per year over a distance of up to 8,400 kilometers (distance between the East Coast and Belgium 6,300 km and between Houston and Belgium 9,000 km).
    Please note that LNG takes up about 1/600th the volume of natural gas in the gaseous state (at standard conditions for temperature and pressure).
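    The figures above can be checked with a short back-of-the-envelope calculation; all inputs are taken from the text, and the last line applies the roughly 1/600 LNG-to-gas volume factor stated above:

```python
# Back-of-the-envelope check of the stated LNG fleet capacity.
carriers = 30
capacity_m3 = 260_000                  # LNG capacity per carrier

per_tour = carriers * capacity_m3      # 7,800,000 m³ of LNG per tour
per_month = 2 * per_tour               # 15,600,000 m³ of LNG per month (2 tours)
per_year = 12 * per_month              # 187,200,000 m³ of LNG per year

# Approximate gaseous equivalent, using the ~1/600 volume ratio.
gas_equiv_per_year = per_year * 600    # ≈ 112.3 billion m³ of natural gas
```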


    23.July.2021

    King Smiley Further steps

    Some weeks ago, we developed a list of requirements and measures in relation to the restoration of the Palazzo Sacchetti in Rome, Italy:

    General

  • acquisition of the entire building
  • free hand respectively good mood and attitude, but no unreasonable interference of the authority for the protection of monuments

    Exterior

  • utilization of mining technologies to stabilize the foundation and raise parts of the building, if necessary, in particular to prevent further damages to the frescoes, realign the building, and protect the building and its interior against earthquakes
    This is quite tricky, because besides the large size of the palace one can easily guess that further damages of the frescoes have to be avoided and there will be some archaeological research at the excavation site.
  • improving the basic structure of the palace and removing the braces at the facade
    This is tricky, because it might change some details of the exterior.
  • repair or replacement of broken elements at the facade and the patchwork masonry added over the centuries by bad repairs and building alterations
    This is tricky, because the broken elements are original elements.
  • redesign of the cortile==courtyard, whereby the original architecture would remain, for example by adding a
    • glass roof without or with terrace on top,
    • glass ceiling, or glass swimming pool, or both above the piano nobile, and
    • new interior area on the ground,
  • add a second galleria on the ground of the giardino on the side of the Vicolo del Cefalo, whereby the original balcony of the Stanza di Mosè would be moved to the side of the Tevere==Tiber
  • replace the unharmonic elements of the roof, which are not original
  • replace the erased coat of arms of Pope Paul III at the front with another coat of arms

    Interior

  • restoration of the frescoes, sculptures, ceilings, doors, and so on, whereby lost parts or badly repaired areas should be replaced with copies of other areas, if reasonable
  • new house infrastructure, including electric wiring, plumbing, and Heating, Ventilation, and Air Conditioning (HVAC)
  • new floor with heating and electric wiring by retaining the original design style in most rooms, maybe retaining the original floor in the Stanza di Romolo, Stanza di Salomone, and Sala dei Mappamondi
    See for example, the other palaces, such as the Palazzo Colonna (Galleria Colonna and Salone della Capella), Palazzo Farnese, and Palazzo del Quirinale (Sala degli Ambasciatori, Sala Gialla, and Sala di Augusto).
    This could be done in a way, which would allow its reversal, though we do not think that this would happen due to the overall improvement.
  • new window glasses with filters, that protect the frescoes and other works of art from UltraViolet (UV) light and other electromagnetic radiation, if required
    This is also tricky, because it might require the installation of new window frames as well, specifically if thermal insulation should be added.
  • restoration of the staircase, if required
  • new tapestry, if not original
  • repainting the ceiling of the Sala dei Mappamondi, if required

    Garden and annex

  • redesign of the garden, whereby the existing arrangement would remain, if original
  • lift of the Ninfeo==nymphaeum in the garden onto a new substructure to protect it from flooding, level it with the street, or lift it to an even higher level and connect it with the terrace, and eventually make it useable again
  • take down the four ugly heads from the corners of the roof of the Ninfeo, if not original
  • new wall around the garden, if reasonable
  • add a swimming pool, for example in the new substructure of the Ninfeo as another grotto

    As one can see, the overall cost will potentially be between 2.5 to 3.5 times more than the price of the building itself. But then it is more perfect than perfect and more beauty than beauty.

    Honestly, we are not sure about this project, for the reasons that the palazzo is not exactly what we want and it is not more favourable than our two alternative projects in this city.

    In relation to one of these two alternative projects, we reworked and expanded the architecture of one related real estate.


    26.July.2021

    Ontonics Further steps

    We noticed that recent developments in 2 different fields coincide with another relatively young development and a relatively old project developed by one of our business units, which is already related to one of the 2 different fields.
    Having made this observation and conclusion we worked on the resulting overall concept.

    It will be quite interesting to see whether this development solves a problem that has persisted for more than 15 years now. If it does, then we would call it revolutionary.


    31.July.2021

    King Smiley Further steps

    For many years already, we have also been developing and designing environments for recreation and sports, such as

  • skateparks made for skateboard, scooter, bicross (Bicycle MotoCross (BMX)), wheelchair, and inline skating, and
  • waterparks.

    Several months ago, we developed and designed a waterpark, which has some improved features and facilities in comparison to existing waterparks and might develop into the new standard, though the intention is to construct them in the gardens respectively private parks of some of our real estate properties.
    Specifically our magic lazy river might become interesting.
    Some of these improved features and facilities are based on technologies of other business units of our corporation.

    Over the last weeks, we have also developed and designed two skateparks, which have unique features and facilities, and might develop into a larger movement, literally speaking.
    Some of these new features and facilities are based on technologies of other business units of our corporation.

    Roboticle Further steps

    For sure, our waterlife terminator (see the Further steps of the 21st of July 2021) can also be configured as the Urchinator to terminate for example purple sea urchins and in this way

  • perform the work usually done by starfishes,
  • rescue the kelp forests, and
  • rebalance the ecosystem

    along the coast from Alaska, U.S.America, to Baja California, Mexico, for example.

    Alternatively, a different model can collect purple sea urchins for the food industry.

    The image below shows 3 Autonomous Underwater Vehicles (AUVs) of a fleet, swarm, or school of coordinated autonomous underwater drones or robotic submersibles, which are

  • based on an older and outdated model of the year 2016, which again is
    • based on our
      • field of SoftBionics (SB) (e.g. Artificial Intelligence (AI), Machine Learning (ML), Computational Intelligence (CI), Artificial Neural Network (ANN), Evolutionary Computing (EC), Computer Vision (CV), Simultaneous Localization And Mapping (SLAM), Soft Computing (SC) and Autonomic Computing (AC), Natural Language Processing (NLP), Cognitive Computing (CogC), Cognitive Agent System (CAS), Cognitive-Affective Personality or Processing System (CAPS), Swarm Intelligence (SI) or Swarm Computing (SC), etc.) and
      • OntoBot and OntoScope components of our Ontologic System (OS),

      and

    • able to make decisions on the fly in 4D (3D space plus 1D time),

    and

  • constructed and operated by the National Aeronautics and Space Administration (NASA) together with other institutions.

    National Aeronautics and Space Administration Jet Propulsion Laboratory → fleet, swarm, or school of coordinated autonomous underwater drones or robotic submersibles with SoftBionics (SB) (e.g. AI, ML, ANN, CV, etc.)
    © National Aeronautics and Space Administration Jet Propulsion Laboratory (NASA JPL)

    © or ® or both
    Christian Stroetmann GmbH
    Disclaimer