Behavior Creation Toolkit
Fully Functional Visual Programming Language
As with every Discovery Machine product, the BCT is equipped with the company's proprietary visual programming language. Because the language is visual, it keeps experts engaged throughout the development process, lets them critique behavior models as they are built, and makes those models easy to adapt both during development and after deployment. The visual structure is also available at runtime, improving understanding and debugging for instructors and developers alike.
Integration with the DIRECTOR
Discovery Machine’s latest product enhancement is the DIRECTOR, a fully customizable tool for real-time monitoring of a scenario. With it, instructors can monitor scenario progress, record details about key events, view alerts, and inject changes into a scenario without restarting it.
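The live-injection idea can be sketched as a scenario loop that drains a queue of instructor events each step. This is a minimal illustration of the pattern only; the `Scenario` class, its methods, and the event string are hypothetical, not the DIRECTOR's actual interface.

```python
import queue

class Scenario:
    """Toy scenario loop that accepts changes while running (illustrative)."""

    def __init__(self):
        self.injections = queue.Queue()  # instructor-side channel
        self.log = []

    def inject(self, event):
        """Called from a monitoring tool while the scenario runs."""
        self.injections.put(event)

    def tick(self):
        """One step of the loop: apply any pending injections, then advance."""
        while not self.injections.empty():
            self.log.append(("injected", self.injections.get()))
        self.log.append(("tick", None))

s = Scenario()
s.tick()                   # scenario running normally
s.inject("weather:storm")  # instructor changes conditions mid-run
s.tick()                   # the change takes effect without a restart
print(s.log)
```

Because injections are applied at the top of each tick, the running scenario picks up the change on its next step rather than requiring a reload.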
Runtime Behavior Engine
The Discovery Machine behavior engine supports two-way communication between the simulation environment and the engine itself: behavior models can retrieve information from the simulation, reason over it, and send commands back. This loop gives every entity in the simulation situational awareness, so a BCT integration can produce truly adaptive, intelligent units.
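The two-way loop described above amounts to a sense-infer-act cycle. The sketch below illustrates that cycle under stated assumptions: `SimLink`, the percept key, and the command names are invented for this example and do not reflect the BCT's real API.

```python
from dataclasses import dataclass, field

@dataclass
class SimLink:
    """Stands in for the simulation side of the two-way channel (hypothetical)."""
    state: dict = field(default_factory=dict)
    commands: list = field(default_factory=list)

    def read(self, key):
        # Behavior model retrieves information from the simulation.
        return self.state.get(key)

    def send(self, command):
        # Behavior model sends a command back to the simulation.
        self.commands.append(command)

def behavior_tick(link: SimLink):
    """One cycle: sense the environment, infer, act on the inference."""
    threat_range = link.read("nearest_threat_range")  # sense
    if threat_range is not None and threat_range < 5.0:
        link.send("evade")                            # infer + act
    else:
        link.send("continue_patrol")

link = SimLink(state={"nearest_threat_range": 3.2})
behavior_tick(link)
print(link.commands)  # ['evade']
```

Running this cycle every simulation step is what lets an entity react to a changing environment instead of replaying a fixed script.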
Behavior Builder Framework
Perfected through a series of research projects with the United States Navy, the Behavior Builder lets users leverage expert-validated content through simple-to-follow user interfaces. The BCT integrates with this same proven technology, so users can build on that content with the Behavior Builder.
Access to Communication Server
The Discovery Machine communication server lets users develop training systems that use voice commands and verbal responses. Through the BCT’s architecture, users can integrate with the communication server so that trainees speak directly to Discovery Machine behavior models, the models adapt accordingly, and the trainees hear realistic verbal responses.
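The round trip described above, from a recognized trainee utterance, through a behavior adaptation, to a spoken acknowledgment, can be sketched as a simple command dispatch. The command table, state keys, and reply format here are illustrative assumptions, not the communication server's actual protocol, and real speech recognition and synthesis are left out.

```python
# Hypothetical command table mapping recognized utterances to model updates.
COMMANDS = {
    "come left to heading 270": ("heading", 270),
    "all ahead full": ("speed", "full"),
}

def handle_utterance(text, state):
    """Adapt the behavior model's state and compose a verbal acknowledgment."""
    action = COMMANDS.get(text.lower())
    if action is None:
        return "Say again, over."
    key, value = action
    state[key] = value                   # behavior model adapts
    return f"{text.capitalize()}, aye."  # response handed to speech synthesis

state = {}
reply = handle_utterance("Come left to heading 270", state)
print(reply)  # Come left to heading 270, aye.
```

In a full system the recognized text would come from the speech front end and the reply would be synthesized back to the trainee; the dispatch step in between is the part the behavior model owns.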
Integration API and Developer Documentation
Integration between the BCT and your simulation is covered in detail in the included integration API and developer documentation, which walks through several integration examples. Development services are also available to assist with initial integration and content creation.
Integrating the BCT enables instructors and trainees to create behavior models that control entities in any simulation with state-of-the-art AI. Each behavior model is visual in nature and can be authored through user interfaces instead of scripting; as a result, behavior models are transparent to the user, increasing trust in their abilities. Behaviors can be structured to be situationally aware, goal-directed, and reactive to changes in their environment. Because behavior models created with the BCT can communicate with other AI entities and human operators via both text and voice commands, they are true to life not only in how they behave, but in how they interact with the world around them.