
Modelling Library

[Architecture diagram: Platform Users (engineers and low-code ops users) → ORA AI planning interface and SDK developer kits (FDK, DDK, MDK, SCDK) → SDK API GraphQL Federation Gateway → domain microservices → Nexus-deployed OR applications (Rail Ops, Mine Mgmt, Port Ops dashboards) → Application Users (operations teams)]

The MDK includes a library of specialist models and services that can be used as building blocks in your workflows. These models span simulation engines, optimization solvers, AI models, and integration services — each designed to solve specific classes of operational problems.

Available Models

Simulation & Optimization

These models provide core computational capabilities for physical system simulation and optimization problems.

Model | Type | Description
Traffic Model | Simulation | Network-level traffic simulation for transport and logistics systems. Models traffic flow, congestion propagation, and network interactions.
Schedule Generation | Optimization | Automated schedule generation and optimization for operational planning problems including resource allocation and timing optimization.
Linear Systems | Solver | High-efficiency solver for linear systems and ordinary differential equations. Supports second-order oscillators, spring-damper systems, and statistical sampling.
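As a plain-Python illustration of the class of problem the Linear Systems solver handles, the sketch below integrates a damped spring-mass oscillator with explicit Euler steps. The function name and parameters are illustrative only; the platform service uses its own interface and more efficient numerical methods.

```python
# Illustrative only: a spring-damper system x'' = -(k/m) x - (c/m) x',
# integrated with explicit Euler. The Linear Systems solver exposes this
# class of problem as a service; names here are hypothetical.
def simulate_spring_damper(m=1.0, k=4.0, c=0.5, x0=1.0, v0=0.0,
                           dt=0.001, steps=10000):
    x, v = x0, v0
    trajectory = [x]
    for _ in range(steps):
        a = -(k / m) * x - (c / m) * v  # acceleration from spring + damper
        v += a * dt
        x += v * dt
        trajectory.append(x)
    return trajectory

traj = simulate_spring_damper()
# Damping should shrink the oscillation envelope over time.
print(f"start={traj[0]:.2f} end={traj[-1]:.3f}")
```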

AI & Machine Learning

AI-powered models for forecasting, decision-making, and intelligent automation.

Model | Type | Description
AI Agent Python | AI Agent | Python-based autonomous agent service for intelligent decision-making, task automation, and agentic workflows.
Tiny Time Mixers | Forecasting | Time-series forecasting model for operational demand prediction and state forecasting using the IBM Tiny Time Mixers foundation model.
Stats Models | Statistics | Statistical modeling service for regression analysis, inference, and probabilistic modeling.

Integration Services

Models that connect external data sources and services into your workflows.

Model | Type | Description
Data Ingestion | Data Pipeline | Streaming and batch data ingestion service for bringing operational data into workflows.
Data Loader | Data Pipeline | Data loading and preprocessing service for preparing structured inputs to model workflows.
TomTom API | External API | Geospatial data integration service providing real-time traffic data and routing information from TomTom.
SAP API | External API | SAP system integration for connecting enterprise resource planning data into operational workflows.

Infrastructure

Platform services that support AI agents and advanced workflow capabilities.

Model | Type | Description
MCP Server | Infrastructure | Model Context Protocol server providing AI agents with structured access to platform tools and data.

How to Use Models in Workflows

1. Browse Available Models

Models appear in the workflow builder's component library. Each model displays:

  • Model name and description
  • Input requirements
  • Parameter options
  • Output structure

2. Add Models to Your Workflow

Drag models from the library onto the workflow canvas. Each model becomes a task in your workflow.

3. Configure Task Parameters

Each task has configurable parameters that control its behavior:

  • Input fields — Data the model needs to run
  • Parameters — Configuration options (thresholds, algorithms, modes)
  • Output fields — Results the model produces
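A task configuration therefore has three sections. The sketch below shows the idea as a plain Python structure; the field names and schema are hypothetical, not the platform's actual configuration format.

```python
# Hypothetical task configuration; structure and field names are
# illustrative, not the platform's real schema.
task_config = {
    "task": "tiny-time-mixers-forecast",
    "inputs": {"history": "demand_history"},             # data the model needs
    "parameters": {"horizon": 24, "mode": "zero-shot"},  # configuration options
    "outputs": {"forecast": "demand_forecast"},          # results it produces
}

def validate_task(config):
    """Minimal check that the three configurable sections are present."""
    missing = {"inputs", "parameters", "outputs"} - config.keys()
    if missing:
        raise ValueError(f"task config missing sections: {sorted(missing)}")
    return True

print(validate_task(task_config))  # → True
```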

4. Connect Tasks

Draw connections between tasks to define data flow:

  • Upstream outputs automatically map to downstream inputs by field name
  • The Data Abstraction Layer handles data transfer between tasks
  • No custom integration code required
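The matching rule can be sketched in a few lines: a downstream input receives an upstream output when their field names coincide. The Data Abstraction Layer performs the real transfer; the example data below is invented.

```python
# Sketch of name-based wiring: upstream outputs flow to downstream inputs
# when field names match. The Data Abstraction Layer does the actual
# transfer; this function only illustrates the matching rule.
def map_by_name(upstream_outputs, downstream_inputs):
    return {name: upstream_outputs[name]
            for name in downstream_inputs
            if name in upstream_outputs}

traffic_out = {"congestion_index": 0.72, "mean_speed_kmh": 41.5}
schedule_in = ["congestion_index", "resource_pool"]

wired = map_by_name(traffic_out, schedule_in)
print(wired)  # only the matching field is wired through
```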

5. Run and Iterate

Execute your workflow to see results:

  • Test execution validates the workflow logic
  • Experiment execution enables systematic parameter variation
  • Results are stored for comparison across different configurations
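"Systematic parameter variation" amounts to expanding a parameter grid into one run configuration per combination. The Experiment Manager handles this for you; the parameter names below are invented for illustration.

```python
# A parameter grid expanded into run configurations, one per combination.
# Parameter names are hypothetical examples.
from itertools import product

grid = {
    "signal_timing_s": [30, 45, 60],
    "demand_scale": [1.0, 1.2],
}

runs = [dict(zip(grid, combo)) for combo in product(*grid.values())]
print(len(runs))  # → 6 run configurations
```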

Model Categories Explained

Simulation Models

Simulation models recreate real-world physical systems computationally. They answer "what if" questions:

  • What if we change the traffic signal timing?
  • What if we add more resources to this network?
  • What if demand increases by 20%?

Use simulation models when you need to understand system behavior under different conditions without affecting the real system.
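The comparative "what if" pattern looks like this in miniature: evaluate a stand-in model at baseline conditions and again under the changed scenario, then compare. Real simulation models are far richer; the capacity figures here are invented.

```python
# Toy "what if" comparison: a stand-in utilisation model at baseline
# demand and at demand increased by 20%. Numbers are illustrative.
def utilisation(demand, capacity=1000):
    return demand / capacity

baseline = utilisation(800)
scenario = utilisation(800 * 1.2)
print(f"baseline={baseline:.0%} scenario={scenario:.0%}")
```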

Optimization Models

Optimization models find the best solution to a problem given constraints and objectives:

  • What's the optimal schedule that minimizes delays?
  • How should we allocate resources to maximize throughput?
  • Which routes minimize total travel time?

Use optimization models when you have clearly defined objectives and constraints, and need to find the best configuration.
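The "objective plus constraints" pattern can be shown with a toy scheduling problem: order two jobs on one resource so that total lateness past their due times is minimal. Production schedule-generation models use proper solvers; brute force is used here only to make the idea concrete.

```python
# Toy optimization: order two jobs on one shared resource to minimize
# total delay past their due times. Brute-force search over orderings;
# real schedule-generation models use proper solvers.
from itertools import permutations

jobs = [{"name": "A", "duration": 3, "due": 4},
        {"name": "B", "duration": 2, "due": 3}]

def total_delay(order):
    t, delay = 0, 0
    for job in order:
        t += job["duration"]             # job finishes at time t
        delay += max(0, t - job["due"])  # lateness past its due time
    return delay

best = min(permutations(jobs), key=total_delay)
print([j["name"] for j in best], total_delay(best))  # → ['B', 'A'] 1
```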

AI & Forecasting Models

AI models learn patterns from data and make predictions or decisions:

  • What will demand be next week based on historical patterns?
  • Which maintenance action should we take based on current system state?
  • Is this pattern normal or anomalous?

Use AI models when problems are too complex for analytical solutions, or when learning from historical data is valuable.
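The shape of the forecasting task (history in, prediction out) can be shown with the simplest possible baseline, a trailing moving average. Models like Tiny Time Mixers learn far richer temporal patterns; the demand series below is invented.

```python
# Minimal forecasting baseline: predict the next value as the mean of a
# trailing window. Only illustrates the task shape, not a real model.
def moving_average_forecast(history, window=3):
    recent = history[-window:]
    return sum(recent) / len(recent)

demand = [100, 110, 105, 115, 120]
print(moving_average_forecast(demand))  # mean of the last 3 points
```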

Integration Services

Integration services connect external systems into workflows:

  • Bring real-time data from operational systems
  • Export workflow results to enterprise platforms
  • Access third-party APIs for enrichment

Use integration services to build end-to-end workflows that span your entire technology stack.

Building Custom Models

In addition to the pre-built models in this library, you can build your own custom models:

  1. Choose a language — Python, Julia, or Go
  2. Generate boilerplate — Use the MDK's model builder tool
  3. Implement your logic — Write the model's computational logic
  4. Deploy to the platform — Build and deploy as a container
  5. Use in workflows — Your model appears in the library alongside platform models

Custom models follow the same interface contract as platform models, so they integrate seamlessly into workflows.
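The contract can be sketched as a base class that declares inputs and outputs and requires a `run` method. This is a hypothetical rendering, not the MDK's actual generated interface, but it conveys the shape a custom model fills in.

```python
# Hypothetical interface contract: the MDK-generated interface differs,
# but the idea is the same — declare inputs/outputs, implement run().
class CustomModel:
    inputs = ["values"]
    outputs = ["total"]

    def run(self, payload):
        raise NotImplementedError

class SumModel(CustomModel):
    """A trivial custom model: sums its input values."""
    def run(self, payload):
        return {"total": sum(payload["values"])}

result = SumModel().run({"values": [1, 2, 3]})
print(result)  # → {'total': 6}
```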

Model Compatibility

All models in this library:

  • ✅ Work with any other model regardless of language
  • ✅ Support automatic input/output mapping
  • ✅ Integrate with the Data Abstraction Layer
  • ✅ Can be used in parallel or sequential workflows
  • ✅ Support caching for performance optimization
  • ✅ Provide OpenAPI documentation

Performance Considerations

Model Selection

Different models have different performance characteristics:

  • Lightweight models (stats, data loaders) execute in seconds
  • Medium models (time-series forecasting, route optimization) take minutes
  • Heavy models (large-scale simulation, complex optimization) can take hours

Design workflows with execution time in mind — enable caching on expensive models to avoid unnecessary re-execution.
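The effect of caching an expensive model can be demonstrated with `functools.lru_cache` as a stand-in for the engine's managed cache: a repeated call with the same parameters never re-executes the model.

```python
# Sketch of result caching for an expensive model call. The workflow
# engine manages the real cache; lru_cache just demonstrates the effect.
from functools import lru_cache

calls = []

@lru_cache(maxsize=None)
def expensive_model(param):
    calls.append(param)  # record actual executions
    return param * 2     # stand-in for hours of computation

expensive_model(10)
expensive_model(10)      # served from cache, no re-execution
print(len(calls))        # → 1
```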

Parallel Execution

When tasks are independent, the Workflow Execution Manager runs them in parallel:

  • Multiple lightweight models can run simultaneously
  • Reduces overall workflow execution time
  • Takes advantage of available compute resources
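The fan-out pattern for independent tasks looks like this, with `ThreadPoolExecutor` standing in for the Workflow Execution Manager:

```python
# Sketch of parallel fan-out over independent lightweight tasks.
# ThreadPoolExecutor stands in for the Workflow Execution Manager.
from concurrent.futures import ThreadPoolExecutor

def lightweight_model(x):
    return x * x

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(lightweight_model, [1, 2, 3, 4]))

print(results)  # → [1, 4, 9, 16]
```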

Data Transfer

The Data Abstraction Layer optimizes data transfer based on model needs:

  • Small data uses fast cache-based transfer
  • Large data uses efficient file-based transfer
  • Models don't need to manage transfer logistics
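The size-based routing decision reduces to a threshold check. The cutoff value and mechanism names below are illustrative assumptions, not the Data Abstraction Layer's real configuration:

```python
# Sketch of size-based transfer routing: small payloads via a fast cache,
# large ones via files. Threshold and names are hypothetical.
SMALL_PAYLOAD_BYTES = 1_000_000  # assumed cutoff for illustration

def choose_transfer(payload_size_bytes):
    return "cache" if payload_size_bytes <= SMALL_PAYLOAD_BYTES else "file"

print(choose_transfer(4_096))        # → cache
print(choose_transfer(250_000_000))  # → file
```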

Next Steps

Explore individual models:

  • Click on any model in the tables above to see detailed capabilities, use cases, and configuration options

Build your first workflow:

Understand the orchestration engine:

User documentation for Optimal Reality