//Page en:safeav:handson:exercises, created 2026/04/24 10:00 by raivo.sell; current revision 2026/04/24 10:08.//
| + | ====== Exercises ====== | ||
| + | |||
| + | ===== Autonomous Systems ===== | ||
| + | |||
**Productization Lessons and Assessments:**

Key lessons for productization include:

  - Engineers must understand that their products operate inside a governance structure consisting of laws, regulations, and standards.
  - In the case of autonomy, many historical standards exist, but new standards are still under active development.
  - A key aspect of product design is the expectation function for the product, which is central to communication from both a marketing and a legal liability perspective.

| + | |||
| + | Exercises | ||
| + | |||
| + | ^ Section ^ Project Title ^ Objective ^ Technical Scope ^ Deliverables ^ Learning Outcomes ^ | ||
| + | | 2.0 Autonomous Systems Fundamentals | Cross-Domain Autonomy Architecture Design | Understand how autonomy architectures differ across ground, airborne, marine, and space domains. | Define sensing, compute, control, and communication architecture for one system in each domain; analyze environmental constraints and failure modes. | Architecture diagrams (5–10 page report). | Understand how environment drives autonomy architecture, | ||
| + | | 2.1 Definitions, | ||
| + | | 2.2 Legal, Ethical, and Regulatory Frameworks | Autonomous System Liability Case Study | Understand relationship between validation, expectation functions, and legal liability. | Analyze a historical accident scenario; determine liability; evaluate compliance with ISO, SAE, FAA, or NASA frameworks. | Legal liability analysis report; governance compliance evaluation. | Understand how governance frameworks assign responsibility and require validation evidence. | | ||
| + | | 2.3 Introduction to Validation and Verification | Operational Design Domain (ODD) and V&V Development | Learn how to construct a high-level validation plan for an autonomous system. | Define ODD; generate validation scenarios; define correctness criteria; develop validation workflow including simulation and physical tests. | Complete high-level V&V plan document; ODD, coverage, and correctness criteria. | Understand structure of validation plans and role of ODD, coverage, and correctness criteria. | | ||
| + | | 2.4 Physics-Based vs Decision-Based Validation | Comparative Validation of Deterministic vs AI Systems | Understand validation complexity differences between physics-based and AI-based systems. | Construct a V&V plan for a physics-based function and also for a digital function. | Comparative report on testing methodologies. | Understand fundamental differences between validating physics-based and AI-based systems. | | ||
| + | | 2.5 Validation Requirements Across Domains | Domain-Specific Validation Design | Learn how validation requirements differ across ground, airborne, marine, and space domains. | Select domain; define hazards, validation methods, certification requirements, | ||
| + | |||
===== Hardware and Sensing Technologies =====

**Assessment:**

^ # ^ Assessment Theme ^ Learning Objective ^ Deliverable ^
| 1 | Evolution of Electronics in Autonomy | Understand how semiconductors and electronics transformed ground, airborne, marine, and space systems from isolated functions into integrated autonomous architectures. | Paper: comparative essay, or Project: presentation/ |
| 2 | Sensor Fusion Design | Explain why autonomous systems require multiple complementary sensors and how sensing choices depend on mission, environment, | |
| 3 | Safety and Governance | Analyze how standards and governance frameworks shape hardware design, certification, | |
| 4 | Validation and Verification | Evaluate how validation, timing, KPIs, scenario-based testing, and simulation contribute to trustworthy autonomy validation beyond simple model-level accuracy. | Paper: methodology critique, or Project: create a validation plan with KPIs, scenarios, and simulation/ |
| 5 | Supply Chain and Productization | Understand how supply chain resilience, certification burden, EMI/EMC compliance, cybersecurity, |

===== Software Systems and Middleware =====

Assessment:

^ # ^ Assessment Title ^ Description (Project / Report) ^ Learning Objectives ^
| 1 | Evolution of Programmable Systems | Write a report tracing the evolution from fixed-function hardware to programmable systems (configuration, | |
| 2 | Cyber-Physical Software Stack Analysis | Develop a structured report analyzing a real-world CPS (e.g., automotive ADAS, UAV, or spacecraft). Map its software stack (HAL, RTOS, middleware, applications) and explain how each layer contributes to overall system functionality. | Identify layers in CPS software architectures. Explain the role of RTOS, middleware, and HAL. Analyze real-time and safety constraints in system design. |
| 3 | IT vs CPS Supply Chain Comparison Study | Produce a comparative analysis of hardware and software supply chains in IT vs CPS, with focus on lifecycle management, dependencies, | |
| 4 | Safety Verification and Validation Framework | Write a report comparing software validation approaches in IT and CPS, focusing on simulation/ | |
| 5 | Software-Defined System Proposal | Develop a conceptual design for a “software-defined” system (e.g., vehicle, drone, or marine system). Describe architecture, | |

===== Perception, Mapping and Localisation =====

^ # ^ Project Title ^ Description ^ Learning Objectives ^
| 1 | Multi-Sensor Perception Benchmarking | Build a perception pipeline using at least two sensor modalities (e.g., camera + LiDAR or radar). Evaluate object detection performance under varying conditions (lighting, weather, occlusion) using real or simulated datasets. | Understand strengths/weaknesses |
| 2 | ODD-Driven Scenario Generation & Validation Study | Define an Operational Design Domain (ODD) for an autonomous system (e.g., urban driving, coastal navigation). Generate a set of test scenarios (including edge cases) and validate system performance using simulation tools. | Define and scope an ODD. Develop scenario-based testing strategies. Understand coverage and edge-case generation. Link scenarios to safety outcomes. |
| 3 | Sensor Failure and Degradation Analysis | Simulate sensor failures (e.g., camera blackout, GNSS loss, radar noise) and analyze system-level impact on perception, localization, | |
| 4 | AI vs Conventional Algorithm Validation Study | Compare a traditional perception algorithm (e.g., rule-based or classical ML) with a deep learning model on the same dataset. Analyze differences in performance, | |
| 5 | End-to-End V&V Framework Design (Digital Twin) | Design a validation framework for perception, mapping, and localization using simulation (digital twin). Include KPIs, test conditions (e.g., ISO 26262, SOTIF), simulations, | |

===== Control, Planning, and Decision-Making =====

Assessments:

^ # ^ Project Title ^ Description ^ Learning Objectives ^
| 1 | Classical vs AI Control Benchmark Study | Implement and compare a classical controller (e.g., PID or LQR) with an AI-based controller (e.g., reinforcement learning) for a simplified vehicle model in simulation. Evaluate performance under nominal and disturbed conditions. | * Understand differences between model-based and data-driven control \\ * Analyze stability, robustness, and interpretability trade-offs \\ * Evaluate controller performance under uncertainty and disturbances |
| 2 | Behavioral & Motion Planning Stack Design | Design a hierarchical autonomy stack that includes a behavioral layer (FSM or behavior tree) and a motion planner (A*, RRT*, or MPC). Apply it to a scenario such as lane change or obstacle avoidance. | * Distinguish between behavioral decision-making and motion planning \\ * Implement planning algorithms under constraints \\ * Understand integration between perception, planning, and control |
| 3 | Scenario-Based Validation Framework | Develop a scenario-based testing framework using parameterized scenarios (e.g., varying speeds, distances, agent behaviors). Use a simulator to evaluate planning/ | |
| 4 | Digital Twin & Multi-Fidelity Simulation Study | Build a simplified digital twin of a vehicle and environment. Perform validation using both low-fidelity and high-fidelity simulation setups, comparing results and identifying discrepancies. | * Understand the role of digital twins in V&V \\ * Analyze trade-offs between simulation fidelity and scalability \\ * Quantify sim-to-real gaps and their implications |
| 5 | Formal Methods for Safety Validation | Define safety requirements using a formal specification approach (e.g., temporal logic or rule-based constraints). Apply these to simulation traces and identify violations or edge cases. | * Translate safety requirements into formal, testable properties \\ * Use formal methods for falsification and validation \\ * Understand limitations of simulation without formal rigor |

===== Human-Machine Communication =====

===== Autonomy Validation Tools =====