SecAppDev 2026 lecture details

AI Memory, Mapped

AI memory is not just another RAG plugin; it is a stateful, persistent attack surface. Securing it requires new threat models, new detection primitives, and architectural decisions made well before deployment.

Schedule TBD
Abstract

Persistent memory transforms AI systems from stateless tools into context-aware assistants, but it also introduces a new class of risks. We will cover key risks including continuous exfiltration via prompt injection, delayed tool invocation, and negative psychosocial impacts. The second half focuses on building memory-safe systems by design: threat modeling memory, observability strategies, and runtime safety monitoring at scale (including BinaryShield, a novel privacy-preserving method for sharing textual customer content to detect coordinated spray attacks).

Key takeaway

Treat AI memory as an attack surface; design for safety and observability from day one.

Content level

Deep-dive

Target audience

Developers, architects, researchers

Prerequisites

None

Join us for SecAppDev. You will not regret it!

Grab your seat now
Natalie Isak

ML Engineer, Microsoft

Expertise: AI Safety



SecAppDev offers the most in-depth content you will find in a conference setting
