Glossary

AI security moves fast, and our glossary keeps pace: regularly updated to give you accurate, current definitions. Whether you’re assessing risks, evaluating safeguards, or designing secure AI systems, your definitional guide to AI security starts here.

AI-Assisted Signing

AI-assisted signing is the integration of generative AI directly into the signing workflow, providing the signer with interactive support to interpret, navigate, and understand the contents of a legal document. Unlike traditional e-signing, which is a static process for collecting a signature, AI-assisted signing turns the document into an intelligent interface where the signer can ask questions and receive educational explanations without leaving the secure session.

Copy-Paste Leakage

Copy-paste leakage refers to the unintentional or unauthorized transfer of confidential information from a secure environment to an insecure one, caused by a user manually copying text and pasting it into an external tool. In the context of e-signing, this most commonly occurs when a signer copies contract terms into a public AI service (such as ChatGPT) for help with summarization or translation. The data thereby leaves the organization's control and may end up in the external service's training data or logs.

Counterparty AI Risk

Counterparty AI Risk is the security vulnerability that arises when an external party (the recipient of a document) uses unsanctioned or public artificial intelligence tools to analyze, summarize, or translate confidential information. Unlike traditional internal security threats, Counterparty AI Risk exists outside the sender’s direct technical control, because it is triggered by the recipient's interaction with the document. This creates a critical loophole through which trade secrets, personal data, and legal strategies can inadvertently leak to public AI providers and into their models.

Shadow AI in Contract Workflows

Shadow AI in contract workflows refers to the unauthorized or unsanctioned use of generative artificial intelligence tools by employees to process, summarize, or analyze legal agreements. This behavior typically occurs when individuals seek to increase efficiency—such as by using public Large Language Models (LLMs) to simplify “legalese”—without the knowledge or approval of the organization’s IT or legal departments. The primary risk is that sensitive contract data is moved into public environments, where it may be used for model training or stored without enterprise-grade security controls.

Signing Order

Signing order is a control feature in e-signing that defines the exact sequence in which recipients receive and sign a document. Instead of sending the agreement to all parties simultaneously, a defined signing order creates a structured workflow where the next person in the chain only gains access to the document after the previous person has completed their action. This ensures that internal approval processes and legal hierarchies are automatically enforced.
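The sequencing logic described above can be sketched as a small gating mechanism: each recipient may open the document only after everyone before them in the chain has signed. The class and names below are purely illustrative, not taken from any specific e-signing product:

```python
from dataclasses import dataclass, field

@dataclass
class SigningOrder:
    """Illustrative sketch of a sequential signing workflow."""
    signers: list[str]                       # recipients, in required order
    signed: set[str] = field(default_factory=set)

    def can_access(self, signer: str) -> bool:
        # A signer may open the document only once every earlier
        # signer in the chain has completed their signature.
        position = self.signers.index(signer)
        return all(s in self.signed for s in self.signers[:position])

    def sign(self, signer: str) -> None:
        if not self.can_access(signer):
            raise PermissionError(f"{signer} must wait for earlier signers")
        self.signed.add(signer)

order = SigningOrder(["legal@acme.com", "ceo@acme.com", "client@example.com"])
order.sign("legal@acme.com")
print(order.can_access("client@example.com"))  # False: the CEO has not signed yet
order.sign("ceo@acme.com")
print(order.can_access("client@example.com"))  # True: the full chain before the client is complete
```

Real platforms enforce this server-side, typically by withholding the document link until the preceding signature event is recorded; the sketch only shows the ordering rule itself.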

Tenant-isolated Signing Environment

A tenant-isolated signing environment is a cloud security architecture in which each customer’s data, documents, and AI interactions are physically or logically separated from those of every other customer. Unlike standard multi-tenant platforms where data may share processing resources, an isolated environment ensures that sensitive contract data, and any dialogue with a private AI signing assistant, remain within a dedicated "sandbox." This architecture is a primary defense against Counterparty AI Risk, and it typically comes with a no-training guarantee that prevents corporate data from being used to improve global AI models.
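At its simplest, the logical-isolation part of this architecture means every storage lookup is scoped to the caller's tenant, so one customer's documents can never resolve for another. The minimal sketch below illustrates only that scoping rule; the class name is hypothetical, and real deployments add per-tenant encryption keys, separate databases or infrastructure, and contractual no-training guarantees that code alone cannot express:

```python
class TenantIsolatedStore:
    """Illustrative sketch: each tenant gets its own partition,
    and lookups are always scoped to the caller's tenant."""

    def __init__(self) -> None:
        # tenant id -> (document id -> document bytes)
        self._partitions: dict[str, dict[str, bytes]] = {}

    def put(self, tenant_id: str, doc_id: str, blob: bytes) -> None:
        self._partitions.setdefault(tenant_id, {})[doc_id] = blob

    def get(self, tenant_id: str, doc_id: str) -> bytes:
        # A document id from another tenant simply does not exist
        # inside this tenant's partition.
        return self._partitions[tenant_id][doc_id]

store = TenantIsolatedStore()
store.put("acme", "nda-1", b"confidential contract text")
print(store.get("acme", "nda-1") == b"confidential contract text")  # True
try:
    store.get("globex", "nda-1")   # different tenant: raises KeyError
except KeyError:
    print("isolated")  # printed: the other tenant cannot see the document
```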
