CIA triad

The "CIA triad" of confidentiality, integrity, and availability is at the heart of information security. The concept was introduced in the Anderson Report in 1972 and later repeated in The Protection of Information in Computer Systems. The abbreviation was coined by Steve Lipner around 1986. Debate continues about whether this triad is sufficient to address rapidly changing technology and business requirements, with recommendations to consider expanding on the intersections between availability and confidentiality, as well as the relationship between security and privacy. It has been pointed out that issues such as non-repudiation do not fit well within the three core concepts.
Confidentiality

In information security, confidentiality "is the property, that information is not made available or disclosed to unauthorized individuals, entities, or processes." While similar to "privacy," the two words are not interchangeable. Rather, confidentiality is a component of privacy that is implemented to protect data from unauthorized viewers. Examples of confidentiality of electronic data being compromised include laptop theft, password theft, and sensitive emails being sent to the wrong recipients.
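As an illustration (not drawn from the source), confidentiality of data is typically enforced with encryption, so that only a holder of the key can read the content. A minimal sketch using a one-time pad, the simplest encryption scheme, with a hypothetical message:

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR each plaintext byte with a key byte (one-time pad).

    For real secrecy the key must be random, as long as the
    message, and never reused; this is a teaching sketch only.
    """
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption is the same XOR operation applied a second time.
message = b"quarterly results: confidential"   # hypothetical data
key = secrets.token_bytes(len(message))        # random key, same length
ciphertext = otp_encrypt(message, key)

assert ciphertext != message                    # unreadable without the key
assert otp_encrypt(ciphertext, key) == message  # key holder recovers it
```

A stolen laptop or intercepted email then exposes only the ciphertext; confidentiality is lost only if the key leaks as well.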
Integrity

In IT security, data integrity means maintaining and assuring the accuracy and completeness of data over its entire lifecycle. This means that data cannot be modified in an unauthorized or undetected manner. This is not the same thing as referential integrity in databases, although it can be viewed as a special case of consistency as understood in the classic ACID model of transaction processing. Information security systems typically incorporate controls to ensure their own integrity, in particular protecting the kernel or core functions against both deliberate and accidental threats. Multi-purpose and multi-user computer systems aim to compartmentalize data and processing such that no user or process can adversely impact another; the controls may not succeed, however, as seen in incidents such as malware infections, hacks, data theft, fraud, and privacy breaches. More broadly, integrity is an information security principle that involves human/social, process, and commercial integrity, as well as data integrity. As such it touches on aspects such as credibility, consistency, truthfulness, completeness, accuracy, timeliness, and assurance.
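To make "modified in an unauthorized or undetected manner" concrete, one widely used control (an illustrative sketch, not prescribed by the source) is a keyed message authentication code: a verifier holding the shared key recomputes a tag over the data, and any change to the data makes the tags disagree.

```python
import hmac
import hashlib

def tag_data(key: bytes, data: bytes) -> bytes:
    """Compute an HMAC-SHA256 tag over the data."""
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(key: bytes, data: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(tag_data(key, data), tag)

key = b"shared-secret-key"     # hypothetical shared key
record = b"balance=1000"       # hypothetical stored record
tag = tag_data(key, record)

assert verify(key, record, tag)               # untouched data passes
assert not verify(key, b"balance=9000", tag)  # tampering is detected
```

Note that this detects unauthorized modification but does not prevent it; prevention requires access controls on the data itself.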
Availability

For any information system to serve its purpose, the information must be available when it is needed. This means the computing systems used to store and process the information, the security controls used to protect it, and the communication channels used to access it must all be functioning correctly. High availability systems aim to remain available at all times, preventing service disruptions due to power outages, hardware failures, and system upgrades. Ensuring availability also involves preventing denial-of-service attacks, such as a flood of incoming messages that forces the target system to shut down. In the realm of information security, availability is often viewed as one of the most important parts of a successful information security program. Ultimately, end users need to be able to perform their job functions; by ensuring availability, an organization can perform to the standards its stakeholders expect. This can involve topics such as proxy configurations, outside web access, the ability to access shared drives, and the ability to send emails. A successful information security team requires many different key roles to mesh and align for the CIA triad to be provided effectively.
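One common defense against the message-flood style of denial of service described above (an illustrative technique, not named in the source) is rate limiting: excess requests are rejected cheaply so legitimate traffic keeps flowing. A minimal token-bucket sketch with hypothetical limits:

```python
import time

class TokenBucket:
    """Token-bucket rate limiter: requests beyond the refill rate
    are rejected instead of overwhelming the service."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # tokens added per second
        self.capacity = capacity    # maximum burst size
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(rate=5, capacity=10)      # hypothetical limits
results = [bucket.allow() for _ in range(20)]  # simulated burst of 20 requests

assert results[:10] == [True] * 10  # burst up to capacity is served
assert results.count(False) >= 9    # the flood beyond that is shed
```

Real deployments apply the same idea per client (per IP address or API key) at a load balancer or firewall, so one flooding source cannot exhaust capacity for everyone.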
Additional security goals

In addition to the classic CIA triad of security goals, some organizations may want to include security goals like authenticity, accountability, non-repudiation, and reliability.
Non-repudiation

In law, non-repudiation implies one's intention to fulfill one's obligations under a contract. It also implies that one party to a transaction cannot deny having received a transaction, nor can the other party deny having sent a transaction. It is important to note that while technology such as cryptographic systems can assist in non-repudiation efforts, the concept is at its core a legal concept transcending the realm of technology. It is not, for instance, sufficient to show that the message matches a digital signature signed with the sender's private key, and that therefore only the sender could have sent the message and nobody else could have altered it in transit (data integrity). The alleged sender could in return demonstrate that the digital signature algorithm is vulnerable or flawed, or allege or prove that their signing key has been compromised. The fault for these violations may or may not lie with the sender, and such assertions may or may not relieve the sender of liability, but the assertion would invalidate the claim that the signature necessarily proves authenticity and integrity. As such, the sender may repudiate the message (because authenticity and integrity are prerequisites for non-repudiation).
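The technical half of that argument can be sketched with textbook RSA (a toy illustration, not from the source: the primes, exponent, and messages are assumptions, and real systems use much larger keys plus padding). A verifier holding only the public key (n, e) checks that a signature matches the message; the legal questions above begin only after this check passes.

```python
import hashlib

# Toy RSA parameters: Mersenne primes 2^89-1 and 2^107-1, chosen so the
# example is self-contained. Far too small and unpadded for real use.
p = 2**89 - 1
q = 2**107 - 1
n = p * q                          # public modulus
e = 65537                          # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent (kept secret)

def sign(message: bytes) -> int:
    """Sign the hash of the message with the private key."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(h, d, n)

def verify(message: bytes, signature: int) -> bool:
    """Anyone holding only (n, e) can check the signature."""
    h = int.from_bytes(hashlib.sha256(message).digest(), "big") % n
    return pow(signature, e, n) == h

msg = b"I agree to pay 100 euros."   # hypothetical contract text
sig = sign(msg)

assert verify(msg, sig)                            # signature checks out
assert not verify(b"I agree to pay 1 euro.", sig)  # altered message fails
```

The verification shows only that the private key corresponding to (n, e) produced the signature; whether the alleged sender actually controlled that key at signing time is exactly the question the cryptography cannot settle.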
Other models

First published in 1992 and revised in 2002, the OECD's Guidelines for the Security of Information Systems and Networks proposed nine generally accepted principles: awareness, responsibility, response, ethics, democracy, risk assessment, security design and implementation, security management, and reassessment. Building upon those, in 2004 the NIST published its Engineering Principles for Information Technology Security. In 2011, The Open Group published the information security management standard O-ISM3. This standard proposed an operational definition of the key concepts of security, with elements called "security objectives", related to access control (9), availability (3), data quality (1), compliance, and technical (4).