- Data Controller An individual, organization, or entity that determines the purposes and means of processing personal data. Under privacy laws such as the GDPR, data controllers have specific legal obligations regarding the handling of data, including ensuring its protection and privacy, responding to data subjects' rights requests, and reporting data breaches. They are differentiated from data processors, who process personal data on behalf of a data controller.
- Data Custodian An individual or organization responsible for the maintenance and care of data or data sources. Their duties typically involve implementing technical controls, procedures, and systems that ensure the safety and quality of the data, such as performing regular backups, enforcing access controls, and ensuring data integrity.
- Data Destruction The process of eliminating or erasing data from a storage medium, making it completely unreadable and non-recoverable. Methods include physical destruction, degaussing, and software-based overwriting or encryption, all aimed at preventing unauthorized individuals from retrieving and exploiting sensitive information.
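As a minimal sketch of the software-based overwriting method mentioned above, the hypothetical helper below replaces a file's contents with random bytes before deleting it. Note that on SSDs, journaling filesystems, or systems with backups, overwriting alone is not sufficient for true destruction:

```python
import os

def overwrite_file(path: str, passes: int = 3) -> None:
    """Illustrative single-file overwrite: replace the file's bytes
    with random data for several passes, then delete it.
    (Real destruction must also account for SSD wear leveling,
    filesystem journaling, and backup copies.)"""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # overwrite with random bytes
            f.flush()
            os.fsync(f.fileno())       # force the write to disk
    os.remove(path)
```

Encryption-based destruction ("crypto-shredding") takes a different route: encrypt the data, then destroy only the key.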
- Data Dictionary A centralized repository of information about data, such as its meaning, relationships to other data, origin, usage, and format. It serves as a guide for understanding the structure, content, and context of data sources, thereby helping ensure consistency across different parts of an organization and facilitating effective data management.
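To make the idea concrete, here is a hypothetical fragment of a data dictionary modeled as a mapping from field names to metadata (the field names and metadata keys are illustrative, not from any real standard):

```python
# Hypothetical data-dictionary entries: each field name maps to
# metadata describing its meaning, format, origin, and relationships.
data_dictionary = {
    "customer_id": {
        "description": "Unique identifier for a customer",
        "type": "integer",
        "origin": "CRM system",
        "related_to": ["orders.customer_id"],
    },
    "order_date": {
        "description": "Date the order was placed",
        "type": "date (ISO 8601)",
        "origin": "order entry application",
        "related_to": ["orders.order_id"],
    },
}

def describe(field: str) -> str:
    """Summarize a field's metadata for human readers."""
    meta = data_dictionary[field]
    return f"{field}: {meta['description']} ({meta['type']}, from {meta['origin']})"
```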
- Data Diddling A fraudulent act involving the deliberate alteration of data before or during its entry into a computer system and then changing it back after the processing is complete. This can be done to manipulate the output or results, typically for financial gain or other malicious intent. It's considered a form of cybercrime and is a security risk that organizations need to guard against with appropriate controls and auditing measures.
- Data Discovery Methods A range of processes and tools used to identify, classify, and analyze an organization's data assets. They are crucial for data governance, risk management, and compliance, ensuring that sensitive data is properly handled and protected. Techniques include automated discovery using software to scan storage systems and databases, as well as manual reviews and audits.
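The automated-scanning technique can be sketched with simple pattern matching; the regular expressions below are deliberately simplified examples (real discovery tools use far more robust classifiers and validation):

```python
import re

# Hypothetical, simplified patterns for sensitive-data categories.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),          # US SSN format
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # rough email match
}

def classify(text: str) -> set:
    """Return the set of sensitive-data categories detected in text."""
    return {name for name, pat in PATTERNS.items() if pat.search(text)}
```

A discovery tool would run such classifiers across file shares and database columns, then record the findings for governance review.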
- Data Emanation The unintentional radiation or transmission of electrical signals from electronic equipment, such as a computer or a network device, which can potentially disclose sensitive information. Adversaries can intercept and decode these signals, also known as compromising emanations, to gain unauthorized access to the information, making mitigation techniques such as shielding, signal jamming, or the use of secure communication protocols essential.
- Data Encryption Standard (DES) A symmetric-key algorithm for encrypting electronic data. Developed in the 1970s and once widely used, DES encrypts data in 64-bit blocks using a 56-bit key. It was eventually found to be vulnerable to brute-force attacks and has been largely replaced by more secure standards like the Advanced Encryption Standard (AES). However, DES was pivotal in the development and study of modern encryption techniques.
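The key sizes above explain why DES fell to brute force. A quick back-of-the-envelope comparison (the one-billion-keys-per-second rate is an assumption for illustration):

```python
# DES uses a 56-bit key; AES's smallest key size is 128 bits.
des_keys = 2 ** 56
aes_keys = 2 ** 128

# Assume an attacker who tries one billion keys per second:
seconds = des_keys / 1e9
years = seconds / (60 * 60 * 24 * 365)  # roughly 2.3 years for all of DES

# The AES-128 key space is 2**72 times larger, which is why the same
# exhaustive attack is considered infeasible against AES.
ratio = aes_keys // des_keys
```

In practice, dedicated hardware (such as the EFF's 1998 "Deep Crack" machine) recovered DES keys in days, which drove the migration to 3DES and then AES.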
- Data Execution Prevention (DEP) A security feature included in most modern operating systems. Its primary function is to help prevent damage from viruses and other security threats by monitoring programs to ensure they use system memory safely. When DEP detects a program using memory incorrectly, it closes the program and notifies the user, thus helping to limit the impact of both malicious and unintentionally harmful software.
- Data Farm A colloquial term that describes a large-scale data storage facility or a collection of servers that work together to store, manage, and process vast amounts of data. Similar to a server farm or data center, a data farm provides the infrastructure necessary to support big data applications, cloud computing services, and extensive databases. Data farms are designed for reliability, scalability, and high availability to ensure ongoing access to critical data resources. They play a foundational role in supporting the storage needs of modern enterprises and the processing requirements of complex analytical tasks.
Disclaimer: The glossary is for informational purposes only; we are not liable for any errors or omissions. If you find errors, please contact us.