The willingness and commitment of business executives to be accountable for the definition and management of the business excellence requirements of their area and related Key Value Indicators (KVI) targets.
Application Programming Interface (API)
Application Programming Interface (API) is a computing interface which defines interactions between multiple software intermediaries. It defines the kinds of calls or requests that can be made, how to make them, the data formats that should be used, the conventions to follow, etc. It can also provide extension mechanisms so that users can extend existing functionality in various ways and to varying degrees.
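As an illustrative sketch only (the class and method names here are hypothetical, not from any real library), an API can be expressed in Python as a class whose public methods define the calls that can be made, how to make them, and the data formats returned:

```python
class TemperatureAPI:
    """A hypothetical API: its public methods define the allowed calls,
    the arguments they accept and the data formats they return."""

    def __init__(self):
        self._readings = {}  # internal state, hidden from callers

    def record(self, city: str, celsius: float) -> None:
        """Call: store one reading. Expected formats: a city name and a float."""
        self._readings.setdefault(city, []).append(celsius)

    def average(self, city: str) -> float:
        """Call: return the mean of all readings recorded for a city."""
        values = self._readings[city]
        return sum(values) / len(values)

# A client uses only the interface, never the internal state.
api = TemperatureAPI()
api.record("Geneva", 10.0)
api.record("Geneva", 14.0)
print(api.average("Geneva"))  # → 12.0
```

The point of the interface is that the client depends only on the documented calls, so the internal representation can change without breaking callers.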
Artificial intelligence (AI, also machine intelligence, MI) is intelligence demonstrated by machines, in contrast to the natural intelligence (NI) displayed by humans and other animals. In computer science AI research is defined as the study of “intelligent agents”: any device that perceives its environment and takes actions that maximise its chance of successfully achieving its goals. Colloquially, the term “Artificial intelligence” is applied when a machine mimics “cognitive” functions that humans associate with other human minds, such as “learning” and “problem solving”.
An Attribute is a type or characteristic of an entity (e.g. “gender” is an Attribute of the entity “patient”). An entity typically has many attributes.
Augmented cognition is an interdisciplinary area of psychology and engineering, attracting researchers from the more traditional fields of human-computer interaction, psychology, ergonomics and neuroscience.
Business Excellence is the discipline of maintaining an absolute state of conformity to all criteria necessary to create an ultimate value, reconciling environmental, social equity and economic dimensions. Business Excellence can be achieved through the adherence of multi-level social units with common objectives to reach and sustain that state through cooperation and sharing.
Business Excellence Requirement (BER) - DEMS
The Business Excellence Requirement (BER) represents the backbone of the Data Excellence framework. It is a prerequisite, business rule, standard, policy or best practice that business processes, transactions and data should comply with in order for business goals to be executed flawlessly and generate value.
Business Glossary is a means of sharing internal vocabulary within an organisation.
Business intelligence comprises the strategies and technologies used by enterprises for the data analysis of business information. BI technologies provide historical, current, and predictive views of business operations.
Management (or managing) is the administration of an organisation, whether it is a business, a not-for-profit organisation, or government body. Management includes the activities of setting the strategy of an organisation and coordinating the efforts of its employees (or of volunteers) to accomplish its objectives through the application of available resources, such as financial, natural, technological, and human resources. The term “management” may also refer to those people who manage an organisation.
The total set of enterprise business objectives is represented by a chain of intermediate objectives leading the company to achieve its ultimate transactional “rendez-vous” supporting the strategic enterprise goal.
A Business Rule defines or constrains some aspect of the business and always resolves to either true or false. It specifically involves terms, facts and rules.
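The defining property above – a rule always resolves to true or false – can be sketched as a predicate function. The rule itself (discount limits on orders) is a hypothetical example, not taken from the source:

```python
def rule_order_has_valid_discount(order: dict) -> bool:
    """Hypothetical business rule: a discount may not exceed 20%,
    and only orders above 100 qualify for any discount at all.
    Like every business rule, it resolves to True or False."""
    if order["discount"] == 0:
        return True  # no discount applied: trivially compliant
    return order["amount"] > 100 and order["discount"] <= 0.20

print(rule_order_has_valid_discount({"amount": 250, "discount": 0.10}))  # True
print(rule_order_has_valid_discount({"amount": 50, "discount": 0.10}))   # False
```

Because each rule is a boolean predicate, rules can be combined, counted and reported on uniformly across transactions.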
Business transactions are the interactions between businesses and their customers, vendors and others with whom they do business.
In management it has been said that Business Transformation involves making fundamental changes in how business is conducted in order to cope with shifts in the market environment. However, this is a relatively narrow definition that overlooks other reasons and rationales for transformation.
Business Value establishes a standard measure of value used to determine the business worth.
Compliance means conforming with stated requirements. At an organisational level, it is achieved through management processes which identify the applicable requirements (defined for example in laws, regulations, contracts, strategies and policies), assess the state of compliance, assess the risks and potential costs of non-compliance against the projected expenses to achieve compliance, and hence prioritise, fund and initiate any corrective actions deemed necessary.
Computer science is the study of the theory, experimentation, and engineering that form the basis for the design and use of computers. It is the scientific and practical approach to computation and its applications and the systematic study of the feasibility, structure, expression, and mechanisation of the methodical procedures (or algorithms) that underlie the acquisition, representation, processing, storage, communication of, and access to, information. An alternate, more succinct definition of computer science is the study of automating algorithmic processes that scale. A computer scientist specialises in the theory of computation and the design of computational systems.
Conceptual Data Model (CDM) - DEMS
A Conceptual Data Model (CDM) contains objects (entity/relationship) that are described by a list of attributes (“attribute”). Each of these objects is either an entity or a relationship.
Context (language use)
In semiotics, linguistics, sociology and anthropology, context refers to those objects or entities which surround a focal event, in these disciplines typically a communicative event, of some kind.
Contextual intelligence: the ability to understand the limits of our knowledge and to adapt that knowledge to an environment different from the one in which it was developed. (The term is not new; Anthony Mayo and Nitin Nohria have used it in the pages of HBR (Harvard Business Review), and academic references date from the mid-1980s.) Until we acquire and apply this kind of intelligence, the failure rate for cross-border businesses will remain high, our ability to learn from experiments unfolding across the globe will remain limited, and the promise of healthy growth worldwide will remain unfulfilled.
Contextual polarisation – DEMS:
Contextual polarisation is an innovative technique for systematically visualising and organising the DEI results (components) according to context and level.
Controlled Languages (CLs) or Controlled Natural Languages (CNLs) - DEMS
Controlled Languages are subsets of natural languages that are obtained by restricting the grammar and vocabulary in order to reduce or eliminate ambiguity and complexity.
Controlled vocabulary - DEMS
A simple list of terms, definitions and naming conventions. A Controlled vocabulary frequently has some type of oversight process associated with adding or removing data element definitions to ensure consistency.
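A minimal sketch of such a vocabulary with an oversight step, using hypothetical terms and an invented approval convention purely for illustration:

```python
# A controlled vocabulary: terms paired with their agreed definitions.
APPROVED_TERMS = {
    "customer": "A party that purchases goods or services.",
    "vendor": "A party that supplies goods or services.",
}

def add_term(term: str, definition: str, approved_by: str) -> None:
    """Oversight process: a term may only enter the vocabulary
    with a named approver, to keep definitions consistent."""
    if not approved_by:
        raise ValueError("a controlled vocabulary requires an approver")
    APPROVED_TERMS[term] = definition

def is_controlled(term: str) -> bool:
    """Check whether a term belongs to the controlled vocabulary."""
    return term in APPROVED_TERMS

add_term("invoice", "A document requesting payment.", approved_by="data steward")
print(is_controlled("invoice"))  # True
print(is_controlled("bill"))     # False
```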
Corporate governance is the collection of mechanisms, processes and relations by which corporations are controlled and operated.
Customer Relationship Management (CRM)
Customer relationship management is an approach to managing a company’s interaction with current and potential customers. It uses data analysis about customers’ history with a company to improve business relationships with customers, specifically focusing on customer retention and ultimately driving sales growth.
Facts and statistics collected together for reference or analysis. Also the quantities, characters, or symbols on which operations are performed by a computer, which may be stored and transmitted in the form of electrical signals and recorded on magnetic, optical, or mechanical recording media.
A data attribute specifies characteristics of data, such as its location, its length or even its type. The term attribute is also used as a synonym for data element or property.
A data catalog maintains an inventory of data assets through the discovery, description, and organisation of datasets. The catalog provides context to enable data analysts, data scientists, data stewards, and other data consumers to find and understand a relevant dataset for the purpose of extracting business value. (Gartner, 2017)
The data dictionary defines and specifies all data elements in the system. Each element is provided in a tabular format for easy reading.
It includes terms, definitions, naming conventions and one or more representations of the data elements in a computer system. Data dictionaries often define data types, validation checks such as enumerated values, and the formal definitions of each of the enumerated values.
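A minimal sketch of a data dictionary entry with a data type and an enumerated-value validation check, using hypothetical elements ("gender", "age") for illustration:

```python
# Each entry defines a data type and a validation check (enumerated
# or range-constrained values), as a data dictionary would.
DATA_DICTIONARY = {
    "gender": {"type": str, "allowed": {"female", "male", "unknown"}},
    "age":    {"type": int, "allowed": range(0, 130)},
}

def validate(element: str, value) -> bool:
    """Check a value against its data dictionary specification."""
    spec = DATA_DICTIONARY[element]
    return isinstance(value, spec["type"]) and value in spec["allowed"]

print(validate("gender", "female"))  # True
print(validate("age", 250))          # False: outside the allowed range
```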
Data driven is generally used to describe an activity or a process that focuses on data instead of more traditional ways based on experience or intuition.
Data excellence is the set of methods, techniques and tools for achieving sustainable business excellence. It was developed by Dr Walid El Abed in 2007 and is central to his business’s strategy at Global Data Excellence (GDE). Nowadays, it is used in industrial sectors as well as by governments.
The term “data excellence” originally comes from the vision to elevate data’s level of excellence and to emphasise a value-driven approach for enabling business excellence. Since then, data excellence has become an imperative to unlock enterprise potential and to enable sustainable value generation. Thus, data is made visible, enabling it to become an asset for the enterprise to perform business. Sustainable value generation is organised concretely by implementing an enterprise common framework, which enables a paradigm shift across the whole organisation in order to strive for business success.
Data Excellence Continuous process
A guiding principle for a successful implementation of the Data Excellence framework is to avoid designing a process that requires organisational change, and instead to propose a process that is agnostic to your current organisation and fits any type of business.
Data Excellence Dimension - DEMS
The Data Excellence Dimensions represent a key concept of the Data Excellence Framework: seven dynamic views of the Data Excellence Index according to seven specific dimensions, which aggregate to the overall Data Excellence Index.
Data Excellence Maturity Model (DEMM)
The Data Excellence maturity model helps organisations worldwide to move successfully from the early stage of Data Excellence, described as “chaotic”, to the most mature stage, described as “predictive”. At the latter stage, data is utilised as a core enterprise asset. The Data Excellence maturity model is often used to aid understanding of the right projects and initiatives for introducing the discipline and methodology of Data Excellence. It aims at measuring and estimating the compliance level of organisations with Data Excellence best practices.
Data Excellence Methodology
The Data Excellence methodology is based on four value pillars: agility, trust, intelligence and transparency. These are fundamental value pillars to enable Business Excellence sustainability and support economic growth. In the world of big data, this methodology represents an opportunity to enable growth, profit and flawless execution.
– Agility is needed to react to external and internal changes and ensure prompt and successful integration that supports fast business transformations through process harmonisation, acquisitions, mergers, divestitures and reorganisations.
– Trust is associated with the integrity of the data (e.g. the labels on foods must be correct – otherwise trust in the brand is lost). If a financial product promises an incorrect return, buyers will no longer trust the brand.
– Intelligence at all levels of the enterprise leads to better execution, operational efficiency and accurate financial consolidation based on just-in-time quality data from reporting systems and applications (global and operational).
– Transparency is critical to the organisation’s performance as it increases the visibility and the collaboration across and outside the firewall. The enterprise social responsibility will enable sharing data internally within the enterprise and externally with business partners.
Data Excellence Pillars - DEMS
The Data Excellence pillars that support a sustainable Business Excellence:
- Business Excellence Requirements (BER) and Data Excellence Index (DEI)
- Business Impact and Value: Key Value Indicators (KVIs)
- Data Governance Structure (Accountability and Responsibility)
- Continuous Data Excellence Process
- Data Excellence Management System (DEMS)
Data Excellence Measurement Instrument (DEMI) - DEMS
A fundamental deliverable of the Data Excellence Science. Through it, the execution of the data excellence framework enables surgical, multifocal and dynamic governance, linking business management, data management and information technology together and guiding them towards their excellence. The business transaction therefore becomes a key component that enables managing data as a company asset.
The DEMI is composed of two measurement elements measuring the qualitative value and the quantitative value simultaneously for the context under observation. The two measures are linked together through the Business Excellence Requirements in the scope of the excellence context desired for a specific domain.
The qualitative value measure is called the Data Excellence Index (DEI) and the quantitative value measure is called the Key Value Indicator (KVI). The Data Excellence Index (DEI) and the Key Value Indicator (KVI) are key outputs of the Data Excellence Science.
Data Excellence Science
Data Excellence is an emerging discipline created to maximise the sustainable business value of enterprise Data. Indeed, Data Excellence arises from the natural evolution of organisations and society in the information age where Data is the key resource. It emerged from the field of Data Governance whose goal is to produce continually high-quality Data while lowering cost and complexity and supporting risk management and regulatory compliance. Since 2007, the Data Excellence discipline has been introduced and taught at the Research Centre Lucien Tesnière in Natural Language Processing, Franche-Comté University, at the CNAM (Conservatoire National des Arts et Métiers), Paris Dauphine University and finally at the Fribourg University of Law as a philosophical, economic, political and organisational model.
Data governance is a term used on both a macro and a micro level. The former is a political concept and forms part of international relations and Internet governance; the latter is a management concept and forms part of corporate governance.
Data harmonisation is the process of gathering data of varying file formats, naming conventions, and columns, and transforming it into one cohesive data set.
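A minimal sketch of that process, assuming two hypothetical sources whose columns name the same facts differently; a column map transforms both into one cohesive data set:

```python
# Two sources naming the same columns differently.
source_a = [{"cust_name": "Acme", "cntry": "CH"}]
source_b = [{"CustomerName": "Globex", "Country": "FR"}]

# Mapping from each source's column names to the harmonised schema.
COLUMN_MAP = {
    "cust_name": "customer_name", "cntry": "country",
    "CustomerName": "customer_name", "Country": "country",
}

def harmonise(records):
    """Rename each record's columns to the shared naming convention."""
    return [{COLUMN_MAP[k]: v for k, v in row.items()} for row in records]

# One cohesive data set, regardless of origin.
cohesive = harmonise(source_a) + harmonise(source_b)
print(cohesive)
```

Real harmonisation also reconciles file formats and value conventions (units, codes, date formats); the column-renaming step above is only the structural core.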
Data integration is the act of combining data from several sources from different systems to create sets of information for operational and analytical use. Data integration is one of the key elements in the data management process in order to consolidate data sets to ensure that the data is consistent, clean and usable by end users.
A data lake is a system or repository of data stored in its natural/raw format, usually object blobs or files.
Data lineage relates to the life cycle of data: where it comes from and whether it has been moved, in order to analyse which key information is being used for a specific purpose.
Data Management comprises all disciplines related to managing data as a valuable resource.
Data modelling is the act of exploring data-oriented structures. Like other modelling artifacts data models can be used for a variety of purposes, from high-level conceptual models to physical data models.
Data policies: are norms regulating management and publication of research data. They range from recommendations to enforcements. (IFDO, 2019)
Data stewardship - DEMS
Data Stewardship is the willingness and commitment to be accountable/responsible for a set of business excellence requirements for the well-being of the enterprise by operating in service of the global community rather than individual interest.
Data Sharing Sphere (DSS)
The Data Sharing Sphere is a facility that provides dynamic data management and access to all types of data and data sources required for an ecosystem to create value and achieve its mission and goals. Managing data at source and legally respecting data ownership constitute the fundamental principles of the DSS (harmonisation is done at the semantic level for the commons). The person (moral or physical) decides what data to share and with whom. Thus, the Data Sharing Sphere enables communities and persons to share data and data value among themselves while remaining in control of the ownership of their information, deciding what information is shared and with whom, all while complying with the regulatory requirements.
Data steward - DEMS
A person who is responsible for a set of Business Excellence Requirements for the well-being of the enterprise by operating in service of the global community:
- Govern the application of Business Excellence Requirements
- Monitor the monthly DEI-KVI reports
- Lead data correction
- Lead continuous Data Excellence
- Lead data pollution prevention
- Propose suspension of data usage if serious concerns arise
Data quality refers to the state of qualitative or quantitative pieces of information. There are many definitions of data quality, but data is generally considered high quality if it is “fit for intended uses in operations, decision making and planning”.
Data synchronisation is the act of maintaining consistency and uniformity of data between applications and the locations where that data is stored. The purpose of data synchronisation is to ensure that the same data is used across all devices.
The basic principle of data valorisation is to consider data as an asset that can be owned or controlled by a person (physical or moral). Data valorisation is the means of using data to create positive economic value.
Value creation: refers to value (as per data excellence definition) and its realisation.
In computing, a data warehouse, also known as an enterprise data warehouse, is a system used for reporting and data analysis, and is considered a core component of business intelligence. Data Warehouses are central repositories of integrated data from one or more disparate sources.
DEF - Data Excellence Framework
The Data Excellence Framework (DEF) provided by Global Data Excellence (GDE) describes the methodology, processes and roles required to generate business value while improving business processes using data quality and business rules [El Abed, 2009]. The framework supports the creation of a new cultural shift focused on Data Excellence, motivating the broader team and supporting collaboration between the stakeholders.
DEF Methodology - DEMS
The Data Excellence Framework’s (DEF) methodology emphasizes a value driven approach for enabling Business Excellence through Data Excellence. Data Excellence becomes an imperative to unlock the enterprise potential and enable sustainable value generation. Thus, Data is made visible to become a resource for the enterprise to make informed decisions and actions.
DEI - Data Excellence Index - DEMS
The Data Excellence Index (DEI) is the instrument used to measure the compliance of data records with the Business Excellence Requirements:
- The percentage of data records compliant with a collection of Business Excellence Requirements.
- The list of non-compliant records.
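The two outputs above can be sketched in a few lines. The records and the requirement (a mandatory country field) are hypothetical, chosen only to make the computation concrete:

```python
records = [
    {"id": 1, "country": "CH"},
    {"id": 2, "country": ""},      # violates the requirement
    {"id": 3, "country": "FR"},
    {"id": 4, "country": None},    # violates the requirement
]

# Hypothetical Business Excellence Requirement: country must be filled in.
def compliant(record) -> bool:
    return bool(record["country"])

# Output 2: the list of non-compliant records (actionable at record level).
non_compliant = [r for r in records if not compliant(r)]

# Output 1: the percentage of compliant records.
dei = 100 * (len(records) - len(non_compliant)) / len(records)

print(dei)                               # 50.0
print([r["id"] for r in non_compliant])  # [2, 4]
```

Keeping the non-compliant list alongside the percentage is what makes the index actionable rather than merely descriptive.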
The processing layer is machine-generated computer code that runs on the organisational data servers at the source. It automates the queries and the Business Excellence Requirement executions and provides responses to DEMS. The DEMS agent automates dynamic data integration on the fly, without moving data from its place, with the purpose of providing answers and then discarding the integrated view. The DEMS agent runs strictly under the control of the data sources. It has an octopus-like structure, where multiple agents can be deployed in parallel, and suits any organisational data and operational structure. The DEMS agents access data directly in the source, thus avoiding unnecessary duplications or modifications. Wizards are available for easy control of generated algorithms, computer codes and technical logic, and security is enhanced with encrypted communication.
DEO - Data Excellence Object - DEMS
The Data Excellence Object is the collection of high value Data records used across the processes and transactions to run the business.
The Data Excellence Index for a DEO is the result of the collection of Business Excellence Requirements applied to the DEO.
Domain of Excellence – DEMS
A domain is a structured natural language database organised according to the semantic meta model (SMM) and containing the meta information and knowledge related to a subject or a theme.
The Electronic Commerce Code Management Association (ECCMA), is a member based, international not-for-profit association committed to improving data quality through the implementation of international standards. ECCMA is the current project leader for the development of ISO 8000 and ISO 22745, which are the international standards for data quality and the exchange of material and service master data.
ECCMA provides a platform for collaboration amongst subject experts on data quality and data governance around the world to build and maintain global, open standard dictionaries that are used to unambiguously label information. The existence of these dictionaries of labels allows information to be passed from one computer system to another without losing meaning.
End-to-end describes a system or service that runs from beginning to end and delivers a complete functional solution, usually without needing to obtain anything from a third party.
Enterprise Resource Planning (ERP)
Enterprise resource planning is the integrated management of main business processes, often in real time and mediated by software and technology.
Excellence is the maintenance of an absolute state of conformity to all criteria necessary to create an ultimate value, reconciling environmental, social equity and economic dimensions.
The Swiss Financial Market Supervisory Authority is the Swiss government body responsible for financial regulation. This includes the supervision of banks, insurance companies, stock exchanges and securities dealers, as well as other financial intermediaries in Switzerland.
In the context of relational databases, a foreign key is a field (or collection of fields) in one table that uniquely identifies a row of another table or the same table.
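A small sketch of the referential-integrity property a foreign key enforces, using hypothetical tables modelled as plain Python structures (a real database would reject the dangling reference at insert time):

```python
# Parent table: customer primary keys 1 and 2.
customers = {1: "Acme", 2: "Globex"}

# Child table: each order's customer_id is a foreign key into customers.
orders = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 3},  # dangling foreign key: no customer 3
]

def orphans(rows, fk_field, parent_keys):
    """Rows whose foreign key matches no primary key in the parent table."""
    return [r for r in rows if r[fk_field] not in parent_keys]

dangling = orphans(orders, "customer_id", customers.keys())
print(dangling)  # the order referencing the missing customer 3
```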
General Data Protection Regulation (GDPR)
The European Union regulation 2016/679 named the General Data Protection Regulation (on the protection of the natural persons with regards to the processing of personal data and on the free movement of such data, also known as the “GDPR”) imposes the highest compliance standard to date on the protection of personal data to virtually any person or legal entity in the world.
A simple list of terms and their definitions. A glossary focuses on creating a complete list of the terminology of domain-specific terms and acronyms.
Governance describes the overall management approach through which senior executives direct and control the entire organisation, using a combination of management information and hierarchical management control structures. Governance activities ensure that critical management information reaching the executive team is sufficiently complete, accurate and timely to enable appropriate management decision making, and provide the control mechanisms to ensure that strategies, directions and instructions from management are carried out systematically and effectively.
Governance model by DEMS
Most companies still invest in the core business functions, focusing on business process optimisation and cost rationalisation based on past transactions, while Data Excellence proposes a global, systemic and prescriptive data-driven “govern by value” concept.
The Data Excellence Framework involves the absolute “professionalisation” of the Data Excellence governance functions, which is operationalised through collaborative networks where the Data Excellence stewardship plays a pivotal role between business, data management and IT. The Data Excellence governance model establishes the accountability and the responsibility at each organisational level and according to the organisation’s geography in order to maximise the business value of enterprise data.
Govern by Value - DEMS
Govern by value is the leading concept of the Data Excellence Framework (DEF) that enables the new paradigm shift. It supports the creation of a new cultural shift focused on a new concept called Data Excellence.
GRC - Governance Risk management and Compliance
Governance, risk management and compliance or GRC is the umbrella term covering an organisation’s approach across these three areas: governance, risk management, and compliance. The first scholarly research on GRC was published in 2007, where GRC was formally defined as “the integrated collection of capabilities that enable an organisation to reliably achieve objectives, address uncertainty and act with integrity.” The research referred to common “keep the company on track” activities conducted in departments such as internal audit, compliance, risk, legal, finance, IT, HR as well as the lines of business, executive suite and the board itself.
Identifier - DEMS
An Identifier is an attribute of an entity whose values uniquely determine its occurrences.
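The defining property – values that uniquely determine occurrences – can be checked mechanically. The entity and attribute names below are hypothetical, echoing the patient example used earlier in this glossary:

```python
def is_identifier(entities, attribute) -> bool:
    """True when the attribute's values uniquely determine each occurrence,
    i.e. no two entities share a value for that attribute."""
    values = [e[attribute] for e in entities]
    return len(values) == len(set(values))

patients = [
    {"patient_id": "P1", "gender": "female"},
    {"patient_id": "P2", "gender": "female"},
]
print(is_identifier(patients, "patient_id"))  # True: values are unique
print(is_identifier(patients, "gender"))      # False: value repeats
```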
Informational word - DEMS
According to the semantic meta model SMM, an informational word is a lexical item which corresponds to a domain specific informational lexis.
Intellectual capital is the intangible value of a business, covering its people, the value relating to its relationships, and everything that is left when the employees go home, of which intellectual property is but one component.
ISO (whose name was chosen to be identical in all languages, itself an example of standardisation) is an international standards body made up of representatives of the national standards organisations of 165 countries.
Knowledge Management (KM)
Knowledge management is the process of creating, sharing, using and managing the knowledge and information of an organisation. It refers to a multidisciplinary approach to achieve organisational objectives by making the best use of knowledge.
Key Performance Indicator (KPI)
A performance indicator or key performance indicator is a type of performance measurement. KPIs evaluate the success of an organisation or of a particular activity in which it engages.
Key Value Indicator (KVI) - DEMS
A Key Value Indicator is a measurement of the value and impact of the Data Excellence Index on the business operations:
- Value: Related to the collection of Data elements of the Data Excellence Index that comply with the Business Excellence Requirements
- Impact: Related to the collection of defective Data elements of the Data Excellence Index that do not comply with the Business Excellence Requirements
A fundamental KVI represents the key business value and impact affected by a specific Business Excellence Requirement.
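A hedged sketch of the value/impact split described above, assuming hypothetical orders with monetary amounts and a made-up requirement that each order carry a valid delivery address:

```python
# Hypothetical transactional records: a monetary amount per order, and a
# flag for compliance with a made-up BER (valid delivery address).
orders = [
    {"amount": 800, "address_valid": True},
    {"amount": 200, "address_valid": False},
    {"amount": 500, "address_valid": True},
]

# Value: business worth carried by records compliant with the BER.
kvi_value = sum(o["amount"] for o in orders if o["address_valid"])

# Impact: business worth put at risk by defective records.
kvi_impact = sum(o["amount"] for o in orders if not o["address_valid"])

print(kvi_value)   # 1300
print(kvi_impact)  # 200
```

The split makes explicit that the same compliance measurement that drives the DEI also prices the consequence of non-compliance in business terms.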
Lexicology is the part of linguistics which studies words. This may include their nature and function as symbols, their meaning, the relationship of their meaning to epistemology in general, and the rules of their composition from smaller elements (morphemes such as the English -ed marker for past or un- for negation; and phonemes as basic sound units). Lexicology also involves relations between words, which may involve semantics (for example, love vs. affection), derivation (for example, fathom vs. unfathomably), use and sociolinguistic distinctions (for example, flesh vs. meat), and any other issues involved in analysing the whole lexicon of a language.
Linguistics is the scientific study of language, and involves an analysis of language form, language meaning, and language in context.
Master Data Management (MDM)
In business, master data management is a method used to define and manage the critical data of an organisation to provide, with data integration, a single point of reference. The data that is mastered may include reference data (the set of permissible values) and the analytical data that supports decision making.
Measurement instruments - DEMS
The Data Excellence Index (DEI) and the Key Value Indicator (KVI) are key deliverables of the Data Excellence Framework.
The DEI is the instrument used to measure the compliance of Data records with the Business Excellence Requirements.
The DEI results are used to evaluate the value and the impact of Data on business operations and transactions. The BER concept makes the DEI actionable at the record level allowing the finest root cause analysis and surgical Data governance.
The DEI is obtained by the contextual polarisation, which is an innovative technique to visualise and organise systematically the DEI results (components) according to the context and the level.
The KVI is a measurement of the value and impact of the DEI on business operations.
A fundamental KVI represents the key business value and the impact affected by a specific Business Excellence Requirement.
The KVI for any organisational level can be obtained by a contextual polarisation together with the DEI. The KVI is a fundamental deliverable of the Data Excellence Framework. Through it, the framework enables multifocal governance linking business management and Data management. The business transaction is therefore considered as a key component that enables managing Data as a company asset. The DEI becomes the pivot that links the BER, the KVI, and the Data elements.
“Data that provides information about other data”. In other words, it is “data about data.”
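A small sketch of the distinction, with a hypothetical data set and descriptive fields chosen only for illustration:

```python
# The data itself: measurements.
data = [3.2, 4.8, 5.1]

# Metadata: data about that data - it describes, but is not, the readings.
metadata = {
    "description": "daily rainfall",
    "unit": "mm",                      # assumed unit, for illustration
    "record_count": len(data),
    "source": "weather_station_7",     # hypothetical source name
}

print(metadata["record_count"])  # 3
```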
A system that consists of the development, acquisition, maintenance and use of complex systems of communication, particularly the human ability to do so; and a language is any specific example of such a system.
Natural Intelligence expanded Universe System
A framework that does not impact privacy and individual rights.
Operational efficiency is the ability to provide products or services in the most cost-effective manner without compromising quality.
A framework containing the basic assumptions, ways of thinking, and methodology that are commonly accepted by members of a scientific community.
Such a cognitive framework shared by members of any discipline or group.
PEC - Perpetual Excellence Community
Perpetual Excellence Community – a scientific research group from Geneva.
The Perpetual Excellence Community will be created through the adherence of multi-level social units to the objective of maintaining an absolute state of conformity to all criteria necessary to create an ultimate value, reconciling environmental, social equity and economic dimensions.
To define personal data, account must be taken of all the means available to the “data controller” to determine whether a person is identifiable.
Physical Data Model
A Physical Data Model (PDM) is the transformation of a logical data model into a model that can be implemented in a specific relational database management system such as SQL Server, DB2 or Oracle.
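A minimal sketch of the logical-to-physical step, realising a hypothetical "patient" entity (echoing the attribute example earlier in this glossary) as a table in SQLite via Python's standard library:

```python
import sqlite3

# Logical model: entity "patient" with attributes patient_id (identifier),
# name and gender. The DDL below is its physical realisation for SQLite:
# types, a primary key and an enumerated-value constraint.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE patient (
        patient_id TEXT PRIMARY KEY,
        name       TEXT NOT NULL,
        gender     TEXT CHECK (gender IN ('female', 'male', 'unknown'))
    )
""")
conn.execute("INSERT INTO patient VALUES ('P1', 'Ada', 'female')")
rows = conn.execute("SELECT name FROM patient").fetchall()
print(rows)  # [('Ada',)]
```

The same logical model would map to different DDL (types, index syntax) for another RDBMS, which is exactly why the physical model is a separate artefact.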
A pilot plan, product, or system is used to test how good something is before introducing or buying it.
A platform is the environment in which a piece of software is executed. It can be a web browser and associated application programming interfaces, or other underlying software, as long as the program code is executed with it.
Procurement is the process of finding and agreeing to terms, and acquiring goods, services, or works from an external source, often via a tendering or competitive bidding process.
Qualitative value: value related to rules rather than numbers.
Quantitative value: value that can be measured and expressed by numbers.
The willingness and commitment of data managers to be responsible for individual data records, ensuring compliance with Business Excellence Requirements.
Risk management is the set of processes through which management identifies, analyses, and, where necessary, responds appropriately to risks that might adversely affect realisation of the organisation’s business objectives. The response to risks typically depends on their perceived gravity, and involves controlling, avoiding, accepting or transferring them to a third party. Whereas organisations routinely manage a wide range of risks (e.g. technological risks, commercial/financial risks, information security risks etc.), external legal and regulatory compliance risks are arguably the key issue in GRC.
A Semantic Mapping Data Model (SMDM) is a semantic data model mapped to a physical data model. In software engineering, a semantic data model has two related meanings. First, it is a conceptual data model in which semantic information is included: the model describes the meaning of its instances, and is an abstraction that defines how the stored symbols (the instance data) relate to the real world. Second, it is a conceptual data model that includes the capability to express information enabling the parties to an information exchange to interpret meaning (semantics) from the instances, without the need to know the meta-model. Such semantic models are fact-oriented (as opposed to object-oriented).
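One way to picture a semantic-to-physical mapping is as a lookup from business terms to physical column names. The terms and `TABLE.COLUMN` names below are invented for illustration:

```python
# Hypothetical semantic mapping: business terms (the semantic layer)
# resolved to physical columns (the physical data model).
semantic_to_physical = {
    "patient gender": "PATIENT.GENDER_CD",
    "date of birth":  "PATIENT.BIRTH_DT",
    "admission date": "ENCOUNTER.ADMIT_DT",
}

def resolve(term):
    """Return the physical column behind a business term (meaning-first lookup)."""
    key = term.strip().lower()
    if key not in semantic_to_physical:
        raise KeyError(f"no mapping for business term: {term!r}")
    return semantic_to_physical[key]
```

A user who knows only the business meaning ("patient gender") never needs to know the stored symbol (`PATIENT.GENDER_CD`), which is the abstraction the SMDM provides.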
Semantic Meta Model (SMM)
The theoretical basis for the SMM is the semantic modelling of various business rules and queries. For example, any query of a data source, independent of the language in which the query is expressed (natural or formal), is considered as following certain semantic rules that are essential for its understanding. An SMM enables the construction of meta bases which allow, on the one hand, the classifying of lexis appearing in queries and, on the other hand, the creation of links that can occur between the lexis. Using this approach, the influence of a specific natural language is prima facie slight and indeed is contained in a parametric fashion (as data within the meta base).
Semantics is the linguistic and philosophical study of meaning, in language, programming languages, formal logics, and semiotics. It is concerned with the relationship between signifiers—like words, phrases, signs, and symbols—and what they stand for, their denotation.
The semantic spectrum (sometimes referred to as the ontology spectrum or the smart data continuum or semantic precision) is a series of increasingly precise or rather semantically expressive definitions for data elements in knowledge representations, especially for machine use.
Seme, the smallest unit of meaning recognised in semantics, refers to a single characteristic of a sememe. These characteristics are defined according to the differences between sememes.
Semiosis is any form of activity, conduct, or process that involves signs, including the production of meaning.
Sensemaking is the process by which people give meaning to their collective experiences.
Sensemaking (information science)
While sensemaking has been studied by other disciplines under other names for centuries, in information science and computer science the term “sensemaking” has primarily marked two distinct but related topics.
Able to be maintained at length without interruption or weakening.
The value of something such as a quality, attitude, or method is its importance or usefulness. If you place a particular value on something, that is the importance or usefulness you think it has.
In linguistics, “value” is polysemous, i.e., a word that has multiple meanings. Value has two meanings: the first is the “quantitative meaning”: what can be measured and expressed by numbers (e.g., money, energy), and the second is the “Qualitative meaning”: what can be expressed with rules rather than numbers and cannot be quantified by physical measures (e.g., humanity, human rights, policies, rules).
Value Creation refers to value (as per data excellence definition) and its realisation.
Value driven means putting the value (as per data excellence definition) at the center of decision-making framework and business activities. Value becomes the only recognised reference for actions.
The Data Excellence Framework is based on four value pillars that are essential to the survival of any organisation or enterprise in the information age: agility, trust, intelligence and transparency. These characteristics are fundamental value pillars that enable business sustainability and support economic growth.
Verbatim - DEMS
DEMS Verbatim is a multilingual business semantic glossary; its main advantage is its ability to link human semantics to the physical data model while taking the context of usage into consideration. The overall goal of the semantic data model is to capture more of the meaning of data by integrating relational concepts with the more powerful abstraction concepts known from the artificial intelligence field. It helps users understand the business language and the business meaning of data. By delivering accessible and trustworthy data, the business semantic glossary helps users search for, understand, trust and use data to support their business, and allows them to find data easily without encountering conflicting meanings. The business semantic glossary provides data users with a dynamic semantic mapping view of all the business terms by displaying them along with their related data, metadata and data lineage.
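A hypothetical sketch in the spirit of such a glossary: each entry links a business term, its labels in several languages, a definition, and the physical data it maps to. All entry content is invented for illustration:

```python
# Hypothetical multilingual business semantic glossary entry.
glossary = {
    "patient": {
        "labels": {"en": "patient", "fr": "patient", "de": "Patient"},
        "definition": "A person receiving medical care.",
        "physical": ["PATIENT.PATIENT_ID", "PATIENT.NAME"],
    }
}

def lookup(label, lang="en"):
    """Find the glossary entry whose label in `lang` matches, or None."""
    for term, entry in glossary.items():
        if entry["labels"].get(lang, "").lower() == label.lower():
            return term, entry
    return None
```

Because the lookup goes through per-language labels to a single canonical term, users in different languages reach the same definition and the same physical columns, avoiding conflicting meanings.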
In linguistics, a word is the smallest element that can be uttered in isolation with objective or practical meaning.