Centralpoint Glossary

  • Training

    No, not the type that goes on in a gym. Employee training, that is. MDM is not just about software; it’s about the people using the software, so they need to know how to use it well in order to maximize the Return on Investment (ROI). MDM users will have to receive training from the MDM vendor, from consultants, or from employees who already have experience with the solution.
  • User Interface (UI)

    The part of the machine that handles the human–machine interaction. In an MDM solution—and in all other software solutions—users have an "entrance," an interface from which they interact with and operate the solution. As is the case for all UIs, the UI in an MDM solution needs to be user-friendly and intuitive.
  • Vendor

    There are many Master Data Management vendors on the market. How do you choose the right one? It all depends on your business needs, as each vendor is often specialized in some areas of MDM more than others. However, there are some things you generally should be aware of, such as scalability (Is the system expandable in order to grow with your business?), proven success (Does the vendor have solid references confirming the business value?) and integration (Does the solution integrate with the systems you need?).
  • Warehouse

    A data warehouse—or EDW (Enterprise Data Warehouse)—is a central repository for corporate information and data derived from operational systems and external data sources, used to generate analytics and insight. In contrast to the data lake, a data warehouse stores vast amounts of typically structured data that is predefined before entering the data warehouse. The data warehouse is not a replacement for Master Data Management, as MDM can support the EDW by feeding reliable, high-quality data into the system.
  • Warm standby

    Warm standby is a redundant method in which the secondary (i.e., standby) server runs and receives replicated data from the primary server. Since the data is replicated to the standby server at regular intervals, there are times when both the servers do not contain the exact same data.
  • What is the Golden Record?

    In the MDM world, also sometimes referred to as "the single version of the truth." This is the state you want your master data to be in and what every MDM solution is working toward creating.
  • What is the Source of Truth?

    In the MDM world, also sometimes referred to as "the single version of the truth." This is the state you want your master data to be in and what every MDM solution is working toward creating.
  • Workflow automation

    An essential functionality in an MDM solution is the ability to set up workflows, a series of automated actions for steps in a business process. Preconfigured workflows in an MDM solution generate tasks, which are presented to the relevant business users. For instance, a workflow automation is able to notify the data steward of data errors and guide them through fixing the problem.
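    A minimal sketch of this idea, with invented names and a single made-up completeness rule (not any particular vendor's API):

```python
# Sketch of workflow automation: a preconfigured rule scans records
# and generates tasks for the relevant business user. All names here
# are illustrative, not a real MDM product's API.

def find_errors(records):
    """Flag records that violate a simple completeness rule."""
    return [r for r in records if not r.get("email")]

def generate_tasks(records):
    """Turn each flagged record into a task for the data steward."""
    return [
        {"assignee": "data_steward", "action": "fix_missing_email", "record_id": r["id"]}
        for r in find_errors(records)
    ]

records = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},  # missing email triggers a task
]
tasks = generate_tasks(records)
```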
  • Workgroup

    Group of users in a company who share the same Content Store.
  • Workgroup configuration

    Set of business rules that define the workflow for the DITA and non-DITA documents in the Content Store. This configuration specifies the sequence of states that the documents must follow as well as access rights, user group definitions, dictionaries, etc.
  • Yottabyte

    Largest commonly named data storage unit (i.e., 10^24 bytes, or 1,000,000,000,000,000,000,000,000 bytes). No Master Data Management solution, or any other data storage solution, can handle this amount yet, but scalability should be a considerable factor when choosing an MDM solution.
  • A Data Pool

    A data pool is a centralized repository of data where trading partners (e.g., retailers, distributors or suppliers) can obtain, maintain and exchange information about products in a standard format. Suppliers can, for instance, upload data to a data pool that cooperating retailers can then receive through their data pool.
  • Analytics

    The discovery of meaningful patterns in data. For businesses, data analytics is used to gain insight and thereby optimize processes and business strategies. Master Data Management can support analytics by providing organized master data as the basis of the analysis, or by linking trusted master data to new types of information output from analytics.
  • Application Data Management (ADM)

    The management and governance of the application data required to operate a specific business application. ADM performs a similar role to MDM, but on a much smaller scale as it only enables the management of data used by a single application.
  • Application Programming Interface (API)

    An integrated part of most software, such as applications and operating systems, that allows one piece of software to interact with other types of software. In Master Data Management, not all functions can necessarily be handled in the software platform itself. For instance, you want to be able to deliver or receive data to or from external systems and applications. By using APIs built into the software, you can do that and thereby expand the functionality of your MDM solution.
  • Assets

    In the MDM lingo, an asset can be understood in slightly different ways. There’s the term "data as an asset," where asset is defined as something that can be "owned" or "controlled" to produce value. Here we talk about a way of perceiving something as an asset. But, when you hear about asset management and enterprise assets in conjunction with MDM, an asset is a more tangible thing of which the management can be optimized. Assets can be physical (people, buildings, parts, computers) and digital (data, images and other digital files).
  • Attributes

    In MDM, an attribute is a specification or characteristic that helps define an entity. For instance, a product can have several attributes, such as color, material, size and components. MDM supports the management of product data, including related attribute data.
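    As a rough illustration, an entity and its attributes might be modeled like this (the `Product` class and its fields are hypothetical, not taken from any MDM product):

```python
from dataclasses import dataclass, field

# Illustrative model of an entity carrying attribute data;
# the class and field names are invented for this sketch.

@dataclass
class Product:
    sku: str
    attributes: dict = field(default_factory=dict)

chair = Product(
    sku="CH-100",
    attributes={"color": "oak", "material": "wood", "size": "M"},
)
```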
  • B2B, B2C, B2B2C

    Whether you operate as a Business-to-Business company, Business-to-Consumer company or any combination, Master Data Management can be applicable if you deal with large amounts of master data about, for instance, products, customers, assets, locations or employees.
  • Bill Of Materials (BOM)

    A list of the parts or components that are required to build a product.
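    A BOM is naturally recursive: an assembly can itself contain sub-assemblies. A small sketch (the bicycle data is invented) that flattens a nested BOM into total part counts:

```python
# A BOM as a nested mapping: each assembly lists (part, quantity) pairs.
# Flattening it yields total counts of leaf parts. Data is illustrative.

BOM = {
    "bicycle": [("wheel", 2), ("frame", 1)],
    "wheel": [("rim", 1), ("spoke", 32)],
}

def flatten(item, qty=1, totals=None):
    totals = {} if totals is None else totals
    for part, n in BOM.get(item, []):
        if part in BOM:                      # sub-assembly: recurse
            flatten(part, qty * n, totals)
        else:                                # leaf part: accumulate
            totals[part] = totals.get(part, 0) + qty * n
    return totals
```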
  • Business Intelligence (BI)

    Business Intelligence is a type of analytics. It entails strategies and technologies that help organizations understand their operations, customers, financials, product performance and a number of other key business measurements. MDM supports BI efforts by feeding the BI solution with trusted master data.
  • Business rules

    Business rules are conditions or actions set up in your MDM solution that allow for the modification of your data. According to your business rules, you can determine how your data is organized, categorized, enriched and managed. Business rules are typically used in workflows.
  • Change Management

    The preparation and support of individuals, teams and organizations in making organizational change. A necessity in any MDM implementation if you want to maximize the ROI, as it is very much about changing processes and mindsets.
  • Cleansing

    As in data cleansing. The process of identifying, removing and/or correcting inaccurate data records (e.g., by deduplicating data). Data cleansing eliminates the problems of useless data to ensure quality and consistency throughout the enterprise, and is an integral process of any decent Master Data Management process.
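    A toy cleansing pass, assuming a simple record shape with `name` and `email` fields (the rules shown are illustrative and far simpler than a real MDM cleansing engine):

```python
# Minimal data cleansing sketch: trim whitespace, normalize case,
# and drop records that fail a basic validity check.

def cleanse(records):
    cleaned = []
    for r in records:
        name = r.get("name", "").strip().title()
        email = r.get("email", "").strip().lower()
        if name and "@" in email:            # correct what we can, reject the rest
            cleaned.append({"name": name, "email": email})
    return cleaned

raw = [
    {"name": "  alice SMITH ", "email": " Alice@Example.COM "},
    {"name": "", "email": "broken"},         # invalid record: removed
]
clean = cleanse(raw)
```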
  • Cloud

    MDM solutions come in many variations, and a central question of today is whether to host it on-premises or in the cloud (or a mix, called a hybrid). Cloud MDM is slowly on the rise, and many vendors offer the possibility to host in the cloud, but still the majority of companies choose an on-premises solution due to security concerns. With a hosted cloud solution, typically run on Amazon Web Services, Microsoft Azure or Google Cloud, organizations don’t have to install, configure, maintain and host the hardware and software themselves.
  • Cold standby

    Cold standby is a redundancy method in which the secondary (or standby) system is turned on manually when the primary system fails.
  • Communication

    Something you don’t want to forget in the implementation of an MDM solution. It’s important that the whole company is made aware of what MDM is, what value it brings, and what it means for everyone on a daily basis. That’s the only way people will commit to it.
  • Content Store

    Stores all the DITA content (maps, topics), non-DITA content (images, PDF files, videos, etc.), and workgroup configuration for the DITA CMS.
  • Contextual

    As in contextual Master Data Management, sometimes known as situational MDM (ref. Gartner Hype Cycle). It refers to the management of changeable master data as opposed to traditional, more static, master data. As products and services get more complex and personalized, so does the data, making the management of it equally complex. Dynamic, contextual Master Data Management is forecast to be one of the next big trends in the MDM world.
  • Customer Data Integration (CDI)

    The process of combining customer information acquired from internal and external sources to generate a consolidated customer view. CDI is often considered a subset of MDM for customer data.
  • Customer Data Platform (CDP)

    A marketing system that unifies a company’s customer data from marketing and other channels to optimize the timing and targeting of messages and offers. An MDM platform supports a CDP by linking the CDP data to other master data, such as product and supplier data, maximizing the potential of the data.
  • Customer Master Data Management (CMDM)

    Also sometimes referred to as MDM of customer data. The aim is to get one single and accurate set of data on each of your business customers—the so-called 360-degree customer view—across systems, locations and more, in order to create the best possible customer experience and optimize processes.
  • Customer Relationship Management (CRM)

    A system that can help businesses manage business relationships and the data and information associated with them. For smaller businesses a CRM system can be enough to manage the complexity of customer data, but in most cases organizations have several CRM systems used to various degrees and with various purposes. For instance, the sales and marketing organization will often use one system, the financial department another, and perhaps procurement a third. MDM can provide the critical link between these systems.
  • Data Governance

    Data Governance is a collection of practices and processes aiming to create and maintain a healthy organizational data framework, by establishing processes that ensure that data is formally managed throughout the enterprise. It can include creating policies and processes around version control, approvals, etc., to maintain the accuracy and accountability of the organizational information. Data governance is as such not a technical discipline but an indispensable discipline of a modern organization, and a fundamental prerequisite for successful Master Data Management.
  • Data Lake

    A data lake is a place to store your data, usually in its raw form without changing it. The idea of the data lake is to provide a place for the unaltered data in its native format until it’s needed. Why? Certain business disciplines such as advanced analytics depend on detailed source data. A data lake is in many ways the opposite of a data warehouse, but in practice it often complements one.
  • Data Quality

    As in data quality, also sometimes just shortened into DQ. An undeniable part of any MDM vendor’s vocabulary, as a high level of data quality is what a Master Data Management solution is constantly seeking to achieve and maintain. Data quality can be defined as a given data set’s ability to serve its intended purpose. In other words, if you have data quality, your data is capable of delivering the insight you require. Data quality is characterized by, for example, data accuracy, validity, reliability, completeness and consistency.
  • Data Silos

    When navigating the MDM landscape you will often come across the term data silos, describing crucial data or information, such as master data, that is held separately, whether by individuals, departments, regions or systems. MDM's finest purpose is to "break down data silos."
  • Data Stewardship

    Data stewardship is the management and oversight of an organization's data assets to help provide business users with high-quality data that is easily accessible in a consistent manner. Data stewards will often be the ones in an organization responsible for the day-to-day data governance.
  • Data Syndication

    Data syndication is the automated exchange of product data with external parties, most often the distribution of product data to retailers, marketplaces and data pools; the inbound flow, receiving data from sources such as suppliers, is usually called data onboarding. An MDM solution will typically automate this exchange while making sure that high-quality criteria are met.
  • Data Universal Numbering System (D-U-N-S)

    A D-U-N-S number is a unique nine-digit identifier for a single business entity, provided by Dun & Bradstreet. The system is widely used as a standard business identifier. A decent MDM solution will be able to support the use of D-U-N-S by providing an integration between the two systems.
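    A D-U-N-S number can at least be checked for the nine-digit format; this sketch validates the format only and says nothing about whether the number is actually assigned:

```python
import re

# Format check only (illustrative): a D-U-N-S number is nine digits,
# sometimes written with hyphens (e.g., 15-048-3782).

def is_valid_duns_format(s):
    digits = re.sub(r"[\s-]", "", s)
    return bool(re.fullmatch(r"\d{9}", digits))
```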
  • Deduplication

    The process of eliminating redundant data in a data set, by identifying and removing extra copies of the same data, leaving only one high-quality data set to be stored. Data duplicates are a common business problem, causing wasted resources and leading to bad customer experiences. When implementing a Master Data Management solution, thorough deduplication is a crucial part of the process.
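    A minimal sketch of deduplication by a normalized match key (here, a lower-cased email; real MDM solutions use far richer matching rules):

```python
# Deduplication sketch: the first record seen for each normalized key
# survives; later copies are dropped. Key choice is illustrative.

def dedupe(records, key=lambda r: r["email"].strip().lower()):
    seen, unique = set(), []
    for r in records:
        k = key(r)
        if k not in seen:
            seen.add(k)
            unique.append(r)
    return unique

rows = [
    {"id": 1, "email": "jo@example.com"},
    {"id": 2, "email": "JO@example.com "},   # duplicate after normalization
    {"id": 3, "email": "ann@example.com"},
]
deduped = dedupe(rows)
```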
  • Digital Asset Management (DAM)

    The business management of digital assets, most often images, videos, digital files and their metadata. Many businesses have a standalone or home-grown DAM solution, inhibiting the efficiency of the data flow and thereby delaying processes, such as on-boarding new products into an e-commerce site. MDM lets you handle your digital assets more efficiently and connects them to other data. DAM can be a prebuilt function in some MDM solutions.
  • Digital Transformation

    (or Digital Disruption). Refers to the changes associated with the use of digital technology in all aspects of human society. For businesses, a central aspect of Digital Transformation is the "always-online" consumer, forcing organizations to change their business strategy and thinking in order to deliver excellent customer experiences. Digital Transformation also has a major impact on efficiency and workflows (e.g., the so-called Fourth Industrial Revolution driven by automation and data, also known as Industry 4.0).
  • Domain

    In the MDM world a domain is understood as one of several areas in which your business can benefit from data management, for example within the product data domain, customer data domain, supplier data domain, etc.
  • Enrichment

    Data enrichment refers to processes used to enhance, refine or otherwise improve raw data. In the world of MDM, enriching your master data can happen by including third-party data to get a more complete view, such as adding social data to your customer master data. MDM eliminates manual product enrichment processes and replaces them with custom workflows, business rules and automation.
  • Enterprise Asset Management (EAM)

    The management of the assets of an organization (e.g., equipment and facilities).
  • Enterprise Resource Planning (ERP)

    Refers to enterprise systems and software used to manage day-to-day business activities, such as accounting, procurement, project management, inventory, sales, etc. Many businesses have several ERP systems, each managing data about products, locations or assets, for example. A comprehensive MDM solution complements an ERP by ensuring that the data from each of the data domains used by the ERP is accurate, up-to-date and synchronized across the multiple ERP instances.
  • Entity

    A classification of objects of interest to the enterprise (e.g., people, places, things, concepts and events).
  • Extract, Transform and Load (ETL)

    A process in data warehousing, responsible for pulling data out of source systems and placing it into a data warehouse.
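    The three stages can be sketched in a few lines (the source rows and warehouse schema are invented for illustration):

```python
# Toy ETL pass: extract rows from a "source system", transform them
# into the warehouse schema, and load them into a target store.
# All data and names here are illustrative.

def extract():
    return [("2024-01-05", "CH-100", "3"), ("2024-01-06", "CH-200", "1")]

def transform(rows):
    return [{"date": d, "sku": sku, "qty": int(q)} for d, sku, q in rows]

def load(rows, warehouse):
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
```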
  • Global Standards One (GS1)

    The GS1 standards are unique identification codes used by more than one million companies worldwide. The standards aim to create a common foundation for businesses when identifying and sharing vital information about products, locations, assets and more. The most recognizable GS1 standards are the barcode and the radio-frequency identification (RFID) tags. An MDM solution will support and integrate the GS1 standards across industries.
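    One concrete GS1 mechanism is the check digit that ends a GTIN (the number under an EAN-13 barcode): the other digits are weighted 3, 1, 3, 1, … from the right, and the check digit rounds the weighted sum up to the next multiple of 10. A sketch:

```python
# GS1 check digit (e.g., the last digit of an EAN-13 barcode).
# `body` is the identifier without its final check digit.

def gs1_check_digit(body):
    total = sum(int(d) * (3 if i % 2 == 0 else 1)
                for i, d in enumerate(reversed(body)))
    return (10 - total % 10) % 10
```

For example, the EAN-13 code 4006381333931 has body 400638133393 and check digit 1.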
  • Golden Record

    In the MDM world, also sometimes referred to as "the single version of the truth." This is the state you want your master data to be in and what every MDM solution is working toward creating.
  • Hierarchy Management

    An essential aspect of MDM that allows users to productively manage complex hierarchies spread over one or more domains and change them into a formal structure that can be used throughout the enterprise. Products, customers and organizational structures are all examples of domains where a hierarchy structure can be beneficial (e.g., in defining the hierarchical structure of a household in relation to a customer data record).
  • Hub

    A data hub or an enterprise data hub (EDH) is a database that is populated with data from one or more sources and from which data is taken to one or more destinations. An MDM system is an example of a data hub, and therefore sometimes goes under the name Master Data Management hub.
  • Hypervisor with virtual machine

    A hypervisor, also called a virtual machine manager, is a program that allows multiple operating systems to share a single hardware host. The hypervisor controls the host processor and resources, allocates resources to each operating system, and ensures that the guest operating systems (the virtual machines) do not disrupt each other.
  • Identity resolution

    A data management process where an individual is identified from disparate data sets and databases to resolve their identity. This process relates to Customer Master Data Management.
  • Information

    Information is the output of data that has been analyzed and/or processed in any manner.
  • Integration

    One of the biggest advantages of an MDM solution is its ability to integrate with various systems and link all of the data held in each of them to each other. A system integrator will often be brought on board to provide the implementation services.
  • Internet of Things (IoT)

    Internet of Things is the network of physical devices embedded with connectivity technology that enables these "things" to connect and exchange data. IoT technology represents a huge opportunity—and challenge—for organizations across industries as they can access new levels of data. A Master Data Management solution supports IoT initiatives by, for example, linking trusted master data to IoT-generated data as well as supporting a data governance framework for IoT data.
  • Location data

    Data about locations. Solutions that add location data management to the mix, such as Location Master Data Management, are on the rise. Effectively linking location data to other master data such as product data, supplier data, asset data or customer data can give you a more complete picture and enhance processes and customer experiences.
  • Maintenance

    In order for any data management investment to continue delivering value, you need to maintain every aspect of a data record, including hierarchy, structure, validations, approvals and versioning, as well as master data attributes, descriptions, documentation and other related data components. Master data maintenance is often enabled by automated workflows, such as pushing out notifications to data stewards when there’s a need for a manual action. Maintenance is an unavoidable and ongoing process of any MDM initiative.
  • Matching

    (and linking and merging). Key functionalities in a Customer Master Data Management solution with the purpose of identifying and handling duplicates to achieve a Golden Record. The matching algorithm continuously analyzes the source records to determine which represent the same individual or organization. The linking functionality persists all the source records and links them to the Golden Record, while the merging functionality selects a survivor and non-survivor. The Golden Record is based only on the surviving records.
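    A stripped-down sketch of match and merge, assuming email equality as the match rule and "most recently updated value wins" as the survivorship rule (both are illustrative choices, not any product's actual logic):

```python
# Match/merge sketch: records judged to represent the same person are
# merged into a Golden Record by a simple survivorship rule.

def matches(a, b):
    return a["email"].lower() == b["email"].lower()

def merge(records):
    """Most recently updated non-empty value survives, field by field."""
    golden = {}
    for r in sorted(records, key=lambda r: r["updated"]):
        for k, v in r.items():
            if v:
                golden[k] = v
    return golden

sources = [
    {"email": "jo@example.com", "phone": "", "updated": "2023-01-01"},
    {"email": "JO@example.com", "phone": "555-0100", "updated": "2024-06-01"},
]
assert matches(*sources)          # the two records represent one person
golden = merge(sources)
```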
  • MDM Architecture

    An MDM solution is not just something you buy, then start to use. It needs to be fitted into your specific enterprise setup and integrated with the overall enterprise architecture and infrastructure, which is why MDM architecture is required as one of the first steps in an MDM process.
  • MDM Strategy

    As with all major business initiatives, MDM needs a thorough, coherent, well-communicated business strategy in order to be as successful as possible.
  • Metadata management

    The management of data about data. Metadata management helps an organization understand the what, where, why, when and how of its data.
  • Modelling

    Modelling in Master Data Management is a process at the beginning of an MDM implementation where you accurately map and define the relationship between the core enterprise entities (e.g., your products and their attributes). Based on that you create the optimal master data model that best fits your organizational setup.
  • Multi Tenancy

    A multitenant Master Data Management solution serves several separate organizations, or tenants, from one shared software instance while keeping each tenant's data and configuration isolated from the others. Multitenancy is common in cloud and SaaS MDM offerings.
  • Multidomain

    A multidomain Master Data Management solution masters the data of several enterprise domains, such as product and supplier domain, or customer and product domain or any combination handling more than one domain.
  • Multi-site deployment

    A deployment where users work with the DITA CMS from multiple locations.
  • New Product Development (NPD)

    A discipline in Product Lifecycle Management (PLM) that aims to support the management of introducing a new product line or assortment, from idea to launch, including its ideation, research, creation, testing, updating and marketing.
  • Note

    Before the system on cold standby is brought online, the latest available copy of the data (including the Content Store) needs to be installed or mounted on the cold standby system.
  • Omnichannel

    A term mostly used in retail to describe the creation of integrated, seamless customer experiences across all customer touchpoints. If you offer an omnichannel customer experience, your customers will meet the same service, offers, product information and more, no matter where they interact with your brand (e.g., in-store, on social media, via email, customer service, etc.). The term stems from the Latin omni, meaning all or everywhere, and it has surpassed similar terms such as multi-channel and cross-channel.
  • Party data

    In relation to Master Data Management, party data is understood in two different ways. First of all, party data can mean data defined by its source. You will typically hear about first-, second- and third-party data: first-party data is your own data, second-party data is someone else’s first-party data handed over to you, while third-party data is collected by someone with no relation to you, and probably sold to you. However, when talking about party data management, party data refers to master data that describes the parties your organization interacts with, such as customers, suppliers, employees and business partners.
  • Personally Identifiable Information (PII)

    In Europe often just referred to as personal information. PII is sensitive information that identifies a person, directly (on its own) or indirectly (in combination). Examples of direct PII include name, address, phone number, email address and passport number, while examples of indirect PII include a combination (e.g., workplace and job title or maiden name in combination with date and place of birth).
  • Platform

    A comprehensive technology used as a base upon which other applications, processes or technologies are developed. An example of a software platform is an MDM platform.
  • Primary site

    Primary or main site (in Centralpoint). In a multi-site deployment, this is the site where the primary Content Store is located.
  • Product Information Management (PIM)

    Today sometimes also referred to as Product MDM, Product Data Management (PDM) or Master Data Management for products. No matter the naming, PIM refers to a set of processes used to centrally manage and evaluate, identify, store, share and distribute product data or information about products. PIM is enabled with the implementation of PIM or Product Master Data Management software.
  • Product Lifecycle Management (PLM)

    The process of managing the entire lifecycle of a product from ideation, through design, product development, sourcing and selling. The backbone of PLM is a business system that can efficiently handle the product information full-circle, and significantly reduce time to market through streamlined processes and collaboration. That can be a standalone PLM tool or part of a comprehensive MDM platform.
  • Profiling

    Data profiling is a technique used to examine data from an existing information source, such as a database, to determine its accuracy and completeness and share those findings through statistics or informative summaries. Conducting a thorough data profiling assessment in the beginning of a Master Data Management implementation is recognized as a vital first step toward gaining control over organizational data, as it helps identify and address potential data issues, enabling architects to design a better solution.
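    A toy profiling step, reporting per-column null and distinct counts for a small invented data set (real profiling tools report far more, such as patterns and value distributions):

```python
# Minimal data profiling sketch: per-column null counts and distinct
# counts, the kind of summary statistics a profiling step reports.

def profile(rows):
    stats = {}
    for col in rows[0]:
        values = [r[col] for r in rows]
        stats[col] = {
            "nulls": sum(v in (None, "") for v in values),
            "distinct": len({v for v in values if v not in (None, "")}),
        }
    return stats

rows = [
    {"name": "Ann", "country": "DK"},
    {"name": "Bo", "country": ""},    # missing country counts as a null
    {"name": "Ann", "country": "DK"},
]
stats = profile(rows)
```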
  • Reference data

    Data that define values relevant to cross-functional organizational transactions. Reference data management aims to effectively define data fields, such as units of measurements, fixed conversion rates and calendar structures, to "translate" these values into a common language in order to categorize data in a consistent way and secure data quality. Reference Data Management (RDM) systems can be the solution for some organizations, while others manage reference data as part of a comprehensive Master Data Management solution.
  • Replication

    Process of creating one or more copies of the Content Store. These copies are then synchronized with the primary Content Store to provide read caching for geographically distributed teams, allowing them to perform read operations more effectively. If replication is implemented, all the read operations for users at secondary sites are performed locally on the replicated Content Store, while write operations are performed on the primary Content Store, over the network.
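    The routing rule described above can be sketched as follows (the `Site` class and site names are illustrative, not part of the actual product):

```python
# Read/write routing sketch: reads are served locally when the site
# has a replica; writes always go to the primary site.

class Site:
    def __init__(self, name, has_replica):
        self.name = name
        self.has_replica = has_replica

def route(operation, site, primary="primary"):
    if operation == "read" and site.has_replica:
        return site.name          # served locally from the replica
    return primary                # writes (and replica-less reads) hit the primary

berlin = Site("berlin", has_replica=True)
```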
  • Satellite user

    A user who works from home or a very small office (with very few DITA CMS users).
  • Secondary site

    In a multi-site deployment, users at a secondary site are connected to the rest of the company through a wide area network (WAN) and they access the DITA CMS using a remote application solution. If replication is used, a replicated Content Store may be located at this site.
  • Single-site deployment

    A deployment where all the users are in the same geographical location (office).
  • Software as a Service (SaaS)

    A software licensing and delivery model in which software is licensed on a subscription basis and is centrally hosted. SaaS is on the rise due to changes in buying behavior and a higher demand for flat-rate pricing, as these solutions are typically paid on a monthly or quarterly basis. Cloud MDM, for instance, is typically delivered as SaaS.
  • Stack

    The collection of software or technology that forms an organization’s operational infrastructure. The term stack is used in reference to software (software stack), technology (technology stack) or simply solution (solution stack) and refers to the underlying systems that make your business run smoothly. For instance, an MDM solution can—in combination with other solutions—be a crucial part of your software stack.
  • Stock Keeping Unit (SKU)

    A SKU is a code that uniquely identifies an individual item, product or service. SKU codes are used in business to track inventory and are often rendered as machine-readable barcodes, providing an additional layer of uniqueness and identification.
  • Supplier data

    Data about suppliers. One of the domains on which MDM can be beneficial. May be included in an MDM setup in combination with other domains, such as product data.
  • Supply Chain Management (SCM)

    The management of material and information flow in an organization—everything from product development, sourcing, production and logistics, as well as the information systems—to provide the highest degree of customer satisfaction, on time and at the lowest possible cost. A PLM solution or PLM MDM solution can be a critical factor for driving effective supply chain management.
  • Swamp

    A data swamp is a deteriorated data lake that is inaccessible to its intended users and provides little value.
  • Synchronization

    The operation or activity of two or more things at the same time or rate. Applied to data management, data synchronization is the process of establishing data consistency from one endpoint to another and continuously harmonizing the data over time. MDM can be the key enabler for global or local data synchronization.
  • Action-based Planning

    The goal of action-based planning is to determine how to decompose a high-level action into a network of subactions that perform the requisite task. Therefore, the major task within such a planning system is to manage the constraints that apply to the interrelationships (e.g., ordering constraints) between actions. In fact, action-based planning is best viewed as a constraint satisfaction problem.
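    As a tiny instance of managing ordering constraints, the sketch below orders subactions with a topological sort; the actions and constraints are invented, and real planners handle far richer constraint types:

```python
# Ordering constraints between subactions, solved by repeatedly picking
# an action whose predecessors are all already scheduled.

def order_actions(actions, before):
    """`before` holds (a, b) pairs meaning a must precede b."""
    remaining, ordered = set(actions), []
    while remaining:
        ready = {a for a in remaining
                 if not any(x in remaining for x, y in before if y == a)}
        if not ready:
            raise ValueError("cyclic constraints")
        a = min(ready)            # deterministic tie-break
        ordered.append(a)
        remaining.remove(a)
    return ordered

plan = order_actions(
    ["pour", "boil", "grind"],
    [("boil", "pour"), ("grind", "pour")],
)
```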
  • Adaptive Interface

    A computer interface that automatically and dynamically adapts to the needs and competence of each individual user of the software.
  • Agent Architecture

    There are two levels of agent architecture when a number of agents are to work together for a common goal: the architecture of the system of agents, which determines how they work together and does not need to be concerned with how individual agents fulfil their sub-missions; and the architecture of each individual agent, which does determine its inner workings.
  • Agents

    Agents are software programs that are capable of autonomous, flexible, purposeful and reasoning action in pursuit of one or more goals. They are designed to take timely action in response to external stimuli from their environment on behalf of a human. When multiple agents are being used together in a system, individual agents are expected to interact together as appropriate to achieve the goals of the overall system. Also called autonomous agents, assistants, brokers, bots, droids, intelligent agents or softbots.
  • AI Effect

    The great practical benefits of AI applications, and even the existence of AI in many software products, go largely unnoticed by many despite the already widespread use of AI techniques in software. This is the AI effect. Many marketing people don’t use the term “artificial intelligence” even when their company’s products rely on some AI techniques. Why not? It may be because AI was oversold in the first giddy days of practical rule-based expert systems in the 1980s.
  • AI Languages and Tools

    AI software has different requirements from other, conventional software, so specific languages for AI software have been developed, including LISP, Prolog, and Smalltalk. While these languages often reduce the time needed to develop an artificial intelligence application, they can lengthen the time to execute it. Therefore, much AI software is now written in languages such as C++ and Java, which typically increases development time but shortens execution time.
  • Algorithm

    a formula or step-by-step procedure given to a computer in order for it to complete a task (i.e. a set of rules for a computer to follow)
  • Analysis

    Analysis involves the examination of complex information in order to ascertain what has happened (or is about to happen), what it means, and what should be done about it.
  • Applications of Artificial Intelligence

    The actual and potential applications are virtually endless. Reviewing Stottler Henke’s work will give you some idea of the range. In general, AI applications are used to increase the productivity of knowledge workers by intelligently automating their tasks; or to make technical products of all kinds easier to use for both workers and consumers by intelligent automation of different aspects of the functionality of complex products.
  • Artificial general intelligence (AGI)

    also known as strong AI, AGI is a type of artificial intelligence that is considered human-like, and still in its preliminary stages (more of a hypothetical existence in present day)
  • Artificial intelligence

    a subset of computer science that deals with computer systems performing tasks with similar, equal, or superior intelligence to that of a human (e.g. decision-making, object classification and detection, speech recognition and translation)
  • Artificial intelligence (AI)

    is the mimicking of human thought and cognitive processes to solve complex problems automatically. AI uses techniques for writing computer code to represent and manipulate knowledge. Different techniques mimic the different ways that people think and reason (see Case-based Reasoning and Model-based Reasoning for example). AI applications can be either stand-alone software, such as decision support software, or embedded within larger software or hardware systems. AI has been around for about 50 years.
  • Artificial narrow intelligence (ANI)

    also known as weak AI, ANI is a type of artificial intelligence that can only focus on one task or problem at a given time (e.g. playing a game against a human competitor). This is the current existing form of AI.
  • Artificial neural network (ANN)

    A learning model created to act like a human brain that solves tasks that are too difficult for traditional computer systems to solve.
  • Associative Memories

    Associative memories work by recalling information in response to an information cue. Associative memories can be autoassociative or heteroassociative. Autoassociative memories recall the same information that is used as a cue, which can be useful for completing a partial pattern. Heteroassociative memories recall information that differs from the cue, associating one pattern with another. Human long-term memory is thought to be associative because of the way in which one thought retrieved from it leads to another.
  • Automated Diagnosis Systems

    Most diagnosis work is done by expert humans such as mechanics, engineers, doctors, firemen, customer service agents, and analysts of various kinds. All of us usually do at least a little diagnosis even if it isn’t a major part of our working lives. We use a range of techniques for our diagnoses. Primarily, we compare a current situation with past ones, and reapply, perhaps with small modifications, the best past solutions. If this doesn’t work, we may run small mental simulations of possible solutions through our heads.
  • Automatic indexing

    Automatic indexing uses a program to select words or phrases to identify content. It often employs several indexing languages (such as a classification scheme, natural language, a controlled vocabulary, a standard industry code, or a country code).
  • Automatic target recognition (ATR)

    The ability for an algorithm or device to recognize targets or other objects based on data obtained from sensors.
  • Autonomous Agents

    A piece of AI software that automatically performs a task on a human’s behalf, or even on the behalf of another piece of AI software, so that together they accomplish a useful task for a person somewhere. Autonomous agents are capable of independent action in dynamic, unpredictable environments. “Autonomous agent” is a trendy term that is sometimes reserved for AI software used in conjunction with the Internet (for example, AI software that acts as your assistant in intelligently managing your e-mail).
  • Backpropagation

    shorthand for “backward propagation of errors,” backpropagation is a method of training neural networks in which the system’s initial output is compared to the desired output, and the weights are adjusted until the difference between the outputs becomes minimal
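    As a sketch of this compare-and-adjust cycle, the following toy network (two hidden units, trained on the logical OR function; the layer sizes, learning rate, and training data are illustrative choices, not a reference implementation) propagates the output error backward to update its weights:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Training data: the logical OR function.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

# One hidden layer of 2 units, one output unit; each weight list ends with a bias.
w_hidden = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
w_out = [random.uniform(-1, 1) for _ in range(3)]
lr = 0.5

def forward(x):
    h = [sigmoid(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w_hidden]
    o = sigmoid(w_out[0] * h[0] + w_out[1] * h[1] + w_out[2])
    return h, o

for _ in range(5000):
    for x, target in data:
        h, o = forward(x)
        # Backward pass: compute the output error, then propagate it
        # back through the chain rule to the hidden-layer weights.
        delta_o = (o - target) * o * (1 - o)
        for j in range(2):
            delta_h = delta_o * w_out[j] * h[j] * (1 - h[j])
            w_hidden[j][0] -= lr * delta_h * x[0]
            w_hidden[j][1] -= lr * delta_h * x[1]
            w_hidden[j][2] -= lr * delta_h
        w_out[0] -= lr * delta_o * h[0]
        w_out[1] -= lr * delta_o * h[1]
        w_out[2] -= lr * delta_o

predictions = [round(forward(x)[1]) for x, _ in data]
```

    After training, the rounded outputs match the OR targets, showing the error has been driven down to a minimum.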
  • Bayesian Networks

    A modeling technique that provides a mathematically sound formalism for representing and reasoning about uncertainty, imprecision, or unpredictability in our knowledge. For example, seeing that the front lawn is wet, one might wish to determine whether it rained during the previous night. Inference algorithms can use the structure of the Bayesian network to calculate conditional probabilities based on whatever data has been observed (e.g., the street does not appear wet, so it is 90% likely that the wetness of the lawn has some cause other than rain).
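    The wet-lawn inference can be sketched by enumerating the hidden variables of a tiny network (rain and sprinkler both influence the lawn; only rain influences the street). Every probability below is a made-up illustrative number, not taken from any real model:

```python
from itertools import product

# Hypothetical priors and conditionals for the wet-lawn example.
P_rain = 0.2
P_sprinkler = 0.3

def p_lawn_wet(rain, sprinkler):
    if rain and sprinkler: return 0.99
    if rain: return 0.9
    if sprinkler: return 0.85
    return 0.01

def p_street_wet(rain):
    return 0.95 if rain else 0.05

# Enumerate joint probabilities consistent with the evidence:
# the lawn is wet, the street is NOT wet.
num = 0.0
den = 0.0
for rain, sprinkler in product([True, False], repeat=2):
    p = ((P_rain if rain else 1 - P_rain)
         * (P_sprinkler if sprinkler else 1 - P_sprinkler)
         * p_lawn_wet(rain, sprinkler)
         * (1 - p_street_wet(rain)))
    den += p
    if rain:
        num += p

p_rain_given_evidence = num / den
```

    With these numbers the dry street lowers the belief in rain well below its prior, pointing to the sprinkler instead.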
  • Big Data

    Big data is extremely large data sets that may be analyzed computationally to reveal patterns, trends, and associations, especially relating to human behavior and interactions.
  • Boolean logic

    Boolean logic refers to an algebraic system in which all values are reduced to TRUE or FALSE (that is, 1 or 0 in the binary system), and thus forms the basis for all electronic computing. In the context of information retrieval, Boolean operators may be used for manipulating search terms or to represent relationships between entities. The operators most frequently used are AND, OR, and NOT.
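    A minimal sketch of Boolean retrieval over a toy document collection (the documents and the `search` helper are invented for illustration; AND maps to `must`, OR to `may`, NOT to `must_not`):

```python
# Toy inverted collection: document id -> set of index terms.
docs = {
    1: {"master", "data", "management"},
    2: {"data", "warehouse"},
    3: {"knowledge", "management"},
}

def search(must=(), may=(), must_not=()):
    """Boolean retrieval: AND over `must`, OR over `may`, NOT over `must_not`."""
    hits = set()
    for doc_id, terms in docs.items():
        if (all(t in terms for t in must)
                and (not may or any(t in terms for t in may))
                and not any(t in terms for t in must_not)):
            hits.add(doc_id)
    return hits
```

    For example, `search(must=["management"], must_not=["knowledge"])` keeps only documents containing "management" AND NOT "knowledge".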
  • Browser

    A browser is a client software program that is used to identify and locate various kinds of Internet resources.
  • Business environment

    A business environment encompasses all those factors that affect a company’s operations, including customers, competitors, suppliers, distributors, industry trends, substitutes, regulations, and government developments. It may also be referred to as an Operating environment.
  • Business Intelligence (BI)

    Business Intelligence is concerned with information technology solutions for transforming the output from large data collections into intelligence, usually through the integration of sales, marketing, servicing, and support operations. It covers such activities as customer relationship management, enterprise resource planning, and ecommerce using data mining techniques. Those people involved in business intelligence tend to regard it as one aspect of Knowledge Management.
  • Case-based Reasoning

    Case-based reasoning (CBR) solves a current problem by retrieving the solutions to previous similar problems and altering those solutions to meet the current needs. It is based upon previous experiences and patterns of previous experiences. Humans with years of experience in a particular job or activity use this technique to solve many of their problems; a skilled paramedic arriving on an accident scene, for example, can often automatically know the best procedure to deal with a patient.
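    The retrieve-and-reuse step can be sketched as a nearest-neighbor lookup over a toy case base (the cases, features, and similarity measure below are invented for illustration):

```python
# Hypothetical case base: past problems described by feature vectors,
# each paired with the solution that worked.
case_base = [
    ({"fever": 1, "cough": 1, "rash": 0}, "flu protocol"),
    ({"fever": 0, "cough": 0, "rash": 1}, "allergy protocol"),
    ({"fever": 1, "cough": 0, "rash": 1}, "measles protocol"),
]

def similarity(a, b):
    # Count matching features (a crude similarity measure).
    return sum(1 for k in a if a[k] == b.get(k))

def solve(new_problem):
    # Retrieve the most similar past case and reuse its solution.
    best_case, best_solution = max(case_base,
                                   key=lambda c: similarity(new_problem, c[0]))
    return best_solution
```

    A full CBR system would also adapt the retrieved solution and store the new case; this sketch shows only retrieval and reuse.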
  • Classification

    algorithm technique that allows machines to assign categories to data points
  • Clustering

    algorithm technique that allows machines to group similar data into larger data categories
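    A minimal sketch of one such technique, k-means, grouping one-dimensional points into two clusters (the deterministic centroid initialization and the tiny data set are simplifications for illustration):

```python
def kmeans(points, k, iterations=20):
    # Initialize centroids to the first k points (a simple, deterministic choice).
    centroids = points[:k]
    for _ in range(iterations):
        # Assign each point to its nearest centroid.
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            groups[nearest].append(p)
        # Move each centroid to the mean of its group.
        centroids = [sum(g) / len(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return groups

groups = kmeans([1.0, 1.2, 0.8, 10.0, 10.5, 9.5], k=2)
```

    The two obvious bunches of points end up in separate groups after a few assign-and-recenter passes.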
  • Cognitive computing

    computerized model that mimics human thought processes by data mining, NLP, and pattern recognition
  • Cognitive Science

    Artificial intelligence can be defined as the mimicking of human thought to perform useful tasks, such as solving complex problems. This creation of new paradigms, algorithms, and techniques requires continued engagement with the human mind, the inspiration of AI. To that end, AI software designers team with cognitive psychologists and use cognitive science concepts, especially in knowledge elicitation and system design.
  • Cognitive Task Analysis

    Cognitive task analysis (CTA) is a systematic process by which the cognitive elements of task performance are identified. This includes both domain knowledge and cognitive processing. Thus, CTA focuses on mental activities that cannot be observed and is in contrast to behavioral task analysis that breaks the task down into observable, procedural steps. CTA is most useful for highly complex tasks with few observable behaviors. Examples of cognitive processing elements include
  • Collaborative Filtering

    A technique for leveraging historical data about preferences of a body of users to help make recommendations or filter information for a particular user. Intuitively, the goal of these techniques is to develop an understanding of what may be interesting to a user by uncovering what is interesting to people who are similar to that user. See GIIF and IQE for examples of applications that use collaborative filtering techniques.
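    A minimal sketch: find the user most similar to the target user, then recommend an item that user rated highly which the target has not seen (the ratings and similarity measure are invented for illustration):

```python
# Hypothetical ratings: user -> {item: rating on a 1-5 scale}.
ratings = {
    "alice": {"A": 5, "B": 4, "C": 1},
    "bob":   {"A": 5, "B": 5, "D": 4},
    "carol": {"A": 1, "C": 5, "D": 2},
}

def similarity(u, v):
    # Agreement over co-rated items: small absolute differences = similar.
    common = set(ratings[u]) & set(ratings[v])
    if not common:
        return 0.0
    return 1.0 / (1.0 + sum(abs(ratings[u][i] - ratings[v][i]) for i in common))

def recommend(user):
    # Find the most similar other user, then suggest their best-rated
    # item that `user` has not rated yet.
    others = [u for u in ratings if u != user]
    nearest = max(others, key=lambda u: similarity(user, u))
    unseen = set(ratings[nearest]) - set(ratings[user])
    return max(unseen, key=lambda i: ratings[nearest][i])
```

    Here alice agrees with bob about items A and B, so bob's liked-but-unseen item is suggested to her.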
  • Commonsense Reasoning

    Ordinary people manage to accomplish an extraordinary number of complex tasks just using simple, informal thought processes based on a large amount of common knowledge. They can quickly plan and undertake a shopping expedition to six or seven different shops, as well as pick up the kids from soccer and drop a book back at the library, quite efficiently without logically considering the hundreds of thousands of alternative ways to plan such an outing.
  • Communication

    Communication is the process whereby knowledge is codified into information by the transmitter, passed through a medium to a receiver, who then reconverts that information into new knowledge.
  • Community of practice (CoP)

    A Community of Practice is an informal, self-organizing, interactive group that develops in response to a specific, work-related activity, subject, practice, or problem of mutual interest. Membership is determined by participation and may transcend hierarchical and organizational boundaries. It provides a means for developing best practices or solutions to problems through communication, that is, through participation in the exchange of information and the creation of knowledge.
  • Competitive Intelligence (CI)

    Competitive Intelligence is a systematic and ethical programme for gathering, analyzing, and managing any combination of data, information, and knowledge concerning the business environment in which a company operates that, when acted upon, will confer a significant competitive advantage or enable sound decisions to be made. Its primary role is strategic early warning.
  • Competitive monitoring

    Competitive monitoring is intended to gain early warning through regular, frequent, and proactive monitoring and reporting of changes and trends in your business environment. These changes may stimulate more intensive research or call for the use of more sophisticated analytical techniques. When confined to competitors it is known as Competitor activity tracking.
  • Competitor

    A competitor is any organization that offers the same, a similar, or a substitute product or service in the field of endeavor in which a company operates.
  • Computer Vision

    Making sense of what we see is usually easy for humans, but very hard for computers. Practical vision systems to date are limited to working in tightly controlled environments. Synonym: machine vision.
  • Constraint Satisfaction

    Constraints are events, conditions, or rules that limit our alternatives for completing a task. For example, the foundation of a building has to be laid before the framing is done; a car has to be refueled once every four hundred miles; a neurosurgeon is needed to perform brain surgery; a Walkman can only operate on a 9-volt battery. Satisfying constraints is particularly important in scheduling complex activities. By first considering applicable constraints, the number of possible schedules to be considered can be greatly reduced.
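    The pruning effect of constraints can be sketched with a small backtracking search; the three-job scheduling problem below (lay the foundation before framing, frame before roofing) is invented for illustration:

```python
def solve_csp(variables, domains, constraints, assignment=None):
    """Backtracking search: extend a partial assignment one variable at a
    time, pruning any value that violates a constraint."""
    if assignment is None:
        assignment = {}
    if len(assignment) == len(variables):
        return dict(assignment)
    var = next(v for v in variables if v not in assignment)
    for value in domains[var]:
        assignment[var] = value
        if all(check(assignment) for check in constraints):
            result = solve_csp(variables, domains, constraints, assignment)
            if result is not None:
                return result
        del assignment[var]
    return None

# Toy scheduling problem: three jobs, time slots 1-3, ordering constraints.
# Each constraint only fires once both of its variables are assigned.
variables = ["foundation", "framing", "roofing"]
domains = {v: [1, 2, 3] for v in variables}
constraints = [
    lambda a: not {"foundation", "framing"}.issubset(a)
              or a["foundation"] < a["framing"],
    lambda a: not {"framing", "roofing"}.issubset(a)
              or a["framing"] < a["roofing"],
]
schedule = solve_csp(variables, domains, constraints)
```

    Of the 27 possible slot assignments, the constraints prune all but the valid orderings before most branches are ever explored.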
  • Convolutional neural network (CNN)

    a type of neural network specifically created for analyzing, classifying, and clustering visual imagery by using multilayer perceptrons
  • Corporate security

    Corporate security aims at protecting knowledge assets, whether in the form of physical entities or intellectual (tangible and intangible) property.
  • Crawler

    A crawler uses existing Internet search engines to carry out automatic search and retrieval of selected information on behalf of a user. It may also be known as a web crawler.
  • Current awareness service

    A current awareness service makes available knowledge of what is being done in specific fields of endeavor through documents (such as notes, abstracts, clippings, email, selective dissemination of information, and database records) or orally (such as face-to-face or telephone conversations).
  • Dashboard

    A dashboard is a visualization tool that provides graphical depictions of current key performance indicators in order to enable faster response to changes in areas such as sales, customer relations, performance assessments and inventory levels.
  • Data

    Data consists of unconnected facts, numbers, names, codes, symbols, dates, measurements, observations, words and other items of that nature that are out of context, and that only acquire meaning through association.
  • Data Fusion

    Information processing that deals with the association, correlation, and combination of data and information from single and multiple sources to achieve a more complete and more accurate assessment of a situation. The process is characterized by continuous refinement of its estimates and assessments, and by evaluation of the need for additional sources, or modification of the process itself, to achieve improved results.
  • Data Mining

    The non-trivial process of uncovering interesting and useful relationships and patterns in very large databases to guide better business and technical decisions. Data mining is becoming increasingly important due to the fact that all types of commercial and government institutions are now logging huge volumes of data and now require the means to optimize the use of these vast resources. The size of the databases to which data mining techniques are applied is what distinguishes them from more traditional forms of data analysis.
  • Data science

    An interdisciplinary field that combines scientific methods, systems, and processes from statistics, information science, and computer science to provide insight into phenomena via either structured or unstructured data.
  • Database

    A database is a collection of interrelated data stored together without harmful or unnecessary redundancy and structured in such a manner as to serve one or more applications. The data are stored so that they are independent of the programs that use them.
  • Decision Aids

    Software that helps humans make decisions, particularly about complex matters when a high degree of expertise is needed to make a good decision.
  • Decision Support

    Decision support is a broad class of applications for artificial intelligence software. There are many situations when humans would prefer machines, particularly computers, to either automatically assist them in making decisions, or actually make and act on a decision. There is a wide range of non-AI decision support systems, such as most of the process control systems successfully running chemical plants, power plants, and the like under steady-state conditions. However, whenever situations become more complex or uncertain, AI techniques are needed to provide effective decision support.
  • Decision Theory

    Decision theory provides a basis for making choices in the face of uncertainty, based on the assignment of probabilities and payoffs to all possible outcomes of each decision. The space of possible actions and states of the world is represented by a decision tree.
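    A minimal sketch of choosing by expected value over a two-branch decision tree (the actions, probabilities, and payoffs are made-up illustrative numbers):

```python
# Hypothetical choice: launch a product now or wait for more testing.
# Each action has possible outcomes as (probability, payoff) pairs.
actions = {
    "launch now": [(0.6, 100_000), (0.4, -50_000)],
    "wait":       [(0.9, 60_000), (0.1, -10_000)],
}

def expected_value(outcomes):
    # Probability-weighted sum of payoffs.
    return sum(p * payoff for p, payoff in outcomes)

best = max(actions, key=lambda a: expected_value(actions[a]))
```

    With these numbers, waiting (expected value 53,000) beats launching now (expected value 40,000), even though launching has the larger best-case payoff.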
  • Decision Trees

    A decision tree is a graphical representation of a hierarchical set of rules that describe how one might evaluate or classify an object of interest based on the answers to a series of questions. For instance, a decision tree can codify the sequence of tests a doctor might take in diagnosing a patient. Such a decision tree will order the tests based on their importance to the diagnostic task. The result of each successive test dictates the path you take through the tree, and therefore which tests are performed next.
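    A hand-built toy tree sketches the idea (the questions and classifications are invented for illustration; real trees are usually learned from data):

```python
# Internal nodes are (question, yes_branch, no_branch); leaves are strings.
tree = ("fever?",
        ("cough?",
         "suspect flu",          # fever and cough
         "run blood test"),      # fever, no cough
        "send home")             # no fever

def classify(node, answers):
    # Walk from the root, following the branch picked by each answer.
    if isinstance(node, str):
        return node
    question, yes_branch, no_branch = node
    return classify(yes_branch if answers[question] else no_branch, answers)
```

    Each answer prunes the remaining questions: a patient without fever is classified immediately, without the cough test ever being asked.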
  • Deep learning

    a machine learning technique in which computers learn from examples using many-layered neural networks (i.e. machines mimic aspects of human learning, for instance by using classification techniques)
  • Digital ecosystem

    several software platforms or cloud services that work in tandem across a network
  • Document

    A document contains recorded human knowledge, in any format; or is information structured in such a way as to facilitate human comprehension.
  • Document Clustering

    With document clustering techniques, documents can be automatically grouped into meaningful classes so that users of a database of full-text documents can easily search through related documents. Finding individual documents from amongst large on-line, full-text collections has been a growing problem in recent years due to the falling price of computer storage capacity and the networking of document databases to large numbers of people. Traditional library indexing has not provided adequate information retrieval for such collections.
  • Domain

    An overworked word for AI people. “Domain” can mean a variety of things including a subject area, field of knowledge, an industry, a specific job, an area of activity, a sphere of influence, or a range of interest, e.g., chemistry, medical diagnosis, putting out fires, operating a nuclear power plant, planning a wedding, diagnosing faults in a car. Generally, a domain is a system in which a particular set of rules, facts, or assumptions operates. Humans can usually easily figure out what’s meant from the context.
  • Domain Expert

    The person who knows how to perform an activity within the domain, and whose knowledge is to be the subject of an expert system. This person’s or persons’ knowledge and method of work are observed, recorded, and entered into a knowledge base for use by an expert system. The domain expert’s knowledge may be supplemented by written knowledge contained in operating manuals, standards, specifications, computer programs, etc., that are used by the experts. Synonym: subject-matter expert.
  • Emergence

    Emergence is the phenomenon of complex patterns of behavior arising out of the myriad interactions of simple agents, which may each operate according to a few simple rules. To put it another way, an emergent system is much more than simply the sum of its parts. It can happen without any grand master outside the system telling the individual agents how to behave. For example, all the people in a modern city, acting in their individual capacities as growers, processors, distributors, sellers, buyers, and consumers, together keep the city supplied without any central coordinator.
  • Enterprise Content Management

    Enterprise content management refers to the use of appropriate technology, software, and methods to create, collect, manage, store, retrieve, and disseminate content of any kind, including documents and unstructured information, within an organization in order to better achieve the aims and goals of the enterprise. The practice is sometimes inappropriately referred to as Enterprise Search.
  • Expert System

    An expert system encapsulates the specialist knowledge gained from a human expert (such as a bond trader or a loan underwriter) and applies that knowledge automatically to make decisions. For example, the knowledge of doctors about how to diagnose a disease can be encapsulated in software. The process of acquiring the knowledge from the experts and their documentation and successfully incorporating it in the software is called knowledge engineering, and requires considerable skill to perform successfully.
  • Explicit Knowledge

    Explicit knowledge consists of anything that can be codified, or expressed in words, numbers, and other symbols.
  • Extensible mark-up language (XML)

    Extensible mark-up language allows content producers to add metadata to non-text items, such as image, audio, or video files, and facilitates retrieval of unstructured information (an important aspect of Knowledge Management).
  • File transfer protocol (FTP)

    File transfer protocol is a very common method of moving files between Internet sites.
  • Firewall

    Firewall applies to software designed to protect internal computer networks against unauthorized access or intentional hostile intrusion.
  • Fuzzy Logic

    Traditional Western logic systems assume that things are either in one category or another. Yet in everyday life, we know this is often not precisely so. People aren’t just short or tall; they can be fairly short or fairly tall, and besides, we differ in our opinions of what height actually corresponds to tall anyway. The ingredients of a cake aren’t just unmixed or mixed; they can be moderately well mixed. Fuzzy logic provides a way of capturing, in computer reasoning, our commonsense knowledge that most things are a matter of degree.
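    A minimal sketch of degrees of membership, using hypothetical height thresholds for "tall" and the common min/max forms of fuzzy AND/OR:

```python
def tall(height_cm):
    """Degree of membership in the fuzzy set 'tall' (0.0 to 1.0),
    rising linearly between 160 cm and 190 cm (hypothetical thresholds)."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30

# Fuzzy AND and OR are commonly taken as the min and max of the memberships.
def fuzzy_and(a, b):
    return min(a, b)

def fuzzy_or(a, b):
    return max(a, b)
```

    So someone 175 cm tall is "tall" to degree 0.5, rather than being forced into either category.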
  • Fuzzy Sets

    In mathematics, fuzzy sets are sets whose elements have degrees of membership. Fuzzy sets were introduced by Lotfi A. Zadeh and Dieter Klaua in 1965 as an extension of the classical notion of a set. At the same time, Salii (1965) defined a more general kind of structure called L-relations, which he studied in an abstract algebraic context. Fuzzy relations, which are now used in different areas such as linguistics (De Cock, et al, 2000), decision-making (Kuzmin, 1982) and clustering (Bezdek, 1978), are special cases of L-relations in which L is the unit interval [0, 1].
  • Game Theory

    Game theory is a branch of mathematics that seeks to model decision making in conflict situations.
  • Generative adversarial networks (GAN)

    a type of neural network that can generate seemingly authentic photographs, at least superficially convincing to human eyes. GAN-generated images take elements of photographic data and shape them into realistic-looking images of people, animals, and places.
  • Genetic algorithm

    an algorithm based on principles of genetics that is used to efficiently and quickly find solutions to difficult problems
  • Genetic Algorithms

    Search algorithms used in machine learning which involve iteratively generating new candidate solutions by combining two high scoring earlier (or parent) solutions in a search for a better solution. So named because of its reliance on ideas drawn from biological evolution.
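    A minimal sketch of the generate-combine-select cycle, maximizing the count of 1 bits in a short bitstring (the population size, mutation rate, and generation count are arbitrary illustrative choices):

```python
import random

random.seed(1)
BITS = 12

def fitness(bits):
    # Toy objective: count of 1 bits (maximized at all ones).
    return sum(bits)

def crossover(a, b):
    # Combine two high-scoring parents at a random cut point.
    cut = random.randrange(1, BITS)
    return a[:cut] + b[cut:]

def mutate(bits, rate=0.05):
    # Occasionally flip a bit, mimicking biological mutation.
    return [1 - b if random.random() < rate else b for b in bits]

population = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(20)]
for _ in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]          # selection: keep the fittest half
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(10)]
    population = parents + children

best = max(population, key=fitness)
```

    Because the fittest half survives unchanged each generation, the best score never decreases, and after a few dozen generations the population converges toward the all-ones string.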
  • Granularity

    Refers to the basic size of units that can be manipulated. Often refers to the level of detail or abstraction at which a particular problem is analyzed. One characteristic of human intelligence, Jerry R. Hobbs has pointed out, is the ability to conceptualize a world at different levels of granularity (complexity) and to move among them in considering problems and situations. The simpler the problem, the coarser the grain can be and still provide effective solutions to the problem.
  • Heterogeneous Databases

    Databases that contain different kinds of data, e.g., text and numerical data.
  • Heuristic

    a computer science technique for quick problem solving that trades guaranteed optimality for solutions that are good enough in practice
  • Human-Centered Computing

    Computers and other machines should be designed to effectively serve people’s needs and requirements. All too often they’re not. Commonly cited examples of this are the difficulty people have in setting up their VCR to record a TV show, and the difficulties people have in setting up a home computer facility or hooking up to the Internet. Artificial intelligence software can be used to deliver more human-centered computing, improving system usability and extending the power of human reasoning.
  • Humint

    Humint is an abbreviation for human intelligence: intelligence gathered by people directly from people, rather than from published sources, and hence sometimes called soft information. It may be conducted face-to-face, by means of telephone or facsimile, or online (email, chat rooms, intranets, and so on).
  • Hybrid Systems

    Many of Stottler Henke’s artificial intelligence software applications use multiple AI techniques in combination. For example, case-based reasoning may be used in combination with model-based reasoning in an automatic diagnostic system. Case-based reasoning, which tends to be less expensive to develop and faster to run, may draw on a historical database of past equipment failures, their diagnoses, the repairs effected, and the outcomes achieved. So CBR may be used to make most failure diagnoses, with model-based reasoning held in reserve for failures not covered by past cases.
  • Hyper Text markup language (HTML)

    Hypertext Markup Language is the coding language for creating hypertext documents for use on the World Wide Web. It is very like a typesetting code, where blocks of text are surrounded by codes that indicate how they should appear. In addition, HTML allows one to specify a block of text, or a word, that is linked to another file on the Internet.
  • Image recognition

    the process of identifying or detecting an object or feature of an object in an image or video
  • Indexing

    Indexing provides a means of labeling documents using freely selected keywords or phrases (natural language) or authorized descriptors from a taxonomy or thesaurus (controlled vocabulary), or any combination of those, together with some means of indicating a document’s location in the system.
  • Inference Engine

    The part of an expert system responsible for drawing new conclusions from the current data and rules. The inference engine is a portion of the reusable part of an expert system (along with the user interface, a knowledge base editor, and an explanation system), that will work with different sets of case-specific data and knowledge bases.
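    The draw-new-conclusions loop can be sketched as forward chaining over a toy rule set (the rules and facts below are invented for illustration):

```python
# A minimal forward-chaining inference engine: each rule is a pair
# (set of premises, conclusion); facts grow until no rule fires.
rules = [
    ({"has fever", "has cough"}, "may have flu"),
    ({"may have flu", "is high risk"}, "see doctor"),
]

def infer(facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)    # a rule fired: record the conclusion
                changed = True
    return facts
```

    Because the rule set and the case-specific facts are separate, the same loop works unchanged with a different knowledge base, which is the point of the reusable inference engine.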
  • Information

    Information consists of data arranged in some sort of order (for instance, by classification or rational presentation) so that they acquire meaning or reveal associations between data items. Information may also be defined as a physical surrogate of knowledge (language for instance) used for communication.
  • Information Filtering

    An information filtering system sorts through large volumes of dynamically generated information to present to the user those nuggets of information which are likely to satisfy his or her immediate needs. Information filtering overlaps the older field of information retrieval, which also deals with the selection of information. Many of the features of information retrieval system design (e.g. representation, similarity measures or boolean selection, document space visualization) are present in information filtering systems as well.
  • Information management

    Information management is the means by which an organization maximizes the efficiency with which it plans, collects, organizes, uses, controls, stores, disseminates, and disposes of its information, and through which it ensures that the value of that information is identified and exploited to the maximum extent possible. The aim has often been described as getting the right information to the right person, in the right format and medium, at the right time.
  • Information Overload

    Information overload refers to the existence of, and ease of access to, bewildering amounts of information, more than can be effectively absorbed or processed by an individual. It often results in an obsessive addiction to new information in an attempt to clarify matters. This may induce a continual state of distraction which leads to loss of productivity and interrupts social activities. It is also known as Information fatigue syndrome and, more colloquially, as Infoglut or Datasmog.
  • Information Retrieval

    Information retrieval involves the identification, location, and collection of specific documents, information contained within those documents, or metadata describing those documents, from any suitable source.
  • Information System

    Information system refers to the application and software that perform business functions or support key processes. Performance criteria concern the quality and functionality of the software, its flexibility, and the speed and cost of development and maintenance.
  • Intellectual Capital

    Intellectual capital refers to the total knowledge within an organization that may be converted into value, or used to produce a higher value asset. The term embodies the knowledge and expertise of employees; brands; customer information and relationships; contracts; internal processes, methods, and technologies; and intellectual property. It equates, very approximately, to the difference between the book value and the market value of a company. Intellectual capital is also referred to as intellectual assets.
  • Intelligence

    Intelligence is high-level, processed, exploitable information.
  • Intelligence audit

    An Intelligence audit is an examination of an organization’s current level of intelligence activities with the objective of improving those operations in order to gain, and maintain, a significant competitive advantage. It involves identifying those people engaged in intelligence or related operations, together with their levels of expertise; locating collections of information, as well as other relevant resources, concerning the organization’s business environment; and establishing a set of key intelligence topics.
  • Intelligent Entities

    An intelligent entity is one that exhibits a significant degree of intelligence: an ability to reason, make plans, carry out plans, acquire knowledge, learn from its environment, manipulate its environment, and interact with other entities within its environment to some extent.
  • Intelligent Tutoring Systems

    Intelligent tutoring systems encode and apply the subject matter and teaching expertise of experienced instructors, using artificial intelligence (AI) software technologies and cognitive psychology models, to provide the benefits of one-on-one instruction automatically and cost-effectively. These systems provide coaching and hinting, evaluate each student’s performance, assess the student’s knowledge and skills, provide instructional feedback, and select appropriate next exercises for the student. See Stottler Henke case studies.
  • Invisible web

    Invisible web is that portion (estimated to be between 60 and 80 per cent) of total web content that consists of material that is not accessible by standard search engines. It is usually to be found embedded within secure sites, or consists of archived material. Much of the information may, however, be accessed through a library gateway, a Vortal, or a fee-based database service.
  • KAPPA

    Rule-based object-oriented expert system tool and application developer (IntelliCorp Inc.). KAPPA is written in C, and is available for PCs. See AI Languages and Tools.
  • Key intelligence topics (KITs)

    Key intelligence topics are those topics identified as being of greatest significance to an organization’s senior executives, and which provide purpose and direction for Competitive Intelligence operations. Key intelligence topics are invariably derived from a series of interviews. They are then grouped into appropriate categories and allocated a priority, usually by the same, or a representative, group of people.
  • Knowledge

    Knowledge is a blend of experience, values, information in context, and insight that forms a basis on which to build new experiences and information, or to achieve specific goals. It refers to the process of comprehending, comparing, judging, remembering, and reasoning. Knowledge is data that has been organized (by classification and rational presentation), synthesized (by selection, analysis, interpretation, adaptation, or compression), and made useful (by presenting arguments and matching needs and problems).
  • Knowledge assets

    Knowledge assets are bodies of knowledge of value to an organization, including previously unarticulated expertise and experience held by individuals. They may take the form of documents, databases, individuals, or groups of people, and include records of projects or activities, knowledge maps, links to networks or communities of practice, reports, standard operating procedures, patent specifications, licenses, copyright material, taxonomies, glossaries of terms, and so on.
  • Knowledge Elicitation

    A synonym for knowledge acquisition.
  • Knowledge Engineering

    Knowledge engineering is the process of collecting knowledge from human experts in a form suitable for designing and implementing an expert system. The person conducting knowledge engineering is called a knowledge engineer.
  • Knowledge Management

    Knowledge management (KM) is the process of capturing, developing, sharing, and effectively using organisational knowledge. It refers to a multi-disciplined approach to achieving organizational objectives by making the best use of knowledge. It includes courses taught in the fields of business administration, information systems, management, and library and information sciences. More recently, other fields have started contributing to KM research; these include information and media, computer science, public health, and public policy.
  • Knowledge Management (KM)

    Knowledge Management is an integrated, systematic process for identifying, collecting, storing, retrieving, and transforming information and knowledge assets into knowledge that is readily accessible in order to improve the performance of the organisation. The basic tenets of Knowledge Management are to enhance decision making, foster innovation, build relationships, establish trust, share information, and improve learning. The means for doing so might include apprenticeship schemes and mentoring programmes.
  • Knowledge Management System

    A Knowledge Management System is a process and procedure for enabling Knowledge Management. It usually incorporates a search engine, data-mining facilities, and (since knowledge is primarily embodied in people) an expertise directory or location service (known as a Knowledge map). Content may include profiles of key people, industry trends, market surveys, descriptions of current and proposed projects or activities, solutions to past problems, and discussion group facilities.
  • Knowledge map

    A Knowledge map may take either of two forms, or combine aspects of both: a directory pointing to people who hold particular expertise (see Knowledge Management System), or a visual representation of an organization’s knowledge assets and the relationships between them.
  • Knowledge Representation

    Knowledge representation is one of the two basic techniques of artificial intelligence; the other is the capability to search for end points from a starting point. The way in which knowledge is represented has a powerful effect on the prospects for a computer or person to draw conclusions or make inferences from that knowledge. Consider the representation of numbers that we wish to add: which is easier, adding 10 + 50 in Arabic numerals, or adding X plus L in Roman numerals?
  • Knowledge-based Planning

    Knowledge-based planning represents the planner’s incomplete knowledge state and the domain actions. Actions are modeled in terms of how they modify the knowledge state of the planner rather than in terms of how they modify the physical world. This approach scales better and supports features that make it applicable to much richer domains and problems. Knowledge-rich approaches, such as hierarchical task network planning, have the advantages of scalability, expressiveness, and continuous plan modification during execution.
  • Knowledge-based Representations

    The form or structure of databases and knowledge bases for expert and other intelligent systems, so that the information and solutions provided by a system are both accurate and complete. Usually involves a logically-based language capable of both syntactic and semantic representation of time, events, actions, processes, and entities. Knowledge representation languages include Lisp, Prolog, Smalltalk, OPS-5, and KL-ONE. Structures include rules, scripts, frames, endorsements, and semantic networks.
  • Knowledge-based Systems

    Usually a synonym for expert system, though some think of expert systems as knowledge-based systems that are designed to work on practical, real-world problems.
  • Limited memory

    systems with short-term memory limited to a given timeframe
  • LISP

    LISP (short for list processing language), a computer language, was invented by John McCarthy, one of the pioneers of artificial intelligence. The language is ideal for representing knowledge (e.g., If a fire alarm is ringing, then there is a fire) from which inferences are to be drawn.
  • Machine intelligence

    An umbrella term that encompasses machine learning, deep learning, and classical learning algorithms.
  • Machine learning (ML)

    focuses on developing programs that access and use data on their own, leading machines to learn for themselves and improve from learned experiences
  • Machine learning

    Machine learning refers to the ability of computers to automatically acquire new knowledge, learning from, for example, past cases or experience, from the computer’s own experiences, or from exploration. Machine learning has many uses, such as finding rules. Synonyms: learning, automatic learning.
  • Machine perception

    The ability of a system to receive and interpret data from the outside world in much the way humans use their senses. This is typically done with attached hardware, though software is also usable.
  • Machine translation

    an application of NLP used for language translation (human-to-human) in text- and speech-based conversations
  • Market Intelligence (MI)

    Market Intelligence concerns the attitudes, opinions, behavior, and needs of individuals and organizations within the context of their economic, environmental, social, and everyday activities. The emphasis is on consumers and the marketing mix: product, price, place, and promotion.
  • Marketing research

    Marketing research is the study of methods of selling and promoting a product or service; or gathering information that will support a marketing campaign (such as qualitative and quantitative data concerning customer preferences and behavior).
  • Metadata

    Metadata is information (in the form of a metatag) that describes an Internet document and facilitates its retrieval. It is very similar to a bibliographic reference, but – where present – is often more extensive, and may include author, title, affiliation, sponsor, abstract, keywords, language, publisher, date published, contact details, classification scheme, and so on; probably the most useful being keywords.
  • Model-based Reasoning

    Model-based reasoning (MBR) concentrates on reasoning about a system’s behavior from an explicit model of the mechanisms underlying that behavior. Model-based techniques can very succinctly represent knowledge more completely and at a greater level of detail than techniques that encode experience, because they employ models that are compact axiomatic systems from which large amounts of information can be deduced.
  • Natural Language Processing

    English is an example of a natural language; a computer language is not. For a computer to process a natural language, it would have to mimic what a human does. That is, the computer would have to recognize the sequence of words spoken by a person or another computer, understand the syntax or grammar of the words (i.e., do a syntactical analysis), and then extract the meaning of the words. A limited amount of meaning can be derived from a sequence of words taken out of context (i.e., by semantic analysis).
  • Natural language processing (NLP)

    helps computers process, interpret, and analyze human language and its characteristics by using natural language data
  • Neural Networks

    Neural networks are an approach to machine learning which developed out of attempts to model the processing that occurs within the neurons of the brain. By using simple processing units (neurons), organized in a layered and highly parallel architecture, it is possible to perform arbitrarily complex calculations. Learning is achieved through repeated minor modifications to selected neurons, which results in a very powerful classification system. A problem with neural networks is that it is very difficult to understand how they arrive at their results.
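    The idea of learning through repeated minor weight modifications can be sketched with the simplest possible case: a single artificial neuron trained by the perceptron rule. The data and names below are invented for illustration only.

```python
# Minimal sketch: one artificial neuron learned by repeated small
# weight adjustments (the perceptron rule). Illustrative names only.

def step(x):
    return 1 if x >= 0 else 0

def train_neuron(samples, epochs=20, lr=0.1):
    """samples: list of ((x1, x2), label). Returns (w1, w2, bias)."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            pred = step(w1 * x1 + w2 * x2 + b)
            err = label - pred           # small corrective signal
            w1 += lr * err * x1
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

# Learn logical AND, a small linearly separable classification task.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_neuron(data)
predictions = [step(w1 * x1 + w2 * x2 + b) for (x1, x2), _ in data]
```

    Real networks stack many such units in layers and use gradient-based updates, but the principle of incremental weight adjustment is the same.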
  • Object-oriented programming

    An object-oriented problem-solving approach is very similar to the way a human solves problems. It consists of identifying objects and the correct sequence in which to use these objects to solve the problem. In other words, object-oriented problem solving consists of designing objects whose individual behaviors and interactions solve a specific problem. Interactions between objects take place through the exchange of messages, where a message to an object causes it to perform its operations and solve its part of the problem.
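    A minimal sketch of objects cooperating through message passing; the class and method names are invented for illustration.

```python
# Sketch of object-oriented message passing: each object handles the
# messages (method calls) sent to it and solves its part of a problem.

class Account:
    def __init__(self, owner, balance=0):
        self.owner = owner
        self.balance = balance

    def deposit(self, amount):          # a "message" the object responds to
        self.balance += amount

    def transfer_to(self, other, amount):
        """Cooperate with another object by sending it a message."""
        self.balance -= amount
        other.deposit(amount)

a = Account("Alice", 100)
b = Account("Bob")
a.transfer_to(b, 30)                    # objects interact via messages
```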
  • Ontology

    A formal ontology is a rigorous specification of a set of specialized vocabulary terms and their relationships sufficient to describe and reason about the range of situations of interest in some domain.
  • Open source information

    Open source information is unclassified published information. It includes non-proprietary grey literature as well as information published electronically (on the Internet, for example).
  • Optical Character Recognition (OCR)

    the electronic or mechanical conversion of images of text (typed, handwritten, or printed) into machine-encoded text
  • Pattern recognition

    automated recognition of patterns found in data
  • Plan Recognition

    The goal of plan recognition is to interpret an agent’s intentions by ascribing goals and plans to it based on partial observation of its behavior up to the current time. Divining the agent’s underlying plan can be useful for many purposes.
  • Planning and Scheduling

    Planning is the field of AI that deals with the synthesis of plans, which are partial orders of (possibly conditional) actions to meet specified goals under specified constraints. It is related to scheduling, which is the task of determining when and with what resources to carry out each member of a specific set of actions to satisfy constraints regarding ordering, effectiveness, and resource allocation. In 1991, SHAI developed the concept of intelligent entities for planning and scheduling applications.
  • Planning and Scheduling Agents

    Multiagent planning is concerned with planning by (and for) multiple agents. It can involve agents planning for a common goal, an agent coordinating the plans (plan merging) or planning of others, or agents refining their own plans while negotiating over tasks or resources. The topic also involves how agents can do this in real time while executing plans (distributed continual planning). Multiagent scheduling differs from multiagent planning in the same way that scheduling differs from planning.
  • Portal

    A portal is a web site that acts as a gateway to the Internet by providing a broad and diverse range of services, including directories, search engines, links, email, reference tools, forums or chat facilities, access to online shopping and banking, games, entertainment, and so on.
  • Programming by Demonstration

    Programming by demonstration (PBD) is a term that describes a variety of end-user programming techniques that generate code from examples provided by the user. The motivation behind programming by demonstration is simple and compelling: if a user knows how to perform a task on the computer, that alone should be sufficient to create a program to perform the task. It should not be necessary to learn a programming language like C++ or BASIC.
  • Prototyping

    Prototyping is an important step in the development of a practical artificial intelligence application. An AI software prototype is usually a working piece of software that performs a limited set of the functions that the software designer envisages will be required by the user. It is used to convey to the users a clear picture of what is being developed, to ensure that the software will serve the intended purpose. An AI prototype, contrary to the practice with many other sorts of prototypes, is grown into the final system rather than being discarded.
  • Python

    Python is a high-level programming language designed to be easy to read and simple to implement. It is open source, which means it is free to use, even for commercial applications. Python can run on Mac, Windows, and Unix systems and has also been ported to Java and .NET virtual machines.
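    A short fragment illustrating Python's readable, high-level style, using only the standard library (the sentence is invented for illustration):

```python
# Count word frequencies in a sentence with the standard library.
from collections import Counter

sentence = "the quick brown fox jumps over the lazy dog the end"
counts = Counter(sentence.split())
most_common_word, frequency = counts.most_common(1)[0]
```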
  • Qualitative Reasoning

    Inexact reasoning, the opposite of quantitative reasoning. Also see Commonsense Reasoning.
  • Reactive machines

    can analyze, perceive, and make predictions about experiences, but do not store data; they react to situations and act based on the given moment
  • Recurrent neural network (RNN)

    a type of neural network that makes sense of and creates outputs based on sequential information and pattern recognition
  • Reinforcement learning

    a machine learning method where the reinforcement algorithm learns by interacting with its environment, and is then penalized or rewarded based on the decisions it makes
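    The reward-and-penalty loop can be sketched with tabular Q-learning on a toy one-dimensional corridor; the environment, parameter values, and names are invented for illustration.

```python
# Hedged sketch: tabular Q-learning on a toy 1-D corridor. The agent is
# rewarded for reaching the rightmost cell; everything here is illustrative.
import random

N_STATES = 5          # cells 0..4; the reward lives at cell 4
ACTIONS = (-1, +1)    # move left or right

def train(episodes=200, alpha=0.5, gamma=0.9, epsilon=0.2, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        state = 0
        while state != N_STATES - 1:
            if rng.random() < epsilon:             # explore
                action = rng.choice(ACTIONS)
            else:                                  # exploit current estimates
                action = max(ACTIONS, key=lambda a: q[(state, a)])
            nxt = min(max(state + action, 0), N_STATES - 1)
            reward = 1.0 if nxt == N_STATES - 1 else 0.0
            best_next = max(q[(nxt, a)] for a in ACTIONS)
            # reward/penalty feedback updates the action-value estimate
            q[(state, action)] += alpha * (reward + gamma * best_next
                                           - q[(state, action)])
            state = nxt
    return q

q = train()
# Greedy policy learned purely from interaction: move right everywhere.
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
```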
  • Robotic process automation (RPA)

    uses software with artificial intelligence and machine learning capabilities to perform repetitive tasks once completed by humans
  • Robotics

    focused on the design and manufacturing of robots that exhibit and/or replicate human intelligence and actions
  • Rule-based System

    An expert system based on IF-THEN rules for representing knowledge.
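    A minimal sketch of forward chaining over IF-THEN rules, reusing the fire-alarm rule mentioned under LISP; the rule base itself is invented for illustration.

```python
# Minimal forward-chaining sketch over an illustrative IF-THEN rule base.

RULES = [
    ({"fire alarm is ringing"}, "there is a fire"),
    ({"there is a fire"}, "evacuate the building"),
]

def forward_chain(facts):
    """Repeatedly fire rules whose IF-parts are satisfied by known facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)     # the THEN-part becomes a new fact
                changed = True
    return facts

derived = forward_chain({"fire alarm is ringing"})
```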
  • Search engines

    Search engines are microprocessor-driven software programs capable of successfully retrieving information from computer networks or databases in order to match the needs of searchers. They automatically index keywords in context, usually by using robots, then search those indexes for keywords that match the user’s request. Generally speaking, they are more suitable than directories for conducting research. Current developments may incorporate visualization techniques.
  • Semantic networks

    Semantic networks represent knowledge in the form of concepts (known as nodes) and links (that indicate the relationships between concepts). A concept is an abstract class or set consisting of items or things that share common features or properties.
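    A sketch of a semantic network as (node, relation, node) links, with a simple transitive walk up the is-a links; the concepts are toy examples chosen for illustration.

```python
# Sketch: a semantic network as (node, relation, node) links.

LINKS = [
    ("canary", "is-a", "bird"),
    ("bird", "is-a", "animal"),
    ("bird", "has-part", "wings"),
    ("canary", "color", "yellow"),
]

def is_a(node, concept):
    """Follow is-a links transitively from node toward concept."""
    if node == concept:
        return True
    return any(is_a(dst, concept)
               for src, rel, dst in LINKS
               if src == node and rel == "is-a")

result = is_a("canary", "animal")   # inherited through the "bird" node
```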
  • Server

    Server is a computer, or software package, that provides a specific service to client software running on other computers. A single server machine may have several different server packages, thus providing many different services to clients on the network.
  • Signal Filtering

    Signal filtering is a technique for removing noise or static from a signal so that the clear, underlying signal remains. It is a conventional technique commonly used by electrical engineers and others.
  • Simulated Annealing

    Simulated annealing is an optimization method based on an analogy with the physical process of toughening alloys, such as steel, called annealing. Annealing involves heating an alloy and cooling it slowly to increase its toughness. In simulated annealing, an artificial “temperature” is used to control the optimization process of finding the overall maximum or minimum of a function. As cooling a metal slowly allows the atoms time to move to the optimum positions for toughness, lowering the artificial temperature slowly gives the algorithm time to search for a solution near the global optimum.
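    A hedged sketch of the method minimizing a simple one-dimensional function; the cooling schedule and parameter values are arbitrary choices for illustration.

```python
# Sketch of simulated annealing minimizing f(x) = (x - 3)**2.
# The artificial "temperature" controls how often uphill moves are accepted.
import math
import random

def anneal(f, x0, temp=10.0, cooling=0.95, steps=500, seed=1):
    rng = random.Random(seed)
    x, best = x0, x0
    for _ in range(steps):
        candidate = x + rng.uniform(-1, 1)        # small random move
        delta = f(candidate) - f(x)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the temperature cools.
        if delta < 0 or rng.random() < math.exp(-delta / temp):
            x = candidate
        if f(x) < f(best):
            best = x
        temp *= cooling                           # slow cooling schedule
    return best

best = anneal(lambda x: (x - 3) ** 2, x0=-5.0)
```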
  • Simulation

    A simulation is a system that is constructed to work, in some ways, analogously to another system of interest. The constructed system is usually made simpler than the original system so that only the aspects of interest are mirrored. Simulations are commonly used to learn more about the behavior of the original system when the original system is not available for manipulation. It may not be available because of cost or safety reasons, or because it has not been built yet.
  • Social media

    Social media is a combination of sociology and information technology that allows people to publish their own content and to establish business or personal relationships.
  • Statistical Learning

    Statistical learning techniques attempt to construct statistical models of an entity based on surface features drawn from a large corpus of examples. These techniques generally operate independently of specific domain knowledge, training instead on a set of features that characterize an input example. In the domain of natural language, for example, statistics of language usage (e.g., word trigram frequencies) are compiled from large collections of input documents and are used to categorize or make predictions.
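    The word-trigram example can be sketched directly; the tiny corpus below is invented for illustration.

```python
# Sketch: compile word-trigram frequencies from a toy "corpus" and use
# them to score which of two word sequences is more likely.
from collections import Counter

corpus = ("the cat sat on the mat . the cat sat on the chair . "
          "the dog sat on the mat .").split()

# Count every consecutive triple of words in the corpus.
trigrams = Counter(zip(corpus, corpus[1:], corpus[2:]))

def score(words):
    """Sum of trigram counts observed in the training corpus."""
    return sum(trigrams[t] for t in zip(words, words[1:], words[2:]))

likely = score("the cat sat".split())
unlikely = score("sat the cat".split())
```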
  • Strategy

    Strategy is the timely adoption of courses of action and the allocation of resources necessary for carrying out the basic long-term goals and objectives of an enterprise, with the emphasis on achieving something different or unique. Strategy is the calculation and co-ordination of ways and means to achieve ends. An organization’s strategy may be represented visually by a strategy map, a powerful communication tool. Strategy formulation involves the right brain, calling for creativity.
  • Strong AI

    see artificial general intelligence (AGI)
  • Structural Pattern Recognition

    Structural pattern recognition is a form of pattern recognition in which each object can be represented by a variable-cardinality set of symbolic, nominal features. This allows for representing pattern structures, taking into account more complex interrelationships between attributes than is possible with the flat, numerical feature vectors of fixed dimensionality used in statistical classification. One way to represent such structure is by means of a string of symbols from a formal language.
  • Structured data

    clearly defined data with easily searchable patterns
  • Supervised learning

    a type of machine learning where output datasets teach machines to generate desired outcomes or algorithms (akin to a teacher-student relationship)
  • Swarm behavior

    From the perspective of the mathematical modeler, it is an emergent behavior arising from simple rules that are followed by individuals and does not involve any central coordination.
  • SWOT analysis

    A SWOT analysis is the evaluation of available information concerning the business environment in order to identify internal strengths and weaknesses, and external threats and opportunities. SWOT analysis is also known as Situational analysis and, when applied to competitors, as Competitor profiling.
  • Synonym

    mixed initiative planning.
  • Tacit knowledge

    Tacit knowledge is the product of interaction between people, or between people and their environment.
  • Tactical Diagrams

    Tactical diagrams show actions that increase, reduce, or maintain specific levels related to the objectives. Tactics can be viewed as more concrete strategies of smaller scope and greater specificity.
  • Task Transition Diagrams

    The state of an activity instance changes when a significant step in the execution of the activity instance occurs. The states and the state transitions depend on the type of activity, so they are important in the life cycle of basic activities. In contrast to the state diagrams for process instances, activity end states are not explicitly exposed. The life cycle of an activity depends on the enclosing process, and activities are always deleted with the process instance.
  • Taxonomy

    A taxonomy, in its original form, refers to the science of the classification of living and extinct organisms. In modern parlance, it applies to any system or software designed to organize information or knowledge so that it may be more easily stored, maintained, and retrieved. It usually reflects the language and culture of a specific enterprise or industry and acts as the authority for identifying documents and the content of knowledge maps. A taxonomy is often created by reference to several thesauri.
  • Time Series Analysis

    A time series is a sequence of observations of a particular variable over time (e.g., the daily closing level of Dow Jones Industrial Average). There are a wide range of statistical and temporal data mining techniques for analyzing such data. Two common uses for this type of analysis are forecasting future events (i.e., time series prediction) and searching a database of previous patterns for sequences that are similar to a particular pattern of interest. This is a conventional statistical technique.
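    Both common uses can be sketched on toy data: a simple moving-average forecast, and a search for the historical window most similar to a query pattern. All numbers are invented for illustration.

```python
# Sketch: two common time-series tasks on toy data.

series = [10, 12, 11, 13, 15, 14, 16, 18, 17, 19]

def moving_average_forecast(data, window=3):
    """Forecast the next value as the mean of the last `window` points."""
    return sum(data[-window:]) / window

def most_similar_window(data, query):
    """Index of the past window closest to `query` (sum of squared error)."""
    n = len(query)
    def sse(i):
        return sum((data[i + j] - query[j]) ** 2 for j in range(n))
    return min(range(len(data) - n + 1), key=sse)

forecast = moving_average_forecast(series)          # mean of 18, 17, 19
match_at = most_similar_window(series[:-3], [13, 15, 14])
```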
  • Topic maps

    Topic maps are designed to facilitate the organisation and navigation of large information collections through the use of an open (non-controlled) vocabulary using topics, associations, and occurrences. A topic may represent any concept, including subject, person, place, organization, and event. Associations represent the relationships between those concepts; and occurrences represent relevant information resources. Although sometimes used when referring to an ontology, taxonomy, or thesaurus, a topic map may, in fact, combine aspects of all three.
  • Toy System

    Small-scale implementation of a concept or model useful for testing a few main features, but unsuitable for complex or real-world problems. For example, a toy rule-based system may contain a few rules to construct an arch out of a number of pre-selected wooden blocks. It is a useful academic approach to unsolved problems. It is not employed in producing practical, real-world solutions.
  • Transfer learning

    a system that uses previously learned data and applies it to a new set of tasks
  • Truncate

    Truncate means to shorten a word by omitting letters from the end; when used as a search term, truncation effectively broadens the scope of the search. For example, Defen* would retrieve all words beginning with the chosen letters, such as defend, defense, and defensive.
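    A sketch of truncation matching against a small invented term list (real search engines do this against an inverted index):

```python
# Sketch: truncation search -- "defen*" matches every term that begins
# with the truncated stem. The term list is illustrative.

terms = ["defend", "defense", "defensive", "defer", "democracy"]

def truncation_search(terms, pattern):
    """Match terms against a pattern ending in the * wildcard."""
    stem = pattern.rstrip("*").lower()
    return [term for term in terms if term.lower().startswith(stem)]

hits = truncation_search(terms, "defen*")
```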
  • Truth Maintenance Systems

    Many conventional reasoning systems assume that reasoning is the process of deriving new knowledge from old, i.e., the number of things a person or intelligent software believes increases without retracting any existing knowledge, since known truths never change under this form of logic. This is called monotonic logic. However, this view does not accurately capture the way in which humans think, since our actions constantly change what we believe to be true. Humans reason nonmonotonically, which means they revise or retract beliefs as new information arrives. A truth maintenance system supports this style of reasoning by recording the justification for each belief, so that beliefs can be withdrawn consistently when their justifications no longer hold.
  • Turing Test

    a test created by computer scientist Alan Turing (1950) to see if machines could exhibit intelligence equal to or indistinguishable from that of a human
  • Unstructured data

    data without easily searchable patterns (e.g. audio, video, social media content)
  • Unstructured information

    Unstructured information refers to the content of any document that has no defined or standard structure such as would allow for its convenient storage and retrieval. Examples include blogs, emails, images, audio and video files, and wikis.
  • Unsupervised learning

    a type of machine learning where an algorithm is trained with information that is neither classified nor labeled, thus allowing the algorithm to act without guidance (or supervision)
  • Weak AI

    see artificial narrow intelligence (ANI)
  • What are the differences between Tacit and Explicit knowledge?

  • What is Big Data?

    Big data is extremely large data sets that may be analyzed computationally to reveal patterns, trends, and associations, especially relating to human behavior and interactions.
  • What is Explicit Knowledge?

    Explicit knowledge consists of anything that can be codified, or expressed in words, numbers, and other symbols
  • Digital Experience Platform

  • What is a Digital Experience Platform?

  • Validation

    FDA Validation
  • private exchange


  • private industrial networks


  • process specifications


  • processing


  • processing controls


  • procurement


  • product differentiation


  • production


  • production or service workers


  • profiling


  • profitability index


  • program


  • program-data dependence


  • programmers


  • programming


  • protocol


  • prototype


  • prototyping


  • public key infrastructure


  • pull-based model


  • pure-play


  • push technology


  • push-based model


  • query language


  • radio-frequency identification (RFID)


  • RAID (Redundant Array of Inexpensive Disks)


  • RAM (Random Access Memory)


  • Rapid Application Development (RAD)


  • rational model


  • rationalization of procedures


  • reach


  • real options pricing models


  • Real-Time Analytics


  • record


  • recovery-oriented computing


  • Reduced Instruction Set Computing (RISC)


  • reintermediation


  • Relational Database Management System (RDBMS)


  • relational DBMS


  • Repetitive Stress Injury (RSI)


  • Reporting


  • Repository


  • Request for Proposal (RFP)


  • resource allocation


  • responsibility


  • reverse logistics


  • richness


  • ring network


  • risk assessment


  • Risk Aversion Principle


  • ROM (Read-Only Memory)


  • router


  • rule base


  • safe harbor


  • Sales and marketing information systems


  • satellite


  • Scalability


  • scalability


  • Schema


  • scoring model


  • search costs


  • search engine


  • search-based advertising


  • secondary storage


  • security


  • selection construct


  • Self-Service


  • Semantic web


  • semistructured knowledge


  • semistructured knowledge system


  • senior managers


  • sensitivity analysis


  • sequence construct


  • server


  • server farm


  • Service Level Agreement (SLA)


  • service platform


  • shopping bot


  • six sigma


  • Slice And Dice


  • smart card


  • smart phone


  • Snapshot


  • SOAP (Simple Object Access Protocol)


  • social engineering


  • sociotechnical design


  • Software as a Service (SaaS)


  • software metrics


  • software package


  • source code


  • spam


  • spreadsheet


  • spyware


  • Standard Operating Procedures (SOPs)


  • star network


  • Storage Area Network (SAN)


  • Storage Service Provider (SSP)


  • storage technology


  • stored value payment systems


  • strategic decision making


  • strategic information systems


  • strategic transitions


  • strategic-level systems


  • streaming technology


  • structure chart


  • structured


  • structured analysis


  • structured decisions


  • structured design


  • structured knowledge


  • structured knowledge system


  • structured programming


  • Structured Query Language (SQL)


  • Structured Query Language (SQL)


  • subschema


  • supercomputer


  • supply chain


  • supply chain execution systems


  • supply chain management


  • supply chain management systems


  • supply chain planning systems


  • support activities


  • switched lines


  • switching costs


  • syndicator


  • system failure


  • system software


  • system testing


  • systematic decision makers


  • systems analysis


  • systems analysts


  • systems design


  • systems development


  • systems lifecycle


  • T1 line


  • tangible benefits


  • taxonomy


  • teamware


  • technostress


  • telecommunications system


  • teleconferencing


  • Telnet


  • test plan


  • testing


  • topology


  • Total Cost of Ownership (TCO)


  • Total Quality Management (TQM)


  • touch point


  • trade secret


  • transaction cost theory


  • Transaction Processing Systems (TPS)


  • transborder data flow


  • Transmission Control Protocol/Internet Protocol (TCP/IP)


  • transnational


  • Trojan horse


  • tuple


  • twisted wire


  • UDDI (Universal Description, Discovery, and Integration)


  • unified messaging


  • Unified Modeling Language (UML)


  • Uniform Resource Locator (URL)


  • unit testing


  • UNIX


  • unstructured decisions


  • up-selling


  • Usenet


  • user interface


  • user-designer communications gap


  • Utilitarian Principle


  • utility computing


  • value chain model


  • value web


  • Value-Added Network (VAN)


  • videoconferencing


  • virtual organization


  • Virtual Private Network (VPN)


  • Virtual Reality Modeling Language (VRML)


  • virtual reality systems


  • Visual Basic


  • visual programming


  • voice mail


  • Voice over IP (VoIP)


  • walkthrough


  • Web browser


  • Web bugs


  • Web content management tools


  • Web hosting service


  • Web personalization


  • Web server


  • Web services


  • Web site


  • Web site performance monitoring tools


  • Webmaster


  • Wide Area Network (WAN)


  • Wi-Fi


  • Windows 2000


  • Windows 2003


  • Windows 98


  • Windows XP


  • Wireless Application Protocol (WAP)


  • wireless NIC


  • wisdom


  • WML (Wireless Markup Language)


  • word processing software


  • work-flow management


  • workstation


  • World Wide Web


  • worms


  • WSDL(Web Services Description Language)


  • XHTML (Extensible Hypertext Markup Language)


  • XML (eXtensible Markup Language)


  • access control


  • Access Path

    The track chosen by a database management system to collect data requested by the end-user.
  • access point


  • accountability


  • accounting rate of return on investment (ROI )


  • accumulated balance digital payment systems


  • activity-based costing


  • administrative controls


  • Administrative Data


  • Advanced Analytics


  • agency theory


  • Aggregate Data


  • AI shell


  • analog signal


  • analytical CRM


  • Analytics


  • antivirus software


  • application controls


  • application server


  • application service provider (ASP)