Workflow is a generic term for orchestrated and repeatable patterns of activity, enabled by the systematic organization of resources into processes that transform materials, provide services, or process information. [1] It can be depicted as a sequence of operations, the work of a person or group, [2] the work of an organization of staff, or one or more simple or complex mechanisms.
From a more abstract or higher-level perspective, workflow may be considered a view or representation of real work. [3] The flow being described may refer to a document, service, or product that is being transferred from one step to another.
Workflows may be viewed as one fundamental building block to be combined with other parts of an organization's structure such as information technology, teams, projects and hierarchies. [4]
The development of the concept of a workflow occurred over a series of loosely defined, overlapping eras.
The modern history of workflows can be traced to Frederick Taylor [5] and Henry Gantt, although the term "workflow" was not in use as such during their lifetimes. [6] One of the earliest instances of the term "work flow" was in a railway engineering journal from 1921. [7]
Taylor and Gantt launched the study of the deliberate, rational organization of work, primarily in the context of manufacturing. This gave rise to time and motion studies. [8] Related concepts include job shops and queuing systems (Markov chains). [9] [10]
The 1948 book Cheaper by the Dozen introduced the emerging concepts to the context of family life.
The invention of the typewriter and the copier helped spread the study of the rational organization of labor from the manufacturing shop floor to the office. Filing systems and other sophisticated systems for managing physical information flows evolved. Several events likely contributed to the development of formalized information workflows. First, the field of optimization theory matured and developed mathematical optimization techniques. For example, Soviet mathematician and economist Leonid Kantorovich developed the seeds of linear programming in 1939 through efforts to solve a plywood manufacturer's production optimization issues. [11] [12] Second, World War II and the Apollo program drove process improvement forward with their demands for the rational organization of work. [13] [14] [15]
In the post-war era, the work of W. Edwards Deming and Joseph M. Juran led to a focus on quality, first in Japanese companies, and more globally from the 1980s: there were various movements ranging from total quality management to Six Sigma, and then more qualitative notions of business process re-engineering. [16] This led to more efforts to improve workflows, in knowledge economy sectors as well as in manufacturing. Variable demands on workflows were recognised when the theory of critical paths and moving bottlenecks was considered. [17]
Basu and Kumar note that the term "workflow management" has been used to refer to tasks associated with the flow of information through the value chain rather than the flow of material goods: they characterise the definition, analysis and management of information as "workflow management". They note that workflow can be managed within a single organisation, where distinct roles are allocated to individual resources, and also across multiple organisations or distributed locations, where attention needs to be paid to the interactions between activities which are located at the organizational or locational boundaries. The transmission of information from one organization to another is a critical issue in this inter-organizational context and raises the importance of tasks they describe as "validation", "verification" and "data usage analysis". [18]
A workflow management system (WfMS) is a software system for setting up, performing, and monitoring a defined sequence of processes and tasks, with the broad goals of increasing productivity, reducing costs, becoming more agile, and improving information exchange within an organization. [19] These systems may be process-centric or data-centric, and they may represent the workflow as graphical maps. A workflow management system may also include an extensible interface so that external software applications can be integrated, and may provide support for wide-area workflows that deliver faster response times and improved productivity. [19]
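The following is a minimal sketch, not tied to any particular product, of what a process-centric workflow definition with status monitoring might look like; the Task and Workflow classes and all task names are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class Task:
    name: str
    action: Callable[[dict], dict]   # transforms a shared context dictionary
    status: str = "pending"          # pending -> running -> done / failed

@dataclass
class Workflow:
    name: str
    tasks: List[Task] = field(default_factory=list)

    def run(self, context: dict) -> dict:
        # Perform the defined sequence of tasks, tracking status as we go.
        for task in self.tasks:
            task.status = "running"
            try:
                context = task.action(context)
                task.status = "done"
            except Exception:
                task.status = "failed"
                raise
        return context

    def monitor(self) -> dict:
        # The kind of status snapshot a WfMS dashboard would expose.
        return {task.name: task.status for task in self.tasks}

# Hypothetical expense-approval workflow.
wf = Workflow("expense-approval", [
    Task("validate", lambda ctx: {**ctx, "valid": ctx["amount"] > 0}),
    Task("approve",  lambda ctx: {**ctx, "approved": ctx["amount"] < 1000}),
    Task("notify",   lambda ctx: {**ctx, "notified": True}),
])
print(wf.run({"amount": 250}))
print(wf.monitor())
```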
The concept of workflow is closely related to several fields in operations research and other areas that study the nature of work, either quantitatively or qualitatively, such as artificial intelligence (in particular, the sub-discipline of AI planning) and ethnography. The term "workflow" is more commonly used in particular industries, such as in printing or professional domains such as clinical laboratories, where it may have particular specialized meanings.
The following examples illustrate the variety of workflows seen in various contexts:
Several workflow improvement theories have been proposed and implemented in the modern workplace. These include:
Evaluation of resources, both physical and human, is essential for assessing hand-off points and the potential to create smoother transitions between tasks. [30]
A workflow can usually be described using formal or informal flow diagramming techniques, showing directed flows between processing steps. Single processing steps or components of a workflow can basically be defined by three parameters: the input description, the transformation rules or algorithms, and the output description.
Components can only be plugged together if the output of one (set of) upstream component(s) matches the mandatory input requirements of the following component(s). Thus, the essential description of a component comprises only its input and output, described fully in terms of data types and their meaning (semantics). Descriptions of the algorithms or rules need only be included when there are several alternative ways to transform one type of input into one type of output – possibly with different accuracy, speed, etc.
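As an illustration of this input/output matching, the following sketch describes components purely by the data types they require and produce and checks whether two components can be plugged together; the component names and data types are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Component:
    name: str
    inputs: frozenset   # mandatory input data types (semantics abstracted as strings)
    outputs: frozenset  # produced output data types

def can_connect(upstream: Component, downstream: Component) -> bool:
    # Components plug together only if the upstream outputs cover
    # every mandatory input of the downstream component.
    return downstream.inputs <= upstream.outputs

scanner = Component("scan_invoice", frozenset({"paper_invoice"}), frozenset({"invoice_image"}))
ocr     = Component("extract_text", frozenset({"invoice_image"}), frozenset({"invoice_text"}))
archive = Component("archive",      frozenset({"invoice_text"}),  frozenset({"archived_record"}))

print(can_connect(scanner, ocr))      # True: image output matches image input
print(can_connect(scanner, archive))  # False: the scanner does not produce invoice_text
```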
When the components are non-local services that are invoked remotely via a computer network, such as Web services, additional descriptors (such as QoS and availability) also must be considered. [31]
Many software systems exist to support workflows in particular domains. Such systems manage tasks such as automatic routing, partially automated processing, and integration between different functional software applications and hardware systems that contribute to the value-addition process underlying the workflow. There are also software suppliers whose technology is based on a process-driven messaging service built upon three elements.
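As a rough illustration of automatic routing, the following sketch forwards a work item to a functional queue based on simple rules; the queue names and rules are hypothetical and not tied to any particular product.

```python
def route(item: dict) -> str:
    # Forward a work item to the queue or service responsible for it.
    if item.get("type") == "refund" and item.get("amount", 0) > 500:
        return "manager-approval-queue"
    if item.get("type") == "refund":
        return "automated-refund-service"
    return "general-support-queue"

print(route({"type": "refund", "amount": 800}))  # manager-approval-queue
print(route({"type": "refund", "amount": 20}))   # automated-refund-service
print(route({"type": "question"}))               # general-support-queue
```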
An information system (IS) is a formal, sociotechnical, organizational system designed to collect, process, store, and distribute information. From a sociotechnical perspective, information systems comprise four components: task, people, structure, and technology. Information systems can also be defined as an integration of components for the collection, storage, and processing of data, comprising digital products that process data to facilitate decision-making, with the data used to provide information and contribute to knowledge.
A business process, business method, or business function is a collection of related, structured activities or tasks performed by people or equipment in which a specific sequence produces a service or product for a particular customer or customers. Business processes occur at all organizational levels and may or may not be visible to the customers. A business process may often be visualized (modeled) as a flowchart of a sequence of activities with interleaving decision points or as a process matrix of a sequence of activities with relevance rules based on data in the process. The benefits of using business processes include improved customer satisfaction and improved agility for reacting to rapid market change. Process-oriented organizations break down the barriers of structural departments and try to avoid functional silos.
In industry, product lifecycle management (PLM) is the process of managing the entire lifecycle of a product, from its inception through engineering, design, and manufacturing to the service and disposal of the manufactured product. PLM integrates people, data, processes, and business systems and provides a product information backbone for companies and their extended enterprises.
In software testing, test automation is the use of software separate from the software being tested to control the execution of tests and the comparison of actual outcomes with predicted outcomes. Test automation can automate some repetitive but necessary tasks in a formalized testing process already in place, or perform additional testing that would be difficult to do manually. Test automation is critical for continuous delivery and continuous testing.
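A minimal illustration using Python's standard unittest module: the test code is separate from the code under test, drives its execution, and compares actual outcomes with predicted outcomes; the apply_discount function is an invented example.

```python
import unittest

# Code under test (would normally live in a separate module).
def apply_discount(price: float, percent: float) -> float:
    return round(price * (1 - percent / 100), 2)

# Automated tests: separate software that drives execution and compares
# actual outcomes with predicted outcomes, so it can run unattended in a
# continuous delivery pipeline.
class DiscountTests(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(100.0, 10), 90.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(59.99, 0), 59.99)

if __name__ == "__main__":
    unittest.main()
```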
Business process modeling (BPM), mainly used in business process management, software development, or systems engineering, is the action of capturing and representing processes of an enterprise, so that the current business processes may be analyzed, applied securely and consistently, improved, and automated. BPM is typically orchestrated by business analysts, leveraging their expertise in modeling practices. Subject matter experts, equipped with specialized knowledge of the processes being modeled, often collaborate within these teams. Alternatively, process models can be directly derived from digital traces within IT systems, such as event logs, utilizing process mining tools.
A data-flow diagram is a way of representing a flow of data through a process or a system. The DFD also provides information about the outputs and inputs of each entity and the process itself. A data-flow diagram has no control flow — there are no decision rules and no loops. Specific operations based on the data can be represented by a flowchart.
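As a small illustration, a data-flow diagram can be captured as plain data listing each flow's source and destination; note that nothing here encodes decisions or loops. The entity and process names are invented for the example.

```python
# Each entry records one flow of data from a source to a destination.
data_flows = [
    {"data": "order form",   "from": "Customer",      "to": "Process Order"},
    {"data": "order record", "from": "Process Order", "to": "Orders Store"},
    {"data": "invoice",      "from": "Process Order", "to": "Customer"},
]

# The inputs and outputs of a given process can be read straight off the
# diagram; there is no control flow, no decision rules, and no loops.
process = "Process Order"
inputs = [f["data"] for f in data_flows if f["to"] == process]
outputs = [f["data"] for f in data_flows if f["from"] == process]
print(inputs, outputs)  # ['order form'] ['order record', 'invoice']
```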
Business process re-engineering (BPR) is a business management strategy originally pioneered in the early 1990s, focusing on the analysis and design of workflows and business processes within an organization. BPR aims to help organizations fundamentally rethink how they do their work in order to improve customer service, cut operational costs, and become world-class competitors.
Enterprise content management (ECM) extends the concept of content management by adding a timeline for each content item and, possibly, enforcing processes for its creation, approval, and distribution. Systems using ECM generally provide a secure repository for managed items, analog or digital. They also include one or more methods for importing content to manage new items, and several presentation methods to make items available for use. Although ECM content may be protected by digital rights management (DRM), it is not required. ECM is distinguished from general content management by its cognizance of the processes and procedures of the enterprise for which it is created.
A functional software architecture (FSA) is an architectural model that identifies enterprise functions, interactions and corresponding IT needs. These functions can be used as a reference by different domain experts to develop IT-systems as part of a co-operative information-driven enterprise. In this way, both software engineers and enterprise architects can create an information-driven, integrated organizational environment.
Enterprise integration is a technical field of enterprise architecture, which is focused on the study of topics such as system interconnection, electronic data interchange, product data exchange and distributed computing environments.
Business analysis is a professional discipline focused on identifying business needs and determining solutions to business problems. Solutions may include a software-systems development component, process improvements, or organizational changes, and may involve extensive analysis, strategic planning and policy development. A person dedicated to carrying out these tasks within an organization is called a business analyst or BA.
Business Process Model and Notation (BPMN) is a graphical representation for specifying business processes in a business process model.
Enterprise modelling is the abstract representation, description and definition of the structure, processes, information and resources of an identifiable business, government body, or other large organization.
Business process discovery (BPD), related to business process management and process mining, is a set of techniques that manually or automatically construct a representation of an organisation's current business processes and their major process variations. These techniques use data recorded in the existing organisational methods of work, documentation, and technology systems that run business processes within an organisation. The type of data required for process discovery is called an event log: any record of data that contains a case ID, an activity name, and a timestamp qualifies as an event log and can be used to discover the underlying process model. The event log can contain additional information related to the process, such as the resources executing the activity, the type or nature of the events, or any other relevant details. Process discovery aims to obtain a process model that describes the event log as closely as possible; the process model acts as a graphical representation of the process. The event logs used for discovery may contain noise, irregular information, and inconsistent or incorrect timestamps. Process discovery is challenging because of such noisy event logs and because an event log captures only part of the actual process hidden behind the system; discovery algorithms must rely solely on this limited data to develop the closest possible model of the actual behaviour.
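The core idea can be sketched in a few lines: group the event log by case ID, order each case by timestamp, and count which activity directly follows which. Real process-mining algorithms go far beyond this, but the directly-follows counts below illustrate the kind of model a discovery step starts from; the log itself is invented.

```python
from collections import defaultdict

# An event log: each record carries a case ID, an activity name, and a timestamp.
event_log = [
    {"case": "c1", "activity": "register", "ts": 1},
    {"case": "c1", "activity": "review",   "ts": 2},
    {"case": "c1", "activity": "approve",  "ts": 3},
    {"case": "c2", "activity": "register", "ts": 1},
    {"case": "c2", "activity": "review",   "ts": 2},
    {"case": "c2", "activity": "reject",   "ts": 3},
]

# Group events into per-case traces, ordered by timestamp.
traces = defaultdict(list)
for event in sorted(event_log, key=lambda e: (e["case"], e["ts"])):
    traces[event["case"]].append(event["activity"])

# Count the directly-follows relation, a simple basis for a discovered model.
directly_follows = defaultdict(int)
for activities in traces.values():
    for a, b in zip(activities, activities[1:]):
        directly_follows[(a, b)] += 1

print(dict(directly_follows))
# {('register', 'review'): 2, ('review', 'approve'): 1, ('review', 'reject'): 1}
```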
Kepler is a free software system for designing, executing, reusing, evolving, archiving, and sharing scientific workflows. Kepler's facilities provide process and data monitoring, provenance information, and high-speed data movement. Workflows in general, and scientific workflows in particular, are directed graphs where the nodes represent discrete computational components, and the edges represent paths along which data and results can flow between components. In Kepler, the nodes are called 'Actors' and the edges are called 'channels'. Kepler includes a graphical user interface for composing workflows in a desktop environment, a runtime engine for executing workflows within the GUI and independently from a command-line, and a distributed computing option that allows workflow tasks to be distributed among compute nodes in a computer cluster or computing grid. The Kepler system principally targets the use of a workflow metaphor for organizing computational tasks that are directed towards particular scientific analysis and modeling goals. Thus, Kepler scientific workflows generally model the flow of data from one step to another in a series of computations that achieve some scientific goal.
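The actor-and-channel structure can be illustrated independently of Kepler itself (which is Java-based, so this is not its API) as a directed graph whose topological order gives one valid execution sequence; the actor names below are hypothetical.

```python
from graphlib import TopologicalSorter  # Python 3.9+

# A workflow as a directed graph: keys are actors (computational steps) and
# values are the upstream actors whose channels feed them data.
channels = {
    "load_dataset": set(),
    "clean_data":   {"load_dataset"},
    "fit_model":    {"clean_data"},
    "plot_results": {"fit_model", "clean_data"},
}

# A topological order of the graph is one valid execution sequence.
execution_order = list(TopologicalSorter(channels).static_order())
print(execution_order)
# e.g. ['load_dataset', 'clean_data', 'fit_model', 'plot_results']
```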
In systems engineering, software engineering, and computer science, a function model or functional model is a structured representation of the functions within the modeled system or subject area.
Business process management (BPM) is the discipline in which people use various methods to discover, model, analyze, measure, improve, optimize, and automate business processes. Any combination of methods used to manage a company's business processes is BPM. Processes can be structured and repeatable or unstructured and variable. Though not required, enabling technologies are often used with BPM.
KNIME, the Konstanz Information Miner, is a free and open-source data analytics, reporting and integration platform. KNIME integrates various components for machine learning and data mining through its modular data pipelining "Building Blocks of Analytics" concept. A graphical user interface and use of JDBC allow assembly of nodes blending different data sources, including preprocessing, for modeling, data analysis and visualization without, or with minimal, programming.
In philosophy, a process ontology refers to a universal model of the structure of the world as an ordered wholeness. Such ontologies are fundamental ontologies, in contrast to the so-called applied ontologies. Fundamental ontologies do not claim to be accessible to empirical proof in themselves, but to be a structural design pattern out of which empirical phenomena can be explained and put together consistently. Throughout Western history, the dominating fundamental ontology has been the so-called substance theory. However, fundamental process ontologies have become more important in recent times, because progress in the discovery of the foundations of physics has spurred the development of a basic concept able to integrate such boundary notions as "energy", "object", and the physical dimensions of space and time.
The artifact-centric business process model is an operational model of business processes in which the changes and evolution of business data, or business entities, are considered the main driver of the processes. The artifact-centric approach, a kind of data-centric business process modeling, focuses on describing how business data is changed or updated by a particular action or task throughout the process.
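A minimal sketch of the artifact-centric idea, with invented entity and task names: the business artifact carries both its data and a lifecycle state, and each task is defined by how it changes them.

```python
from dataclasses import dataclass, field

@dataclass
class OrderArtifact:
    # The business entity carries both its data and its lifecycle state.
    order_id: str
    state: str = "created"          # created -> paid -> shipped
    data: dict = field(default_factory=dict)

def record_payment(order: OrderArtifact, amount: float) -> None:
    # A task is described by how it changes the artifact's data and state.
    order.data["paid_amount"] = amount
    order.state = "paid"

def ship(order: OrderArtifact, tracking: str) -> None:
    order.data["tracking"] = tracking
    order.state = "shipped"

order = OrderArtifact("o-1001")
record_payment(order, 49.90)
ship(order, "TRK123")
print(order.state, order.data)  # shipped {'paid_amount': 49.9, 'tracking': 'TRK123'}
```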