from DSSResources.com


What is operations analysis?

by Daniel J. Power
Editor, DSSResources.COM

Organizations develop systems and processes to perform tasks. The performance of these operations determines success or profitability, the quality of goods and services, and many other outcomes. Operations analysis (OA) is the study of operational systems with the goal of identifying opportunities to make improvements and enhance performance. OA is a systematic and scientific analysis and evaluation of problems and issues. A fundamental operations analysis task is answering the broad question “how can we perform better?” An operations analyst identifies issues, gathers data, and performs analyses. Operations analysis has been identified as a major "big data" use case.

The primary objective of operations analysis is supporting decision makers with results and recommendations they can use as a basis for action. Often it is advantageous to use a multidisciplinary operations analysis team to recognize conflicting stakeholders’ perspectives. Analysis of problems should be from a comprehensive stakeholder and systems perspective.

According to wiseGEEK.org, "An operation analysis is a procedure used to determine the efficiency of various aspects of a business operation. Most reports include a careful scrutiny of a company's production methods, material costs, equipment implementation and workplace conditions. Professional consultants are often brought in from outside a company to perform an unbiased operational analysis, which provides a company with hard data concerning waste issues and operational risks. Many companies use the information from such an analysis to decide on what changes need to be made to improve operations."

The primary focus of operations analysis is on processes. An operation is composed of processes designed to add value by transforming inputs into useful outputs. Inputs may be materials, labor, energy, and capital equipment. Outputs may be a physical product (possibly used as an input to another process) or a service. Processes can have a significant impact on the performance of a business, and process improvement can improve a firm's competitiveness.

Possible operations analysis projects and research questions might include: Do products meet advertised specifications? Do manufactured products meet engineered specifications? Are customers satisfied with service quality? Does service quality meet management expectations? Is the process efficient? Are there bottlenecks in processes? Is the unit/department productive? How do operations at one location compare with other locations? Are there facility layout bottlenecks? How reliable is the product? Is there excessive or unnecessary waste in a process?

According to Cosentino (2013), Ms. Inhi Cho Suh, IBM Vice President of Product Strategy & Information Management, identified operational analysis as one of five compelling use cases for big data analytics. Cosentino notes operational analysis "is the ability to leverage networks of instrumented data sources to enable proactive monitoring through baseline analysis and real-time feedback mechanisms. The need for better operational analytics continues to increase. Our latest research on operational intelligence finds that organizations that use dedicated tools to handle this need will be more satisfied and gain better outcomes than those that do not."

According to the IBM Big Data & Analytics Hub, "Machine data is all around us: logs, sensors, GPS devices and meters to name a few. The enormous growth of machine data has become a major driver of big data solutions and a challenge for many organizations. The complex and diverse nature of machine data leaves many organizations unable to leverage it. Yet without it, they’re making business decisions on a fraction of the information available to them. Operations analysis is about using big data technologies to enable a new generation of applications that analyze machine data and gain insight from it, which in turn improves business results." Operations analytics with big data can improve reliability with root cause analysis and speed operations by identifying bottlenecks.

"IBM Predictive Maintenance and Quality can reduce asset downtime by analyzing asset data in real time, and detecting failure patterns and poor quality parts before problems occur. (http://www.ibm.com/big-data/us/en/big-data-and-analytics/operations-management.html).

What is the Operations Analysis big data use case?

According to IBM web materials, "Operations Analysis focuses on analyzing machine data, which can include anything from IT machines to sensors, meters and GPS devices. It’s growing at exponential rates and comes in large volumes and a variety of formats, including in-motion, or streaming data. Leveraging machine data requires complex analysis and correlation across different types of data sets. By using big data for operations analysis, organizations can gain real-time visibility into operations, customer experience, transactions and behavior."

IBM cites three customer examples. Becker Underwood used predictive analytics to integrate and optimize its global supply chain, greatly improving inventory turns and forecasting accuracy. BASF acquired Ames, Iowa-based Becker Underwood in 2012. Recology (http://www.recology.com/) improved operational efficiency, transporting waste to collection facilities more quickly by mining and analyzing data faster. Infinity Property & Casualty (https://www.infinityauto.com/) identified fraudulent claims faster and substantially increased subrogation recoveries.

Process Performance Measures and Measurement

Operations managers want to assess process characteristics such as cost, quality, flexibility, and speed. Some of the process performance measures for these characteristics include:

Productivity: an average measure of the efficiency of production. It is measured as the ratio of output to inputs used in the production process. Measures include total productivity, partial factor productivity and multi-factor productivity.
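
As a minimal sketch in Python, using hypothetical cost and output figures, the total, partial-factor, and multi-factor productivity ratios described above could be computed as follows:

```python
# Hypothetical figures for one reporting period (all values in dollars).
output_value = 250_000.0     # value of goods or services produced
labor_cost = 60_000.0        # labor input
materials_cost = 90_000.0    # materials input
capital_cost = 40_000.0      # capital equipment input
energy_cost = 10_000.0       # energy input

total_inputs = labor_cost + materials_cost + capital_cost + energy_cost

total_productivity = output_value / total_inputs                        # output per dollar of all inputs
labor_productivity = output_value / labor_cost                          # partial-factor: labor only
multifactor_productivity = output_value / (labor_cost + capital_cost)   # multi-factor: labor plus capital

print(f"Total productivity:           {total_productivity:.2f}")
print(f"Labor (partial) productivity: {labor_productivity:.2f}")
print(f"Multi-factor productivity:    {multifactor_productivity:.2f}")
```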

Efficiency: a measure of the amount of time, effort, or cost used for an intended task or purpose. The state or quality of being efficient means a task is accomplished with the least use of time and effort.

Process capacity: the capacity of a process is its maximum output rate, measured in units produced per unit of time. The capacity of a series of tasks is determined by the lowest capacity task in the series. The capacity of parallel strings of tasks is "the sum of the capacities of the two strings, except for cases in which the two strings have different outputs that are combined. In such cases, the capacity of the two parallel strings of tasks is that of the lowest capacity parallel string".

Capacity utilization: the percentage of the process capacity that is actually used.
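
The capacity rules above (series and parallel tasks) and capacity utilization can be illustrated with a short Python sketch; the task capacities and observed output rate are hypothetical:

```python
# Hypothetical task capacities in units per hour.
series_task_capacities = [120, 90, 150]          # three tasks performed one after another
process_capacity = min(series_task_capacities)   # the lowest-capacity task limits the series

parallel_string_capacities = [90, 75]            # two parallel strings producing the same output
combined_capacity = sum(parallel_string_capacities)

actual_output_rate = 72                          # observed units per hour
capacity_utilization = actual_output_rate / process_capacity

print(process_capacity)                          # 90 units per hour (the bottleneck)
print(combined_capacity)                         # 165 units per hour
print(f"{capacity_utilization:.0%}")             # 80% of capacity actually used
```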

Throughput rate or flow rate: the average rate that units flow past a specific point in the process. The maximum throughput rate is the process capacity.

Throughput time or lead time: the average time that a unit flows through a process from the entry point to the exit point. The flow time is the length of the longest path through the process. Flow time includes both processing time and any time the unit spends between steps.

Cycle time: the total time from the beginning to the end of a process. It is also measured as the time between successive units as they are output from a process. Cycle time for a process is equal to the inverse of the throughput rate. Cycle time can be thought of as the time required for a task to repeat itself. The cycle time of a process is equal to the longest task cycle time. The process is said to be in balance if the cycle times are equal for each activity in the process.

Process time: the average time during which one or more inputs are transformed into a finished output. Process time is flow time less idle time.
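
The relationships among the time-based measures above (throughput rate, flow time, cycle time, and process time) can be checked with a small sketch; the numbers are hypothetical:

```python
# Hypothetical measurements for one process.
throughput_rate = 30.0                      # units per hour flowing past a point in the process
cycle_time_hours = 1.0 / throughput_rate    # cycle time is the inverse of the throughput rate
print(cycle_time_hours * 60)                # 2.0 minutes between successive units

flow_time_minutes = 45.0                    # average time a unit spends from entry to exit
idle_time_minutes = 30.0                    # time the unit spends waiting between steps
process_time_minutes = flow_time_minutes - idle_time_minutes   # process time = flow time less idle time
print(process_time_minutes)                 # 15.0 minutes of actual transformation time

# The process is in balance when every activity has the same cycle time.
activity_cycle_times = [2.0, 2.0, 2.0]      # minutes per unit at each activity
print(max(activity_cycle_times))            # process cycle time is the longest task cycle time
print(len(set(activity_cycle_times)) == 1)  # True: the process is balanced
```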

Time and motion: how a task is performed and how long it takes to perform the task. Measured by direct observation of a task, with data recorded using a timekeeping device and, in some situations, a video recording. The purpose of a time and motion study is to improve business efficiency. The study establishes standard times for tasks and recommends changes in methods and motions to improve work methods, cf. http://en.wikipedia.org/wiki/Time_and_motion_study.

Idle time: unproductive time on the part of employees or machines. It is time when no activity is occurring, for example, when an activity is waiting for input to arrive from a predecessor activity. The term can describe both machine idle time and worker idle time. Idle time may be avoidable, or it may be unavoidable because it is due to factors that cannot be controlled.

Work-in-process quantity: stocks of components or unfinished products. Generally, the amount of partially finished inventory in the process.

Set-up time: the time required to prepare the equipment to perform an activity on a batch of units. Set-up time usually does not depend strongly on the batch size and therefore can be reduced on a per unit basis by increasing the batch size.
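
A short sketch of the batch-size effect described above, with hypothetical times, shows how per-unit set-up time falls as batch size grows:

```python
# Hypothetical times in minutes.
setup_time = 60.0          # time to prepare equipment for a batch, roughly independent of batch size
run_time_per_unit = 2.0    # processing time per unit once the equipment is set up

for batch_size in (10, 50, 200):
    per_unit_time = setup_time / batch_size + run_time_per_unit
    print(f"batch of {batch_size:3d}: {per_unit_time:.2f} minutes per unit")
# Larger batches spread the same set-up time over more units.
```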

Direct labor content: the amount of labor (in units of time) actually contained in the product. Excludes idle time when workers are not working directly on the product. Also excludes time spent maintaining machines, transporting materials, etc.

Physical ergonomic data: Work and product related injuries can occur because of the design of products or equipment used in operations. Data is gathered to identify jobs or work conditions that might cause problems, using sources such as injury and illness logs, medical records, and job/task analyses.

Direct labor utilization: the fraction of labor capacity that actually is utilized as direct labor.
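
As a minimal sketch with hypothetical labor figures, direct labor content and direct labor utilization could be computed from one shift's records as follows:

```python
# Hypothetical figures for one shift (hours).
paid_labor_hours = 8.0       # total labor capacity for one worker
direct_labor_hours = 6.0     # time actually spent working directly on the product
units_produced = 24

direct_labor_content = direct_labor_hours / units_produced         # labor hours contained in each unit
direct_labor_utilization = direct_labor_hours / paid_labor_hours   # fraction of labor capacity used directly

print(f"Direct labor content:     {direct_labor_content:.2f} hours per unit")
print(f"Direct labor utilization: {direct_labor_utilization:.0%}")
```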

Defect frequency and cause analysis: data on defects may be collected from sensors or forms like check sheets. A check sheet includes a column with defect or error categories and then one or more columns in which observations for different sources of defects are recorded. There may also be columns listing suspected causes so the observer can note whether each was present. The time of each observation is also recorded. See http://en.wikipedia.org/wiki/Check_sheet.
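
A check-sheet tally like the one described above can be kept as a simple mapping from defect category to count; this sketch uses made-up observations:

```python
from collections import Counter
from datetime import datetime

# Made-up defect observations; each record is (timestamp, defect category).
observations = [
    (datetime(2015, 2, 2, 9, 15), "scratch"),
    (datetime(2015, 2, 2, 9, 40), "misaligned label"),
    (datetime(2015, 2, 2, 10, 5), "scratch"),
    (datetime(2015, 2, 2, 10, 30), "dent"),
    (datetime(2015, 2, 2, 11, 0), "scratch"),
]

defect_counts = Counter(category for _, category in observations)
for category, count in defect_counts.most_common():
    print(f"{category:20s} {count}")
```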

Pareto analysis and chart: used to highlight the most important effects or factors. "In quality control, a Pareto chart often summarizes the most common sources of defects, the highest occurring type of defect, or the most frequent reasons for customer complaints, and so on," cf. http://en.wikipedia.org/wiki/Pareto_chart. The Pareto Principle, or 80-20 Rule, is a general rule-of-thumb or guideline that says 80% of the effects stem from 20% of the causes.
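
Building on defect counts like those tallied above, a Pareto analysis sorts the categories by frequency and accumulates their share of the total to highlight the vital few; this is a hypothetical sketch, not a plotting-library call:

```python
# Hypothetical defect counts by category.
defect_counts = {"scratch": 46, "misaligned label": 21, "dent": 12, "wrong color": 8, "other": 3}

total = sum(defect_counts.values())
cumulative = 0
for category, count in sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True):
    cumulative += count
    print(f"{category:18s} {count:3d}  {count / total:6.1%}  cumulative {cumulative / total:6.1%}")
# The categories that account for the first ~80% of defects are the ones to attack first (80-20 rule).
```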

Customer satisfaction: metrics include overall satisfaction, service satisfaction, product satisfaction, loyalty, and attribute satisfaction. In many situations survey questions emphasize perceived quality, perceived reliability, the extent to which the customer’s needs are fulfilled, intent to recommend, and satisfaction with a particular product or service attribute, cf. http://www.qualtrics.com/blog/customer-satisfaction-measurement/. Example questions include: Overall, how satisfied are you with your experience at the university? How likely are you to recommend the university to a friend or family member? How satisfied are you with the courses in your major here at the university? Do you intend to return to the university in the fall?
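
As a hypothetical illustration, responses on a 1-to-5 satisfaction scale can be summarized into the kinds of metrics listed above (overall satisfaction and intent to recommend):

```python
# Hypothetical survey responses: 1 (very dissatisfied) to 5 (very satisfied).
overall_satisfaction = [5, 4, 4, 3, 5, 2, 4, 5]
would_recommend = [True, True, True, False, True, False, True, True]

avg_satisfaction = sum(overall_satisfaction) / len(overall_satisfaction)
pct_satisfied = sum(1 for s in overall_satisfaction if s >= 4) / len(overall_satisfaction)
pct_recommend = sum(would_recommend) / len(would_recommend)

print(f"Average overall satisfaction: {avg_satisfaction:.2f} / 5")
print(f"Share rating 4 or 5:          {pct_satisfied:.0%}")
print(f"Would recommend:              {pct_recommend:.0%}")
```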

Service quality assessment: customers, observers, and management can assess service quality during a service interaction using a questionnaire or a check sheet. The questions are tailored to the process and organization. Common observations in a customer-facing process like a restaurant or retail store include: Was the customer greeted? How long did tasks such as taking the order, bringing the food, and processing payment take? Were there any miscommunications or problems?

Waste: the quantity of scrap and rejected output from a process. A measure of unwanted or unusable materials. Also, waste is any step or action in a process that is not required to complete a process successfully (called a “non value-adding” action).

Operations analysis uses data to assess and improve processes and systems. Many operations measures or metrics can provide information relevant to decision making. An operations analysis requires understanding the process and identifying ways to measure what is occurring.

References

IBM Big Data & Analytics Hub http://www.ibmbigdatahub.com/infographic/operations-analytics

What is the Operations Analysis big data use case? http://www-01.ibm.com/software/data/bigdata/use-cases/operations-analysis.html

http://www.ibm.com/big-data/us/en/big-data-and-analytics/case-studies.html

Cosentino, T. "IBM’s Five Lenses for Big Data Analytics," April 10, 2013 at URL http://tonycosentino.ventanaresearch.com/2013/04/10/ibms-five-lenses-for-big-data-analytics/

Operations Analysis at http://www.squawkpoint.com/tutorials/operations-analysis/

Video, IBM Edge2013: "The Big Data Imperative" by IBM VP Inhi Cho Suh at URL http://youtu.be/tOnTKsHIBRE

wiseGeek, "What Is Operation Analysis?" at URL http://www.wisegeek.org/what-is-operation-analysis.htm

Last update: 2015-02-19 06:52
Author: Daniel Power
