In business, the most difficult decisions require prior knowledge of the data. Below we present the most common types of data analysis so that you can choose the one that best suits your needs, depending on the volume of data you handle, its characteristics and the capabilities of your company.
In the digital age, characterized by easy access to user and customer data and by tools that let us carry out such analysis, everyone should be able to interpret the information available to them. The question is how we approach that analysis. We first cover some preliminary issues you should know and then go on to list the types.
- You can also read: The 16 types of research (qualitative and quantitative).
ARTICLE CONTENT
1. Types of data analysis according to their nature.
2. Types of data analysis according to the methodology.
Types of analysis according to the nature of the data
First, we differentiate between two main types according to the nature of the data we examine. The workload and the methodology we follow will differ in each case.
Qualitative analysis
In this case there is no numerical measurement of the data expressed as a value. We simply analyze the data to draw a conclusion that, in many cases, can be subjective. It deals with non-economic concepts such as the quality of a company’s service or customer satisfaction, to name a few examples.
Quantitative analysis
In this type of analysis there is a numerical measurement. We collect data of the same type to parameterize it, look for correlations, draw conclusions and make the right decisions. It is tied to statistical analysis, and we often handle such a high volume of data that the intervention of a computer system becomes necessary, as in the case of Big Data.
Types of data analysis according to the methodology
There are many data collection methods: questionnaires, automated computer systems or focus groups, among others. Here we focus on the most common types of analysis that every business person, statistician or analyst should know.
1. Big Data
Big Data is a model on the rise, since new technologies make it possible to handle huge amounts of data that could not be ordered and compared correctly by manual or analog means. In other words, non-conventional software is needed to process the information in a reasonable time, leaving a margin to respond.
- To learn more: Big data: what it is and what it is for.
2. AB Test
It goes by several names, such as A/B testing or split testing. It consists of proposing two alternatives and seeing how the client or user reacts to each of them. It is used in digital marketing when launching a new campaign, in the online world to compare the usability and effectiveness of two web pages, or when launching new products. Its main limitation (and possibly the reason for its effectiveness) is that it only allows two possible scenarios.
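As a simple illustration, the sketch below compares the conversion rates of two hypothetical page variants with a two-proportion z-test; the visitor and conversion figures are invented and SciPy is assumed to be available.

```python
# A/B test sketch: compare conversion rates of two page variants.
# All numbers are illustrative, not from the article.
from math import sqrt
from scipy.stats import norm

visitors_a, conversions_a = 1000, 120   # variant A
visitors_b, conversions_b = 1000, 150   # variant B

rate_a = conversions_a / visitors_a
rate_b = conversions_b / visitors_b

# Two-proportion z-test on the difference between the two rates.
pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (rate_b - rate_a) / se
p_value = 2 * (1 - norm.cdf(abs(z)))    # two-sided p-value

print(f"A: {rate_a:.1%}  B: {rate_b:.1%}  z={z:.2f}  p={p_value:.3f}")
```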
3. Neural networks
Using advanced computer systems, the operation of a simple group of neurons is simulated. Information passes through the neural network and is processed, so that the system learns automatically and draws increasingly better conclusions. This is done through a series of mathematical rules and algorithms.
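A minimal sketch, assuming scikit-learn is available: a tiny multi-layer perceptron trained on a toy XOR dataset, purely to show the idea of a network learning from examples.

```python
# Toy neural network sketch with scikit-learn (illustrative only).
from sklearn.neural_network import MLPClassifier

# XOR-style toy data: the output is 1 only when exactly one input is 1.
X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]

# One hidden layer of 8 neurons; the solver adjusts the weights iteratively.
model = MLPClassifier(hidden_layer_sizes=(8,), solver="lbfgs",
                      max_iter=1000, random_state=0)
model.fit(X, y)

print(model.predict([[1, 0], [1, 1]]))  # predictions for two new inputs
```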
4. Linear optimization
In linear optimization, also called mathematical optimization or mathematical programming, a specific situation is considered and a series of variables and constraints are applied, so that we can determine the best possible result. This type of analysis is used in production systems to reduce costs and increase profits.
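A minimal sketch with SciPy's linprog, using an invented production-planning problem: maximize profit subject to limited machine and labour hours.

```python
# Linear programming sketch: choose how many units of products x and y to make.
from scipy.optimize import linprog

# linprog minimises, so we negate the profit per unit (40 for x, 30 for y).
c = [-40, -30]

A_ub = [[1, 2],    # machine hours used per unit of x and y
        [3, 1]]    # labour hours used per unit of x and y
b_ub = [40, 60]    # available machine and labour hours

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("Optimal plan:", result.x, "Maximum profit:", -result.fun)
```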
5. Correlation analysis
A widely used statistical technique. In this case, we take two different quantitative variables, compare them and see whether there is a relationship between them. A practical example would be checking whether the price of admission to an amusement park influences the drop in drink consumption within the park. To detect a relationship, it is necessary to analyze a large amount of data; otherwise, we could reach a wrong conclusion.
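A minimal sketch of the park example, assuming NumPy is available; the prices and sales figures are invented for illustration.

```python
# Correlation sketch: admission price vs. drinks sold (invented figures).
import numpy as np

ticket_price = np.array([20, 22, 25, 27, 30, 32, 35])        # admission price
drink_sales  = np.array([510, 480, 470, 440, 400, 380, 350])  # drinks sold per day

# Pearson correlation coefficient between the two variables.
r = np.corrcoef(ticket_price, drink_sales)[0, 1]
print(f"Pearson correlation: {r:.2f}")  # close to -1 suggests a strong inverse relationship
```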
Choose the right methodology for the data you handle. | Image by: Stephen Dawson/Unsplash.
6. Scenario analysis
We propose a series of possible scenarios or events, with different results, and see which one is most favorable to us. This type of analysis is similar to the A/B test, but includes more options and is therefore less reliable.
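A minimal sketch in plain Python: three hypothetical scenarios with assumed probabilities and revenues, combined into an expected value.

```python
# Scenario analysis sketch (all probabilities and revenues are illustrative).
scenarios = {
    "pessimistic": {"probability": 0.2, "revenue": 50_000},
    "expected":    {"probability": 0.6, "revenue": 80_000},
    "optimistic":  {"probability": 0.2, "revenue": 120_000},
}

# Probability-weighted average outcome across the scenarios.
expected_revenue = sum(s["probability"] * s["revenue"] for s in scenarios.values())
best_case = max(scenarios, key=lambda name: scenarios[name]["revenue"])

print(f"Expected revenue: {expected_revenue:,.0f} | Best case: {best_case}")
```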
7. Sentiment analysis
Sentiment analysis is a concept that emerged with the popularization of social networks. In this case, we try to determine the attitude of an individual or a group of individuals towards a particular topic. It is based on very subjective variables, so it is difficult to draw a firm conclusion. However, it gives us a reliable idea of the opinion of the consumer, client or social group.
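A toy, lexicon-based sketch of the idea; real sentiment analysis relies on much richer language models, and the word lists and messages below are invented.

```python
# Toy lexicon-based sentiment sketch (illustrative only).
positive = {"great", "love", "excellent", "happy"}
negative = {"bad", "hate", "terrible", "disappointed"}

messages = [
    "I love the new app, excellent work",
    "Terrible update, I am disappointed",
]

for text in messages:
    words = set(text.lower().split())
    # Score = positive matches minus negative matches.
    score = len(words & positive) - len(words & negative)
    label = "positive" if score > 0 else "negative" if score < 0 else "neutral"
    print(f"{label}: {text}")
```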
8. Semantic analysis
If the nature of the data we want to analyze is semantic (texts), it is best to resort to semantic analysis. The best example is Google, which crawls and indexes thousands of web pages every day, analyzing their text (and many other variables) to rank the results.
9. Surveys
The main objective of a survey is to obtain results from the respondents' answers. Surveys are often part of a longer-term investigation and serve to collect personal data that we could not obtain otherwise. For a satisfactory result, we must first define a target audience and the object of interest. They can be carried out through different channels (face to face, web or email).
10. Case studies
The case study is based on the simple observation of a real environment. The researchers, therefore, have no control over the variables involved. Case studies are easy to design, although they cannot be repeated, since each one depends on the context in which it was carried out. In that sense, it is difficult to draw an objective conclusion.
- It may interest you: Difference between prevalence, incidence and cumulative incidence
11. Formal experiments
Unlike case studies, formal experiments involve different variables determined by the researchers. They therefore require more preparation time, but the conclusions will be more accurate. Through formal experiments we can see how different variables relate to each other when they interact in a real environment.
12. Systematic Reviews (SR)
This type of analysis is used to compare several scientific studies or publications (primary studies) and synthesize their results. It starts from the following premise: the result of a single trial is not reliable enough to be generalized. At the same time, we must be clear about the criteria for including or excluding studies.
13. Benchmarking
To make a comparison, we must first establish what the standard result is. Once we have that starting point, we compare it with other results and observe the differences. A quick look around the web turns up plenty of hardware benchmarks: we take one product as a reference and see whether the next one offers better or worse performance.
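A minimal sketch of that idea: each candidate result is compared against a reference value; the throughput figures are invented for illustration.

```python
# Benchmarking sketch: express each result relative to a reference (invented figures).
reference = {"name": "baseline card", "fps": 100}
candidates = [
    {"name": "card A", "fps": 112},
    {"name": "card B", "fps": 93},
]

for card in candidates:
    # Relative difference versus the reference result.
    delta = (card["fps"] - reference["fps"]) / reference["fps"]
    print(f'{card["name"]}: {delta:+.0%} vs {reference["name"]}')
```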
14. Monte Carlo Simulation
Monte Carlo simulation is a complex analysis technique, very useful for determining the feasibility of a project when some variables can change from one moment to the next. It therefore takes risk into account, and does so by substituting the real factors with random numbers. In this way, we can measure the impact if one of the factors does not turn out as expected.
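A minimal sketch, assuming NumPy: project profit is simulated many times while demand and unit cost vary randomly; all distributions and figures are illustrative assumptions.

```python
# Monte Carlo sketch: profit of a hypothetical project under uncertain demand and cost.
import numpy as np

rng = np.random.default_rng(seed=0)
n_runs = 100_000

demand = rng.normal(loc=10_000, scale=1_500, size=n_runs)  # uncertain units sold
unit_cost = rng.uniform(4.0, 6.0, size=n_runs)             # uncertain cost per unit
price, fixed_costs = 8.0, 25_000                            # assumed constants

profit = demand * (price - unit_cost) - fixed_costs

print(f"Mean profit: {profit.mean():,.0f}")
print(f"Probability of loss: {(profit < 0).mean():.1%}")
```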
15. Mathematical prediction
The last type of data analysis is well known: it tries to identify the most likely outcome in the future. Through a series of techniques and tools, we look at what has happened in the past to get an approximate (never exact) idea of what will happen in the future. It is used in macroeconomic projections and in revenue management.
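A minimal forecasting sketch, assuming NumPy: a linear trend is fitted to an invented revenue series and extrapolated one year ahead; real forecasts use much richer models.

```python
# Prediction sketch: fit a linear trend to past revenue and extrapolate (invented data).
import numpy as np

years = np.array([2019, 2020, 2021, 2022, 2023])
revenue = np.array([1.2, 1.4, 1.3, 1.6, 1.8])   # revenue in millions

# Least-squares straight line through the historical points.
slope, intercept = np.polyfit(years, revenue, deg=1)
forecast_2024 = slope * 2024 + intercept

print(f"Approximate 2024 revenue: {forecast_2024:.2f} M")
```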
- More about the most common analysis methods.