May 19, 2021
Big Data Analytics
Introduction
The rise of Big Data in the business world that we are currently witnessing is significant for many reasons. Investment priorities change, technologies and interpretative models evolve, and the speed of trade increases; as a result, companies tend to build professional teams committed to managing and extracting value from large amounts of data. The frontier for using data to make decisions (so-called data-driven decisions) has shifted dramatically: certain high-performing companies now build their competitive strategies around data-driven insights that, in turn, generate impressive business results.
We will start by clarifying what Big Data Analytics are, highlight some figures to convey the scale of the phenomenon, and briefly address possible Big Data Analytics use cases. Finally, we will discuss the solutions provided by Wenda, an Italian startup supported by European investors that offers the only cross-chain and cross-device Food Integrity Management Hub, designed to turn supply chain control from a cost center into a competitive edge, improving sales, quality and logistics.
What are Big Data Analytics?
Big Data refers to the set of technologies and methodologies for analyzing massive amounts of data: the ability to extract, analyze and link huge, heterogeneous, structured and unstructured datasets in order to discover connections between different phenomena and predict their future occurrence. Big Data increasingly represents the new frontier of innovation fostered by Information Technology, gaining a central place in business, science and society as a whole. Analytics are the algorithms, technologies and software used to study and uncover connections and correlations within data.
Big Data without Analytics is just a huge, shapeless mass of data; Analytics without Big Data is just a statistical tool. It is the combination of the two that creates a completely different instrument with enormous potential: Big Data Analytics. These are technologies and software applied to the study of connections and correlations within Big Data, able to extract new information and create new forms of value. Analytical applications can indeed provide major competitive advantages, and they appear across different industries and in all stages of the supply chain. Compared with the recent past, some differences stand out:
- Unprecedented research potential: the acceleration of transactions has encouraged the development of real-time digital data gathering, and today large, complex datasets covering a wide range of phenomena are readily available;
- A change in the nature of research: previously, data were collected to test a hypothesis formulated by humans, while today they are collected to test hypotheses that have yet to be conceived, many of which will be generated by computers. Machines are becoming smarter through self-learning algorithms applied to artificial neural networks, so algorithms can identify many relationships between variables without human assistance (a minimal sketch of this kind of hypothesis-free scan follows this list);
- A change in the nature of experimentation: the Internet has made it possible to run large-scale controlled experiments on many economic and social phenomena, through which an unprecedented number of variables can be understood.
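To make that last point about machine-driven discovery concrete, below is a minimal, hypothetical sketch in Python (the dataset, column names and threshold are invented for illustration): instead of testing a single human-defined hypothesis, it scans every pair of variables and surfaces the strongest correlations on its own.

```python
# Hypothesis-free relation discovery: scan all variable pairs and keep the
# strongest correlations, without a human specifying what to look for.
import numpy as np
import pandas as pd

# Illustrative dataset (names and values are invented for this sketch).
rng = np.random.default_rng(0)
temperature = rng.normal(20, 5, 500)
df = pd.DataFrame({
    "temperature": temperature,
    "energy_use": 3.0 * temperature + rng.normal(0, 5, 500),
    "sales": rng.normal(100, 10, 500),
})

# Compute all pairwise correlations, keep the upper triangle only, and rank
# pairs by the absolute strength of their correlation.
corr = df.corr()
mask = np.triu(np.ones(corr.shape, dtype=bool), k=1)
pairs = corr.where(mask).stack().sort_values(key=abs, ascending=False)

# Report only the relations that look strong (threshold chosen arbitrarily).
print(pairs[pairs.abs() > 0.5])
```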
This generalized exponential growth strengthens one of the main functions of Big Data Analytics: providing the most accurate and detailed representation of reality through data. However, to reach this goal it is necessary to develop methodologies and logics of representation within a data-driven scenario, composed of four main types of data analysis:
- Descriptive Analytics: we start with descriptive analysis, the set of tools that allow us to represent and describe, including graphically, the reality of particular situations or processes. For enterprises, this means representing business processes; Descriptive Analytics enable the graphic display of performance levels;
- Predictive Analytics: we then move to predictive analysis, based on solutions that analyze data to outline future development scenarios. Predictive Analytics are built upon mathematical techniques and models, including predictive models and forecasting;
- Prescriptive Analytics: with prescriptive analysis we enter the realm of tools that combine data analysis with the ability to support and manage decision-making processes. Prescriptive Analytics provide strategic insights or operational recommendations based on both descriptive and predictive analysis;
- Automated Analytics: building on the results of descriptive and predictive analytics, Automated Analytics make it possible to perform actions that are either defined according to precise rules or triggered by fixed events (if x happens, then respond with y). These rules can in turn be the result of an analysis process, such as the study of a machine's behavior under particular conditions analyzed beforehand. A minimal sketch illustrating all four types follows this list.
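As a toy illustration of how the four types differ in practice, here is a minimal Python sketch (the readings and threshold are invented, hypothetical numbers): descriptive statistics summarize what happened, a naive forecast looks ahead, a prescriptive rule suggests an action, and an automated rule fires without human intervention.

```python
# Four types of analysis on a toy series of shipment temperature readings (°C).
import statistics

temps = [4.1, 4.3, 5.0, 5.4, 6.1, 6.5, 7.2]  # hypothetical readings
threshold = 7.0                               # assumed cold-chain limit

# Descriptive: represent and summarize what happened.
print("mean:", round(statistics.mean(temps), 2), "max:", max(temps))

# Predictive: a naive forecast of the next reading from the latest trend.
forecast = temps[-1] + (temps[-1] - temps[-2])
print("forecast:", round(forecast, 2))

# Prescriptive: suggest an action based on the descriptive and predictive views.
action = "increase cooling" if forecast > threshold else "no change needed"
print("suggested action:", action)

# Automated: a fixed if-x-then-y rule executed without human intervention.
if temps[-1] > threshold:
    print("ALERT: threshold exceeded, notify operations")
```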
Big Data Analytics therefore represent the close connection between software, computation and advanced technological capabilities, capable of ushering in a business and information cycle radically different from the past. Let us now look at some quantitative data to better appreciate their innovative force and to understand how they can be harnessed to build services and solutions with a positive impact on businesses.
Numbers and applications of Big Data Analytics
The Big Data Analytics & Business Intelligence Observatory of the School of Management of Politecnico di Milano produced a dense report called Big Data: Fast & Smart. Here we outline some of its more interesting findings to gauge the quantitative magnitude of Big Data Analytics. The study shows a sector in great shape which, with a 26% growth rate over 2018, stands at 1,393 million euros: quite a jump compared to the 1,103 million euros of 2017, when the growth rate came in at 22%, itself an improvement over the 15% of 2016. Of the total 1,393 million euros, 88% (1,223 million euros) is generated by large companies and only 12% (170 million euros) by SMEs.
The average speed of data analysis is also growing strongly: near-real-time analysis more than doubled, from 14% in 2017 to 33% in 2018, while real-time analysis almost tripled, progressing from 3% in 2017 to 8% in 2018. This considerable increase in the speed of Big Data Analytics has unlocked investments in monitoring and alerting, automated decision making, and the creation of new products and services (e.g. self-driving cars).
Among large companies, on a sample of 119 Italian organizations with more than 249 employees, the manufacturing sector is the one investing the most resources in Big Data Analytics, with a share of 43%. It is followed by services with 19%, retail with 14%, telco and media with 12%, utilities with 4%, banks and insurance companies with 4% each, and finally public administration and healthcare with 2%.
Among SMEs, on a sample of 501 Italian enterprises with between 10 and 249 employees, the manufacturing sector again invests the most resources in Big Data Analytics, with a share of 57%. It is followed by retail with 12%, tourism with 9%, professional activities with 7%, and finally ICT-media-communication, banks and insurance companies, and other services, each with 5%. In short, 2018 brought new developments and fresh investments, with companies increasingly oriented towards extracting value from data, mainly in the manufacturing, retail and services industries.
Big Data Analytics are ideal for companies able to exploit their many possibilities effectively, since they provide deep insights that accelerate product innovation, optimize internal processes, accurately identify the drivers of financial performance and speed up the whole supply chain.
Regarding the supply chain, while notable growth can be observed in the use of Big Data Analytics as a marketing tool, logistics also uses them for route and vehicle planning, operations departments rely on them for inventory planning and working-hours coordination, and procurement departments use applications for supplier segmentation and risk measurement. Some core points that can serve as guidance, drawn from companies that have successfully implemented Big Data Analytics in their supply chain, are:
- use analytics that are coordinated across all supply chain functions, aligning demand and supply, rather than using them to optimize single functions or single, isolated decision-making areas;
- let efforts be driven by a strategy leading to tactically focused applications rather than casual exploration activities;
- measure performance using carefully designed parameters, and commit to a path of continuous improvement;
- follow a step-by-step approach to implementation, starting with carefully selected pilot projects instead of jumping to large-scale adoption of advanced analytics.
Big Data Analytics make it possible to reduce implementation times, production costs and time-to-market, and to increase consumer satisfaction and profit margins. Furthermore, they can help extract significant value from large, untapped pools of data too complex to handle with standard analytical methods and tools. In food production, connected IoT devices and sensors make it possible to gather vast amounts of data for in-cloud analysis via Big Data Analytics, yielding in-depth knowledge of agricultural factors.
Microclimatic analysis of humidity, local rainfall rates and temperature variations can be used to optimize many processes. In South America, for instance, the FAO ran a project on water efficiency using IoT devices, cloud storage and Big Data Analytics. After analyzing the data, the organization made practical recommendations to help farmers make better decisions (see also UBS, The food revolution. The future of food and the challenges we face, 2019, passim).
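As a rough illustration of the kind of in-cloud aggregation described above, here is a hypothetical Python sketch: the field names, readings and irrigation rule are all invented for the example, and a real project would rely on far richer data and models.

```python
# Aggregate microclimatic sensor readings per field and derive a simple hint.
from statistics import mean

readings = [  # invented sample readings from IoT sensors
    {"field": "A", "humidity": 31, "rainfall_mm": 0.0, "temp_c": 29},
    {"field": "A", "humidity": 35, "rainfall_mm": 0.2, "temp_c": 31},
    {"field": "B", "humidity": 62, "rainfall_mm": 4.1, "temp_c": 24},
]

# Group readings by field.
fields = {}
for r in readings:
    fields.setdefault(r["field"], []).append(r)

# Summarize each field and apply a very rough rule of thumb (for this sketch only).
for field, rows in fields.items():
    avg_humidity = mean(r["humidity"] for r in rows)
    total_rain = sum(r["rainfall_mm"] for r in rows)
    irrigate = avg_humidity < 40 and total_rain < 1.0
    print(f"field {field}: {'irrigation recommended' if irrigate else 'no irrigation needed'}")
```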
Wenda Food Integrity Management Hub
At Wenda, an Italian startup backed by European investors, we aim to use our technological innovations to address some of the issues in the Food&Beverage industry. To this end, we deliver the Wenda Food Integrity Management Hub, a digital platform sold as SaaS with an annual subscription, dedicated to all Food&Beverage actors who handle or manage perishable or sensitive products and need to manage their integrity information.
It is the only cross-chain and cross-device hub for the management of food integrity data: it gathers and analyzes unstructured data from data loggers and systems available on the market and used in all stages of the supply chain, with the goal of turning food supply chain control from a cost center into a competitive edge. It provides the different actors involved with useful information on traceability, cold chain and shelf life, and fosters their cooperation.
The Wenda Food Integrity Management Hub provides: an overview of integrity analytics and hazard points across the supply chain, differentiated access levels to journey data, a cloud wallet for uploading and sharing journey documents and product quality certifications, and integration and interoperability with many traceability systems and data loggers. The Big Data Analytics section offers several benefits, among which:
- Making data-driven decisions to ensure food safety by detecting hazard points in the supply chain. You can set up alarm thresholds, configurable for any shipment or storage, and then connect your company systems to the platform to see all the information in a single dashboard;
- Verifying supplier reliability and ranking suppliers according to their performance. By analyzing the gathered data, you can determine whether a product travelled under good conditions, so that you can choose only the best transporters for sensitive products;
- Receiving email notifications when issues arise, thanks to alarm notification management, which allows you to set up escalations depending on the severity of the issue. You can also diversify the recipients of the alarms and assign different tasks, tailored to each of your collaborators. A hypothetical sketch of this kind of threshold-based alerting follows this list.
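To give a flavor of how such threshold-based alerting and escalation might work, here is a hypothetical Python sketch. It is not Wenda's actual API: the thresholds, severity levels and recipient addresses are invented for illustration.

```python
# Classify a shipment temperature reading against configurable thresholds and
# escalate to more recipients as the severity grows.
RECIPIENTS = {
    "warning": ["quality@example.com"],
    "critical": ["quality@example.com", "operations@example.com"],
}

def check_shipment(shipment_id, temp_c, warn_at=6.0, critical_at=8.0):
    """Return an alert dict if the reading breaches a threshold, else None."""
    if temp_c >= critical_at:
        severity = "critical"
    elif temp_c >= warn_at:
        severity = "warning"
    else:
        return None
    return {
        "shipment": shipment_id,
        "severity": severity,
        "notify": RECIPIENTS[severity],  # escalation: wider audience for higher severity
        "message": f"Shipment {shipment_id}: {temp_c} °C exceeds the {severity} threshold",
    }

alert = check_shipment("SH-001", 8.4)
if alert:
    print(alert)  # in a real system this is where the e-mail notification would be sent
```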
Wenda has been accelerated by UniCredit, Maersk, Digital Magics and Intesa Sanpaolo. Moreover, Wenda is supported by a German investor who is a data science expert, and is developing predictive algorithms to calculate the probability of product deterioration while a product is shipped or stored in a warehouse. Also under development are prescriptive algorithms which can point out, based on the large datasets collected to date, the best business choice in the face of particular conditions or specific events that might disrupt the smooth flow of products through the supply chain.
For any information, to get to know us better, or to schedule a demo of the Wenda Food Integrity Management Hub, we invite you to visit our website.