
Energy data analytics: never set aside your critical thinking!

The boom in Big Data and Artificial Intelligence might lead us to think that data analytics now lies entirely in the hands of algorithms. Yet algorithms alone are not enough to analyse energy data effectively in industry. Human intelligence and critical thinking remain indispensable. For analyses to translate into improved energy efficiency in factories, it is crucial to keep control over data aggregation and processing, in step with the operational reality and the objectives being pursued, and to do so methodically. Below, we share that method step by step.

Energy data analytics always starts with a phase of dialogue with the operational staff in the factory – a crucial step for identifying the client’s challenges, expectations, constraints and scope for day-to-day action.

 

Detecting operational management strategies

“Our critical thinking also helps us challenge the factory’s practices, even when they seem efficient on the ground”, says Antoine Roland, Energy Efficiency Engineer. Take the example of a client whose project aimed to optimise refrigeration efficiency. Graphical analysis of the machines’ power commitment had revealed two different operating patterns for the same cooling output: either two refrigeration units each delivering 3 megawatts, or three units each delivering 2 megawatts. “One of the two strategies is always more cost-effective than the other”, says Antoine Roland. “In our experience, refrigeration machines are generally more efficient when they run at a high load rate: for the same cooling output, they draw less power.” A further graphical analysis is then used to verify this assumption and define the best operating strategy.
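As an illustration only, here is a minimal sketch of how such a power-commitment comparison might be run on a time-series export, assuming hypothetical file and column names for each unit’s electrical draw and the total cooling delivered:

```python
import pandas as pd

# Hypothetical 10-minute export: electrical draw of each chiller (kW)
# and total cooling delivered (kW). File and column names are assumptions.
df = pd.read_csv("chillers.csv", parse_dates=["timestamp"], index_col="timestamp")
elec_cols = ["unit1_kw", "unit2_kw", "unit3_kw"]

# Number of units committed at each time step (> 50 kW counts as running)
df["units_running"] = (df[elec_cols] > 50).sum(axis=1)

# Plant-level COP for each commitment strategy (2 units vs 3 units)
df["cop"] = df["cooling_kw"] / df[elec_cols].sum(axis=1)
print(df.groupby("units_running")["cop"].describe())
```

Comparing the COP distributions for each commitment strategy gives a first, factual indication of which operating pattern to favour, before any deeper graphical analysis.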

 

Studying the physical systems and understanding the operational objectives and context before analysing the data

Let us turn to the definition of an algorithm: a sequence of operations used to solve a problem and reach a result. If the problem is poorly stated… then the result will be wrong! Our critical thinking should therefore lead us to define the context of the analysis clearly beforehand, following a logic built on four key steps.

1. Choice of the performance indicator – The most suitable indicator for the project is not necessarily the one that comes to mind spontaneously. “In the project described above, we had logically opted to monitor the coefficient of performance (COP) of the cooling plant”, explains Antoine Roland. “Looking at the initial results, we could have suggested that our client shut down the refrigeration units with the lowest COPs. But that would have ignored the fact that this equipment, responsible for the poor refrigerating efficiency, was also producing mixed water at 45°C through a heat recovery system.” Under those circumstances, the COP was no longer a meaningful performance indicator. The results became far more relevant once the optimisation perimeter was widened to include both chilled water and mixed warm water, with an economic indicator (in euros per hour) reflecting the combined performance of chilled-water and mixed-warm-water production.

2. Choice of the analysis period – The analysis period must correspond to a time when the plant is running in steady state and, above all, operating under realistic conditions with the existing equipment. A period that is too old, for instance, should not be chosen: “Returning to the previous case, even though the 2014–2015 period might have looked interesting, it was not suitable for the study because the site had changed significantly in three years: new refrigeration units had been installed and the equipment regulation system had been replaced”, explains Antoine Roland.

3. Choice of contextual filters – The context of the analysis should be defined through steady-state operating scenarios, cooling-power ranges (3 to 4 MW, then 4 to 5 MW, etc.), outdoor-temperature bands (0°C to 10°C, 10°C to 20°C, 20°C to 30°C), or the client’s use of standby equipment such as heat pumps…

4. Choice of variables – Any data that are meaningless for the study should also be eliminated. “Let me explain”, says Antoine Roland. “Take an incinerator: to optimise waste-combustion performance, there is no point tracking flue-gas treatment parameters, or parameters relating to the power generated from recovered heat. The dataset must therefore be screened and the analysis focused on the combustion area. It is a question of common sense.” A minimal code sketch after this list illustrates how these four choices can be expressed.
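The sketch below shows, under assumed column names, tariffs and thresholds, what these four choices might look like once translated into a data-preparation script; it is an illustration, not the tooling actually used on the project:

```python
import pandas as pd

df = pd.read_csv("site_history.csv", parse_dates=["timestamp"], index_col="timestamp")

# 1. Performance indicator: an economic cost in euro/hour over the whole perimeter
#    (chilled water + mixed warm water), rather than the COP of a single unit.
ELEC_PRICE = 0.09   # euro/kWh, hypothetical electricity tariff
HEAT_VALUE = 0.04   # euro/kWh, hypothetical value of the recovered heat
df["cost_eur_per_h"] = (df["chillers_elec_kw"] * ELEC_PRICE
                        - df["recovered_heat_kw"] * HEAT_VALUE)

# 2. Analysis period: recent enough to reflect the current equipment and regulation
df = df.loc["2017-01-01":]

# 3. Contextual filters: steady state (hypothetical boolean flag in the export),
#    one cooling-power range and one outdoor-temperature band
ctx = df[df["steady_state"]
         & df["cooling_kw"].between(3000, 4000)
         & df["t_outdoor_c"].between(10, 20)]

# 4. Variables: drop measurements that do not belong to the studied perimeter
ctx = ctx.drop(columns=["flue_gas_o2_pct", "turbine_kw"], errors="ignore")
```

The filtered table then contains comparable operating points, on which the performance indicator can be analysed scenario by scenario.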

 

Applying critical thinking to the influential variables recommended by the algorithm

Once the performance indicator, the study period, the filters and the most relevant variables have been selected, “you run the algorithm”, says Antoine Roland. “But there again, you need critical thinking to judge the operational relevance of the influential variables it recommends, because the algorithm has never set foot on site!” You have to learn to separate the uncontrollable variables that generate different operating scenarios (weather conditions, power demand, etc.) and the consequential variables over which there is no direct lever (e.g. a furnace temperature that depends on regulated air inputs) from the regulation variables. The latter are the variables of real interest, since they can actually be adjusted. You then need to make sure that each regulation variable is genuinely reliable: that it is not an artefact and that it does not result from lost data, an error or a malfunction.
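Purely as an illustration, here is a short sketch of what that triage and reliability check could look like once the algorithm has returned its ranked variables – the ranking, column names and thresholds below are hypothetical:

```python
import pandas as pd

# Hypothetical output of the influence-ranking algorithm: variable -> importance score
ranked = {"t_outdoor_c": 0.35, "furnace_temp_c": 0.25, "condenser_setpoint_c": 0.20}

# Manual triage, informed by field knowledge rather than by the algorithm
UNCONTROLLABLE = {"t_outdoor_c", "cooling_demand_kw"}   # weather, demand...
CONSEQUENTIAL = {"furnace_temp_c"}                      # no direct lever
regulation_vars = [v for v in ranked if v not in UNCONTROLLABLE | CONSEQUENTIAL]

# Reliability check on each regulation variable: gaps, frozen signal, implausible values
ctx = pd.read_csv("filtered_context.csv", index_col="timestamp")  # hypothetical filtered dataset
for var in regulation_vars:
    s = ctx[var]
    print(var,
          "missing:", round(s.isna().mean(), 3),
          "frozen:", round((s.diff().abs() < 1e-6).mean(), 3),
          "out of range:", round(((s < -20) | (s > 80)).mean(), 3))
```

Only the variables that survive both the triage and the reliability check are worth discussing with the operational teams as candidate levers.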

“By following these steps, we can be confident in the quality of the analysis and avoid misinterpretation. And that is precisely where human intelligence makes all the difference!”, concludes Antoine Roland.