The challenge lies in the exponential growth of data volume. We address this Big Data problem with decentralised analysis techniques that account for the uncertainty of the observed phenomena; this processing consists of aggregating or filtering the raw data before it is transmitted. From an architectural point of view, an intermediate layer of computation, called “edge computing”, allows processing to run close to the sensors, thereby lightening data exchanges and distributing the analysis more evenly. This decentralisation improves the responsiveness of the system in settings such as swarms of sensors or fleets of robots.
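As an illustration only, the edge-side aggregation and filtering described above can be sketched as follows. This is a minimal Python sketch under assumed parameters (the function name `edge_aggregate`, the window size, and the dead-band threshold `delta` are hypothetical choices, not taken from the source): readings are reduced to windowed means, and a mean is transmitted only when it departs from the last transmitted value by more than `delta`.

```python
import random

def edge_aggregate(readings, window=10, delta=0.5):
    """Aggregate raw sensor readings at the edge node.

    Each window of `window` samples is reduced to its mean, and a mean
    is transmitted only if it differs from the last transmitted value
    by more than `delta` (dead-band filtering). This cuts the volume
    of data sent upstream while preserving significant changes.
    """
    sent = []          # values actually transmitted to the central node
    last = None        # last transmitted value
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        mean = sum(chunk) / len(chunk)
        if last is None or abs(mean - last) > delta:
            sent.append(round(mean, 2))
            last = mean
    return sent

# 100 noisy raw samples: a stable phase near 20.0, then a step to 25.0.
random.seed(0)
raw = [20.0 + random.gauss(0, 0.2) for _ in range(50)] + \
      [25.0 + random.gauss(0, 0.2) for _ in range(50)]
print(edge_aggregate(raw))
```

Run on these 100 raw samples, the edge node transmits only a handful of values (the initial level and the step change), which is the kind of data reduction that lightens exchanges between sensors and the central analysis.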