Data that was not available until recently can now be used to produce the city*. What might it be used for? What would predictive project tools look like? Could they be modelling tools that collect relevant information in real time (context, constraints, programme) and anticipate the formalisation phase of projects by proposing a field of possibles? Or do the avenues of reflection lie elsewhere?

On decision support: expert systems and the role of the architect

Today, those who support decision-making are experts of all kinds, architects among them. The expert who is a scientist brings experience and a capacity for synthesis, and will in future provide tools of prediction. The expert who positions himself as a designer produces tools of anticipation. Predictive data analysis is a decision-support tool that not only identifies a universe of possibles but also enriches it; it therefore tends to put the role of the expert back into question. If digital tools are faster and more efficient than the expert at synthesising data and identifying the field of possibles, what then remains? Two fields of repositioning: one upstream of the predictive analysis, the other downstream.

Downstream of data processing, the expert must determine how the outcome of the predictive analysis is made available to the recipients of the expertise. This is what the students of ESAG-Penninghen grasped perfectly in their work on the theme "we-are-data"*. One team worked on visualising the emotions carried within the stream of tweets; another reflected on a way to render legible, poetically, the digital identity that each of us constitutes, willingly or not; yet another imagined a tool for urban wandering, oriented towards discovering places unknown to us but likely to please us... a predictive urban drift, in a way. The question of how data is "given to be read", appropriated and made usable is obviously paramount.
Journalists worked on this question when WikiLeaks brutally exposed the difficulty of handling a cloud of information and making sense emerge from the mass. The architect and the urban planner face the same dizzying question, but their field of intervention is neither graphics nor the formatting of information: it is the city. To take advantage of the multitude of urban data and give it meaning, geographers produce GIS maps and graphic designers generate images for the web or the smartphone. The question is what architects can draw from all this.

If the expert chooses to engage with predictive data analysis upstream, he is the one who defines how the algorithms will scan the "data ecosystem" and what they will extract from it. There is also the problem of which data it is useful to cross in order to derive the predictive analyses. Tapan Patel, a specialist in predictive analysis in the field of marketing, explains that "data miners and statisticians devote 70% to 80% of their time to data preparation". He adds: "Data management, access to the different data sources, their cleaning and their processing so that they are ready to be analysed are critical elements for any type of analysis project." There is therefore an expertise in the choice of data, in determining what is relevant to cross.

So, for the architect and the urban planner, does the new competence for establishing an urban diagnosis lie in the ability to determine which data to confront? The architect's competence has long rested on knowing how to confront heterogeneous information, synthesise it and transform it into architectural, functional and technical devices. Today, the analysis of a neighbourhood is made by skilfully combining typo-morphology (i.e.
the synthesis of physical and visual data), sociology (data on uses), urban engineering (data on functioning), ecology (data on the environment), economics (data on the evolution of market values), and so on. Except that now the data is taking over, and the challenge of choosing information and reading grids is necessarily different. What if the analysis of a neighbourhood, to escape approximation and become predictive, now required knowing how to feed the algorithms, pell-mell, with land occupation, the state of the buildings, the types of bakeries present, people's cultural habits, their capacity to change their minds over time? The meaning of disciplinary specificities is then called into question, when all these data, until now neatly arranged in distinct categories, are suddenly mixed, in unusual, unexpected and fruitful ways, in the algorithmic shaker. In this situation, where is the expert? What becomes of him when everyone seems dispossessed of the exclusivity of their domain, to the benefit of an emerging knowledge reinvented with each predictive analysis? We find here a recurring observation about the digital: in this paradigm, a certain kind of disciplinary compartmentalisation loses its meaning.

An "augmented analysis": the basis for which project approach?

Predicting, anticipating, projecting: what is their relationship today, and what new relationship could emerge? The classical method of work, in urban planning as in architecture and landscape, is the trilogy "diagnosis / objectives / actions", or "site & programme / parti / project". In this context, the very expression "predictive analysis" almost sounds like an oxymoron. It operates, in a way, a real-time synthesis between project and analysis. It proposes to see simultaneously T and "probable T + 1", present and future, existing and potential. It describes a present that carries the germ of the future; this germ is visible at the very moment we look at the seed.
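This simultaneity of T and "probable T + 1" can be caricatured in a few lines of code. A minimal sketch, assuming a naive trend extrapolation over some urban indicator; the indicator, the numbers and the method are all illustrative assumptions, not drawn from the text:

```python
# Toy sketch of "seeing T and 'probable T + 1' at once": a naive
# trend extrapolation over some urban indicator. The indicator,
# the numbers and the method are illustrative assumptions.

def probable_next(history):
    """Predict the next value from the average recent step."""
    if len(history) < 2:
        return history[-1]
    steps = [b - a for a, b in zip(history, history[1:])]
    trend = sum(steps) / len(steps)
    return history[-1] + trend

# present (T) and probable future (T + 1), read side by side
history = [100, 104, 110, 118]   # e.g. monthly footfall counts
present = history[-1]            # T
future = probable_next(history)  # "probable T + 1"
```

The point is not the arithmetic but the posture: the present value and its probable successor are read side by side, the germ visible in the seed.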
It is the relationship to time that changes, with the hopes it raises and the perhaps agonising alternatives it sometimes suggests. The problem is then the following: however "augmented" it may be, the predictive analysis of data remains an analysis. Yet what is proper to the architect or the planner is to go beyond analysis, to make it a tool in the service of the project. It is this design approach that fundamentally distinguishes them from the geographer and any other scientist. We have spoken of the paradigm of anticipation, and of what this notion holds that specifically challenges us as designers. Anticipation is invention; it means taking the lead. It means building hypotheses and developing them on the basis of more or less arbitrary choices. This is a clear reference to the designer's approach. Likewise, predictive analysis superimposes on the present the probable field, which is in fact a part of the field of possibles. We know from experience, however, that the designer keeps exploring this field: choosing a track, exploring along that track a new field of possibles, and so on. With predictive analysis, it is the statistical algorithm that creates the link between the analysis and the "project", between the current and the future; with the designer, it is a sum of decisions that plays this role. Prediction rests on hypotheses; so does the designer's work. The difference is that the former are not formulated while the latter are. So, between the designer and the algorithm: complementarity or rivalry? Tapan Patel addresses the issue in other terms: "How can you attribute a quality to a tweet? How can you determine who is representative on social networks? In fact, predictive analytics is a science that will take you from point A to point B, but it becomes an art when you call on the professional expertise of the person who will ultimately take the decision and who will have to take the importance of interference into account.
Finally, the best predictive analyses are the result of a combination of art and science (…)"**.

Towards the convergence of the objective and the subjective

It may be necessary, moreover, to postulate that for the designer the question is less to predict what evolution the system will undergo than to determine what state it should seek to reach. The aim is above all to set goals and to create the conditions of their realisation, their "probability". The fascination with prediction makes the project lose sight of this aspect; it is nevertheless possible to find a source of inspiration in it today. For the designer, the predictive analysis of data is less a "psychic" exercise than a possible awareness, an opening and framing of this field, to be used afterwards either against the current or in the pre-drawn direction.

Finally, once a complex approach to phenomena is adopted, predictions lose the value of primordial necessity attributed to them in a Cartesian posture. They add no value; on the contrary, they can have destabilising effects when their degree of certainty is too high, as H. A. Simon shows in The Sciences of the Artificial. What is detected here is a discrepancy between predictive analysis and the use expected of what are considered its "results". Our Cartesian culture pushes us to use these "results" with a rather deterministic aim, according to the convictions and hopes of a paradigm dominated by certainty and positivism. Complex analysis, that of the "sciences of complexity", calls instead for the logic, the modes and the tools of a paradigm dominated, among other things, by the notions of probability and uncertainty. The correlation between the statistical and the individual happens automatically: the group and the individual operate, in real time, at once identified and conflated.
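The designer's movement evoked earlier, choosing a track and then re-opening the field of possibles around it, can be caricatured as a simple search loop. A minimal sketch, in which expand(), score() and the target value are purely illustrative assumptions:

```python
# Caricature of the designer's loop: choose a track, re-open the
# field of possibles around it, choose again. expand(), score()
# and the target value 7 are purely illustrative assumptions.

def expand(option):
    """Open a small field of possibles around the current choice."""
    return [option - 1, option, option + 1]

def score(option):
    """Toy evaluation standing in for the designer's judgement."""
    return -abs(option - 7)

def explore(start, steps):
    """Iterate: pick the best candidate, then explore around it."""
    current = start
    for _ in range(steps):
        candidates = expand(current)          # field of possibles
        current = max(candidates, key=score)  # a decision, not a statistic
    return current
```

Here the sequence of choices plays the role of the designer's "sum of decisions"; a predictive algorithm would instead infer the link between present and future statistically.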
In sum, the city of predictive data analysis asks us to change paradigm fundamentally and to revise our categories of thought, our categories for thinking architecture. It also asks us to seize these new methods and tools in order to make them genuine project tools.
* See the previous chronicle, "The city of data is good to take".
** Tapan Patel, workshop, San Francisco, 20 March 2012.