With the advent of massively parallel computers with thousands of processors, much work has been done over the last decades to enable a more effective use of large numbers of processors by superposing parallelism in the time domain, even though time integration is known to be inherently sequential, on parallelism in the space domain. Consequently, many families of predictor-corrector methods have been proposed, allowing computation on several time-steps concurrently. The aim of the present work is to develop a new parallel-in-time algorithm for solving evolution problems, based on particularities of a rescaling method that has been developed for solving various types of partial and ordinary differential equations whose solutions have a finite existence time. This method leads to a sliced-time computing technique in which rescaled models of the differential equation are solved independently. The determining factor for convergence of the iterative process is the set of predicted values at the start of each time slice, which are obtained using “ratio-based” formulae. In this paper we successfully extend this method to reaction-diffusion problems of the form
u_t = Δu + u^p, with their solutions having a global existence time when p ≤ 1. The resulting algorithm provides perfect parallelism, with convergence being reached after a few iterations.
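To make the sliced-time idea concrete, the following is a minimal sketch for the scalar analogue y' = y^p with p ≤ 1. The forward-Euler slice solver, the geometric ratio predictor, and all function names are illustrative assumptions standing in for the rescaled models and the paper's actual ratio-based formulae; the point is only the structure: predict slice starting values, solve all slices independently (hence in parallel), then correct the predictions and iterate.

```python
import numpy as np

def solve_slice(y0, t0, t1, p=0.5, n=100):
    """Fine sequential integrator (forward Euler) over one time slice."""
    y, dt = y0, (t1 - t0) / n
    for _ in range(n):
        y += dt * y**p
    return y

def sliced_time_solve(y0, T, slices=4, p=0.5, tol=1e-8, max_iter=20):
    """Solve y' = y**p on [0, T] by iterating over independent time slices."""
    t = np.linspace(0.0, T, slices + 1)
    # Hypothetical ratio-based predictor: assume every slice multiplies the
    # solution by the same ratio as the first slice (a stand-in for the
    # paper's actual formulae).
    r = solve_slice(y0, t[0], t[1], p) / y0
    starts = y0 * r ** np.arange(slices)
    for it in range(max_iter):
        # The slices are mutually independent here, so this loop is the
        # part that would be distributed across processors.
        ends = np.array([solve_slice(starts[k], t[k], t[k + 1], p)
                         for k in range(slices)])
        # Correct the predicted slice starts with the computed end values.
        new_starts = np.concatenate(([y0], ends[:-1]))
        if np.max(np.abs(new_starts - starts)) < tol:
            break
        starts = new_starts
    return ends[-1], it + 1
```

With this simple correction the iteration is guaranteed to converge in at most `slices` sweeps, since slice k becomes exact once slices 0..k-1 are; the quality of the ratio-based prediction determines how much sooner than that the tolerance is met.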