Monthly Archives: March 2013


ProB: a model-checker for data validation


ProB is an animator and model-checker for the B-method. It can be used to systematically check a specification for a range of errors. The constraint-solving capabilities of ProB can also be used for model finding, deadlock checking and test-case generation.

ProB is developed by the team of Michael Leuschel at the University of Düsseldorf, while commercial support is provided by the spin-off company Formal Mind.

ProB is now being used within Siemens and Alstom for data validation of complicated properties.


PredicateB: a predicate animator

PredicateB was created by ClearSy in 2005 to evaluate predicates expressed in the mathematical language of the B formal method. PredicateB was initially developed in Java and is part of the Rodin framework hosted on SourceForge.

History of PredicateB uses for railways data validation

A new version of PredicateB, PredicateB++, was developed in 2008 in C++ on top of the Qt libraries, with the aim of significantly improving performance.


Data validation in the railways

The Genesis

Metro de Malaga with Alstom Urbalis CBTC

Verifying railway systems covers many aspects and requires a large number of cross-verifications, performed by a wide range of actors including the designer of the system, the company in charge of its operation, the certification body, etc. Even if complete automation is not possible, any automatic verification is welcome, as it helps to improve the overall level of confidence.

Indeed, a railway system is a collection of highly interdependent sub-system specifications, and these dependencies need to be checked. They may be based on railway signalling rules (which are specific to every country, or even to every company within a single country), on rolling stock features (constant or variable train size or configuration), and on operating conditions.


Introduction to data validation

According to Wikipedia, "in computer science, data validation is the process of ensuring that a program operates on clean, correct and useful data. It uses routines, often called validation rules or check routines, that check for correctness, meaningfulness, and security of data that are input to the system. The rules may be implemented through the automated facilities of a data dictionary, or by the inclusion of explicit application program validation logic."

In other words, the data are checked against a data model. The data model can be simple or elaborate: it is up to the human validator to decide to what extent the data have to be validated.


Data validation is often understood as validating numbers, dates and text, i.e. verifying types and ranges. For that purpose, dedicated spreadsheet functions can do the job. However, in a number of applications the data model can be quite complex and the verification difficult to automate, leaving room for human error during validation.
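The kind of type-and-range checking described above can be sketched in a few lines of code. The following Python sketch is only illustrative: the rule format, field names and values are hypothetical, not taken from any of the tools discussed here.

```python
# Minimal sketch of rule-based data validation: each rule pairs a field
# name with an expected type and an allowed range (hypothetical format).

def validate(record, rules):
    """Return a list of error messages for fields violating the rules."""
    errors = []
    for field, (expected_type, low, high) in rules.items():
        value = record.get(field)
        if not isinstance(value, expected_type):
            # Type error: range check would be meaningless, so stop here.
            errors.append(f"{field}: expected {expected_type.__name__}, "
                          f"got {type(value).__name__}")
        elif not (low <= value <= high):
            errors.append(f"{field}: value {value} outside [{low}, {high}]")
    return errors

# Hypothetical example: a track-segment record with a speed limit (km/h)
# and a length (metres).
rules = {
    "speed_limit": (int, 0, 300),
    "length_m": (float, 0.0, 10000.0),
}
print(validate({"speed_limit": 80, "length_m": 1200.0}, rules))   # []
print(validate({"speed_limit": 999, "length_m": "n/a"}, rules))   # two errors
```

Real railway data models go well beyond this: rules typically relate several records to one another (cross-references, topological consistency), which is exactly where constraint-based tools such as ProB become useful.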

For digital photographers, “data validation generally addresses three questions. Is the archive complete? Did the files transfer properly? And, are the files uncorrupted?“. This interpretation is out of the scope of this article.
