The acquisition of manipulation skills in robotics involves the combination of object recognition, action-perception coupling and physical interaction with the environment. Several learning strategies have been proposed to acquire such skills. As for humans and other animals, the robot learner needs to be exposed to varied situations. It needs to try and refine the skill many times, and/or needs to observe several attempts of successful movements by other agents, in order to adapt and generalize the learned skill to new situations. Such a skill is typically not acquired in a single training cycle, which motivates the need to compare, share and re-use the experiments conducted each day by each individual robot. In LEARN-REAL, we propose to enable the learning of manipulation skills through simulation of the object, the environment and the robot, with an innovative toolset comprising: 1) a simulator with realistic rendering of variations, allowing the creation of datasets and the evaluation of algorithms in new situations; 2) a virtual-reality interface to interact with the robots within their virtual environments, in order to teach them various object manipulation skills in multiple configurations of the environment; and 3) a web-based infrastructure for principled, reproducible and transparent benchmarking of learning algorithms for object recognition and manipulation by robots. These features will extend existing software in an innovative manner: 1) and 2) will capitalize on the widespread development of realistic simulators for the gaming industry and on the low-cost virtual-reality interfaces associated with them, while 3) will harness the existing BEAT toolchain developed and maintained by Idiap. As in robotics, reproducible research is of crucial importance in other data science domains, including the definition of protocols for fair, transparent and facilitated evaluation of results. In LEARN-REAL, the expertise developed over years in the field of evaluation-as-a-service will be put at the service of object recognition and manipulation by robots, with competent handling of data, algorithms and benchmarking results. As a use case, we will study the scenario of vegetable and fruit picking and sorting.
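To give a schematic sense of component 1), the sketch below shows how randomized scene configurations could be sampled when building a dataset of simulated variations; the scene parameters, object names and the dataset-building functions are hypothetical illustrations only and do not correspond to an actual LEARN-REAL or BEAT API.

```python
import random
from dataclasses import dataclass

@dataclass
class SceneConfig:
    """Hypothetical description of one simulated scene variation."""
    object_name: str           # e.g. a fruit or vegetable to pick
    position_xy: tuple         # object position on the table (metres)
    orientation_deg: float     # rotation about the vertical axis
    lighting_intensity: float  # relative illumination level

def sample_scene(rng: random.Random) -> SceneConfig:
    """Draw one randomized scene configuration."""
    return SceneConfig(
        object_name=rng.choice(["apple", "pear", "tomato", "carrot"]),
        position_xy=(rng.uniform(-0.3, 0.3), rng.uniform(-0.2, 0.2)),
        orientation_deg=rng.uniform(0.0, 360.0),
        lighting_intensity=rng.uniform(0.5, 1.5),
    )

def build_dataset(num_scenes: int, seed: int = 0) -> list:
    """Generate a list of scene configurations; a full pipeline would
    pass each configuration to the simulator's renderer to obtain
    labelled images for object recognition and manipulation learning."""
    rng = random.Random(seed)
    return [sample_scene(rng) for _ in range(num_scenes)]

if __name__ == "__main__":
    for cfg in build_dataset(num_scenes=5):
        print(cfg)
```

In such a scheme, fixing the random seed makes each generated set of variations reproducible, which is the same property the web-based benchmarking infrastructure is meant to guarantee at the level of full experiments.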