In the competitive model of Software Crowdsourcing, crowd members search for tasks on a platform and submit solutions in pursuit of rewards. In this model, the task description is essential to support both the selection and the execution of a task. Despite its importance, little is known about the role the task description plays in these processes. To fill this gap, this paper presents a study that explores the role of documentation on the TopCoder platform, focusing on task selection and execution. We conducted a two-phased study with professionals who had no prior contact with TopCoder. Based on data collected through questionnaires, diaries, and a retrospective session, we investigated how people choose and perform tasks and what role documentation plays in the platform. We found that poorly specified or incomplete tasks lead developers to look for supplementary material or to invest more time and effort than initially estimated. To better support crowd members, we propose a model for structuring the documentation that composes a task description in competitive software crowdsourcing. We evaluated the model with another set of professionals, again relying on questionnaires, reports, and a retrospective session. Results showed that although the available documentation covered the elements of the proposed model, participants had difficulty finding the necessary information, suggesting the need for a reorganization. Participants agreed that the proposed model would help them understand task descriptions. Therefore, our study provides a better understanding of the importance of task documentation in software crowdsourcing and points out which information is important to the crowd.