Software crowdsourcing, the act of outsourcing software development tasks to a crowd in the form of an open call, is mediated by a platform and organized around tasks. In the competitive model, members of the crowd seek tasks and submit solutions in pursuit of financial rewards. In this context, the task description plays a central role, since understanding it supports both the selection and the development of a task. However, little is known about how task descriptions actually support these processes. To help fill this gap, this paper presents an empirical study exploring the role of documentation when developers select and develop tasks in software crowdsourcing. The TopCoder platform was studied in two stages: a case study with newcomers to crowdsourcing (in the classroom) and an interview-based study with industry professionals. We identified that documentation quality influences task selection. Tasks with unclear objective descriptions, unspecified required technologies, or missing environment setup instructions discourage developers from selecting them. We also found that poorly specified or incomplete tasks lead developers to search for supplementary material or to invest more time and effort than initially estimated. The results provide a better understanding of the importance of task documentation in software crowdsourcing and point out which information matters to the crowd.