28/08/2020

Introduction

  • The anonymous survey aimed to facilitate a self-assessment of the proposal by the involved team, as well as an evaluation of the support and tools provided, including lessons learned.

  • The proposal evaluation used the same questions and scoring scale as the call for proposals.

  • The resulting dataset contains 20 variables, including metadata, from 7 participants.

  • Median answering time: 5.8 minutes (minimum: 1.2 minutes; maximum: 81.1 minutes).

Aims

  • Collect opinions from participants on the proposal, following the official evaluation grid.

  • Collect feedback from participants on the tools, consultancy support, lessons learned, and suggestions for improvement.

Results of the proposal evaluation

Scoring system

Qualitative assessment options    Numerical interval
Very good / Very high             4.21 to 5.00
Good / High                       3.41 to 4.20
Regular / Average                 2.61 to 3.40
Poor / Low                        1.81 to 2.60
Very poor / Very low              1.00 to 1.80


  • Minimum score for acceptance: 30 out of 50 points.

  • Mean score given by the participants: 43.86 points, about 46% above the 30-point minimum (an illustrative band lookup follows below).
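
The minimal Python sketch below (not part of the original analysis) shows how a mean score on the 1.00-to-5.00 scale maps to the qualitative bands in the table above. It assumes the 50-point total is the sum of ten criteria scored from 1 to 5, so the survey mean of 43.86 points corresponds to a per-criterion mean of about 4.39.

# Illustrative sketch only; assumes the 50-point total is the sum of ten
# criteria scored on the 1.00-5.00 scale shown in the table above.

def qualitative_band(score: float) -> str:
    """Return the qualitative assessment for a score between 1.00 and 5.00."""
    if not 1.00 <= score <= 5.00:
        raise ValueError("score must lie between 1.00 and 5.00")
    bands = [
        (4.21, "Very good / Very high"),
        (3.41, "Good / High"),
        (2.61, "Regular / Average"),
        (1.81, "Poor / Low"),
        (1.00, "Very poor / Very low"),
    ]
    for lower_bound, label in bands:
        if score >= lower_bound:
            return label

# Survey mean of 43.86 points over 10 criteria = 4.386 -> "Very good / Very high"
print(qualitative_band(43.86 / 10))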

Predicted score (Bayesian bootstrap)

Results of a reproducible Bayesian bootstrap re-sampling

  • We used the Bayesian bootstrap, a re-sampling algorithm, to estimate a 95% confidence interval for the mean score given by participants. The procedure re-samples the dataset with replacement 4,000 times, recording each draw, to generate representative estimates from a small dataset (a minimal code sketch follows this list).

  • The histogram on the next slide shows the Bayesian highest density interval (HDI), which indicates a predicted score of 43.8 points for the proposal based on the survey results.

  • This prediction lies within a confidence interval of 41.4 to 46.5 points at the 95% confidence level (minimum score threshold for approval: 30 points).
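
As an illustration of the approach described above, the sketch below runs a Bayesian bootstrap of the mean with 4,000 draws and reports a 95% interval. It is only a sketch: the participant scores are placeholders rather than the real survey data, the original analysis toolchain is not specified in this report, and the equal-tailed percentile interval shown here is a simpler stand-in for the highest density interval mentioned above.

# Bayesian bootstrap sketch with placeholder data (NOT the actual survey scores).
import numpy as np

rng = np.random.default_rng(seed=42)   # fixed seed keeps the result reproducible

scores = np.array([40.0, 42.0, 44.0, 45.0, 45.0, 46.0, 45.0])  # placeholder totals, 7 participants

n_draws = 4_000
# Bayesian bootstrap: each draw samples observation weights from a uniform
# Dirichlet distribution and computes the weighted mean of the scores.
weights = rng.dirichlet(np.ones(len(scores)), size=n_draws)
boot_means = weights @ scores

lower, upper = np.percentile(boot_means, [2.5, 97.5])
print(f"bootstrap mean estimate: {boot_means.mean():.1f}")
print(f"95% interval: [{lower:.1f}, {upper:.1f}]")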

Results / team’s feedback

Results from qualitative data

  • Text-based questions are visualised using simple word clouds (see the next slides), which automatically filter out “stop words” (e.g., articles and prepositions).

  • The size of a word in the word cloud represents the number of times (frequency) it occurs in the dataset.

  • Each frequency is also associated with a colour: words with the same frequency in the dataset share the same colour.

  • Computer-based random sampling was used to select example answers from the dataset in a reproducible way (a minimal code sketch follows this list).
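
The sketch below illustrates how such a word cloud and a reproducible sample of answers could be produced. It is an assumption-laden example: it uses the Python wordcloud package and its built-in stop-word list, the answer strings are placeholders, and the frequency-based colour mapping described above is not reproduced here; the report does not state which tools were actually used.

# Illustrative sketch with placeholder answers (not the real survey responses).
import random

from wordcloud import WordCloud, STOPWORDS

answers = [
    "Teamwork and facilitation",
    "Online teamwork, reminder and good support",
    "Too long meeting",
]

# Word size reflects term frequency; common stop words (articles,
# prepositions, etc.) are filtered out via the built-in STOPWORDS set.
cloud = WordCloud(stopwords=STOPWORDS, background_color="white",
                  random_state=42).generate(" ".join(answers))
cloud.to_file("word_cloud.png")

# Reproducible selection of example answers: fixing the seed makes the
# random sample identical on every run.
random.seed(42)
print(random.sample(answers, k=2))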

Word cloud - Liked the least

Liked the least

  • No comments
  • More discussions on the content and getting the opinion of experts experienced in working with the EU
  • Use of modern technology; use of statistical data, as well as their comparison with the results of the project.
  • I did not like the problems with GoToMeeting online conference calls, when the password does not work, the internet connection is quite slow, and not everything can be heard during the workshop.
  • Not all the team were active and on time. Language barriers
  • Discussions about the content and sharing the vision on the idea from experts who used to work with the EU
  • Too long meeting

Word cloud - Liked the most

Liked the most

  • Teamwork and facilitation
  • The teamwork and splitting responsibilities
  • Teamwork, distribution of workload and issues, as well as planning your activities, efficient use of time.
  • I liked the collaborative work, the time efficiency of working on the same document and online synchronization, the capacity of Movimentar, and the suggestions on improvement of the concept note. I liked Eduardo and his team, who are so competent.
  • Online teamwork, reminder and good support
  • Teamwork, shared responsibilities, coherence in meetings
  • These people know what they are doing

Word cloud - Lessons learned

Lessons learned

  • Take a leading role in decision-making, at least in explaining the recommendations, as you have done with suggesting the prolongation of the project implementation period
  • The beneficiary table needs to be explained in more detail to avoid double work in the future.
  • It was good experience for me and I learned a lot from you.
  • Make more use of such platform
  • It would be good if you could take more of a leadership role in decision-making, or at least explain the reason for your suggestions, as you have done with the prolongation of the project duration

Word cloud - Additional comments

Additional comments

  • Maybe you could offer your services for the whole project proposal approach, starting from the concept note and full application through to kick-off
  • You are doing a great job and I wish you success.
  • Many thanks for the excellent workshop and support.
  • This model of collaboration for proposal writing works well. It is a competitive advantage of the company

Recommendations

  • Inputs to the results chain, outputs, and beneficiaries were shared by the local team right at the beginning of the assignment. The team made very good use of the online collaborative templates for the tables, including the results chain, outputs by activity, and beneficiaries. Most importantly, the team followed the recommended sequence of steps in the process (first the results chain and outputs, and only then budgeting). This allowed the budget discussion to be timed so that it could build on the output and beneficiary numbers (concrete results orientation). This was very important for the design process and is a best practice.

  • We see ourselves as facilitators of participatory processes for the design of the funding application by, and following inputs and ideas from, the local teams. We try to include local knowledge as much as possible for improved relevance, ownership, and better adaptation to local needs and context. That is why we avoid a traditional, top-down expert approach and prefer to build on the inputs suggested by client and partner staff.

  • Scale up and develop capacities in the use of online collaborative document editing, as well as management information systems such as Teamwork Projects. These can help increase productivity and reduce face-to-face meetings and risks, particularly in the context of the COVID-19 pandemic.