PICD MODULES

1.0 Participatory Integrated Community Development

2.0 Participatory Integrated Development Process

3.0 Participatory Impact Monitoring

4.0 Conflict Management and Peace Building

2.0 Participatory Integrated Development Process


  1. Input Tracking Matrix

The input tracking matrix is developed by project beneficiaries and service providers in a meeting held in the community, through the following steps.

  1. First, in order to track inputs, budgets, or entitlements, one must start with data from the supply side.
  2. Take this information to the community and the project/facility staff and present it to them. This is the initial stage of letting the community know their ‘rights’ and providers their ‘commitments.’
  3. Using the supply-side information above and the discussions in the sub-groups, one needs to finalize a set of measurable input indicators to be tracked. These will depend on which project or service is under scrutiny.
  4. With the input indicators finalized, the next step is to ask for and record the actual figures for each input from all of the groups and enter them in an input tracking scorecard, as shown in Table 1 below.
  5. Wherever possible, each statement by a group member should be substantiated with concrete evidence (a receipt, an account, the actual drugs or food, etc.). Claims can also be triangulated or validated across different participants. In the case of physical inputs or assets, one can inspect the input (such as a community water tank) to check whether it is complete and of adequate quality. The same applies to countable physical inputs, like the number of mosquito nets present at the CDDC office, in order to provide first-hand evidence about project and service delivery.
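The record-keeping in steps 4 and 5 above can be sketched in code. This is a minimal illustration only: the indicator names, figures, and helper names are hypothetical, not taken from any real project records or PICD tool.

```python
# Minimal sketch of an input tracking scorecard row (step 4 above).
# All indicator names and figures here are hypothetical examples.

def build_tracking_row(indicator, planned, actual, evidence=""):
    """Return one scorecard row with the planned-vs-actual gap."""
    return {
        "indicator": indicator,
        "planned": planned,       # entitlement from the supply side
        "actual": actual,         # figure reported by the groups
        "gap": actual - planned,  # negative means a shortfall
        "evidence": evidence,     # receipt, account, inspection, etc.
    }

scorecard = [
    build_tracking_row("Fingerlings per pond", 1000, 850, "delivery note"),
    build_tracking_row("Brooders", 4, 4, "physical inspection"),
]

for row in scorecard:
    status = "OK" if row["gap"] >= 0 else "shortfall"
    print(f"{row['indicator']}: planned {row['planned']}, "
          f"actual {row['actual']} ({status})")
```

Each row mirrors one line of the input tracking scorecard, with the evidence gathered in step 5 recorded alongside the figures.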

 

Table 1: Input Tracking Scorecard

| Input Indicator | Entitlement/Planned | Actual | Remarks/Evidence |
| --- | --- | --- | --- |
| Fingerlings per pond | | | |
| Ken Brew chicks per welfare group | | | |
| Project Funds | | | |
| Brooders | | | |

 

  2. Community Scoring of Performance

Community scoring of performance is done through the following process:

  1. Once the community has gathered, the facilitators (both local and external) face the task of classifying participants in a systematic manner into focus groups. The most important basis for classification must be usage, in order to ensure that there are a significant number of users in each of the focus groups. Without this critical mass, no useful data can be solicited. Each group should further have a heterogeneous mix of members based on age, gender, and occupation so that a healthy discussion can ensue.
  2. Each of the focus groups must brainstorm to develop performance criteria with which to evaluate the facility and services under consideration. The facilitators must use appropriate guiding or ‘lead-in’ questions to facilitate this group discussion. Based on the community discussion that ensues, the facilitators need to list all issues mentioned and assist the groups to organize them into measurable or observable performance indicators. The facilitating team must ensure that everyone participates in developing the indicators so that a critical mass of objective criteria is brought out.
  3. The set of community-generated performance indicators needs to be finalized and prioritized. In the end, the number of indicators should not exceed 5-8.
  4. Having decided upon the performance criteria, the facilitators must ask the focus groups to give relative scores for each of them. The scoring process can take separate forms – either through a consensus in the focus group, or through individual voting followed by group discussion. A scale of 1-5 or 1-100 is usually used for scoring, with the higher score being ‘better’.
  5. In order to draw out people’s perceptions better, it is necessary to ask the reasons behind both low and high scores. This helps explain outliers and provides valuable information and useful anecdotes regarding service delivery.
  6. The process of seeking user perceptions alone would not be fully productive without asking the community to come up with its own set of suggestions as to how things can be improved based on the performance criteria they came up with. This is the last task during the community gathering, and completes the generation of data needed for the CSC. The next two stages involved are the feedback and responsiveness component of the process.
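The individual-voting variant of the scoring in step 4 can be sketched as follows. The criteria names and votes are invented for illustration, and averaging is only one plausible way to combine individual votes before the group discussion.

```python
# Sketch of the individual-voting scoring variant: each member votes
# on a 0-100 scale and the votes are averaged into one group score.
# Criteria names and votes below are illustrative only.

def group_score(votes):
    """Combine individual 0-100 votes into one rounded group score."""
    return round(sum(votes) / len(votes))

votes_by_criterion = {
    "Positive Attitude of Farmers": [85, 75, 80],
    "Management of Green Houses": [70, 65, 75],
}

for criterion, votes in votes_by_criterion.items():
    print(f"{criterion}: {group_score(votes)}")
```

The group discussion that follows the vote (step 4) would then probe the reasons behind unusually high or low individual votes, as step 5 recommends.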

 

Table 2: A Sample Community-Generated Performance Scorecard for TCB Farming

 

| No | Input indicator | A: Current situation (baseline) | B: Measurable result (target) for the month | C: Actual result achieved | D = C - B: Variance (+/-) | Reasons (why the difference) | Score (0-100) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| 1 | 60 farmers trained on TCB husbandry | 0 | 13 | 10 | -3 | | 76 |
| 2 | Positive Attitude of Farmers | | | | | | |
| 3 | Management of Green Houses | | | | | | |
| 4 | Quality Management of Farms | | | | | | |
| 5 | Equal access to Green Houses for all community farmers | | | | | | |
| 6 | Equal access to Extension services for all community farmers | | | | | | |

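The variance column of Table 2 is a simple subtraction, D = C - B. The sketch below just makes that rule explicit, using the figures from row 1 of the table; the function name is our own.

```python
# Variance column of Table 2: D = C - B, i.e. actual result achieved
# minus the measurable target for the month.

def variance(target, actual):
    """Positive = target exceeded; negative = shortfall."""
    return actual - target

# Row 1 of Table 2: target B = 13 farmers trained, actual C = 10.
print(variance(target=13, actual=10))  # -3, matching column D
```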

Table 3: A Sample Community Scorecard within a Focus Group

| Community-Generated Criteria | Very Bad | Bad | OK | Good | Very Good | Remarks |
| --- | --- | --- | --- | --- | --- | --- |
| 1) Positive Attitude of Farmers | | | | | 80 | Farmers were trained well |
| 2) Management of Green Houses | | | | 70 | | Rain water seeps through the Green House mesh; disinfectant available and in use; parts of the mesh are worn out |
| 3) Quality Management of Farms | | | | | 85 | Farms have clean manure applied on TCB stems |
| 4) Equal access to Green Houses for all community farmers | | | | | 90 | Restricted access, depending on the availability of the community member responsible for opening |
| 5) Equal access to Extension services for all community farmers | | | | | 90 | Farmers have equal chances to be visited and advised, unless a farmer is absent from their farm |

 

 

Score the table using the following bands:

Very Bad – 20%   Bad – 35%   OK – 50%   Good – 60%   Very Good – 80% and above
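One plausible reading of these bands as score thresholds is sketched below. Only "Very Good – 80% and above" is stated explicitly, so the lower cut-offs are assumptions.

```python
# Map a 0-100 score to a rating band. Only ">= 80 is Very Good" is
# explicit in the text; the lower thresholds are assumed readings of
# "Bad - 35%, OK - 50%, Good - 60%".

def rating(score):
    if score >= 80:
        return "Very Good"
    if score >= 60:
        return "Good"
    if score >= 50:
        return "OK"
    if score >= 35:
        return "Bad"
    return "Very Bad"

print(rating(70))  # Good, as for "Management of Green Houses" in Table 3
print(rating(90))  # Very Good
```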

 

Related topics

About PICD

2.1 Unit 02: Community entry and re-entry

2.2 Unit 03: Awareness Creation & Attitude Change phase

2.2.2 Secret in the Box

2.2.3 The ‘Diamond Farm’

2.2.4 Take a Step

2.2.5 The Boat Is Sinking

2.2.6 The 65-Year-Old Couple

2.3 Unit 04: Situational analysis and visioning phase

2.3.1 Community Mapping

2.3.2 Resource Bag

2.3.3 The 24 hour day schedule

2.3.4 Seasonal calendar

2.3.5 Family vision

2.3.6 Pair-wise Ranking

2.3.7 Visioning matrix

2.4 Unit 5: Planning phase

2.4.1 Long Term Goals

2.4.2 Selection and Formation of CDPC

2.4.3 Visioning Matrix Discussion

2.4.4 Short-term Goals

2.4.5 Future Mapping

2.4.6 Community Action Plan

2.4.7 Resource Mobilization

a. Wealth Ranking

b. Venn Diagram II – External Institutions

Election of the CDDC

2.5 Unit 6: Implementation Phase