Our Watch

    Gather and report data

    Assessing the implementation of a program or activity helps your local government understand what works and what can be improved. This process is often referred to as monitoring and evaluation.

    Monitoring refers to the collection of information during the activity to understand if it is on track to achieve the expected results. Evaluation usually happens at the end of an activity and can assess whether the overall objectives have been achieved.

    For more on measuring population-level progress, check out the Our Watch guide Counting on change.


    Indicators are markers of your project's progress and impact. Your indicators should be SMART (specific, measurable, achievable, relevant and time-bound).

    Indicators for measuring progress and participation (sometimes known as process indicators) could include:  

    • the number of participants in an activity  
    • the number (and type) of events held 
    • the rates of participation for different groups within the community, including religious and ethnic groups  
    • the levels of satisfaction with the initiative  
    • the number and activity level of inter-agency partnerships.

    Indicators for measuring impact could include changes in:  

    • individual attitudes about gender equality 
    • knowledge and understanding of the drivers of violence against women 
    • commitments by community groups to take action to prevent violence against women 
    • confidence in talking about gender equality and violence against women (for example, being able to explain the link between gender inequality and violence against women) 
    • the ability of people to engage readily with the issue 
    • higher levels of reporting of incidents of violence against women initially, then lower levels of reported incidents over time. 

    Data collection

    There are two main types of data:

    • quantitative information refers to numbers or percentages. This can include the number of women, men and people who are gender diverse who have been reached, completed surveys, attended activities, contributed to planning, responded to questionnaires or changed their attitudes.  
    • qualitative information refers to opinions, views and experiences. This can include people’s stories of their experience with the initiative, views about whether they think they have more knowledge or a better understanding, reflections about whether they now think or act differently and summaries about what they have learnt.   
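    To illustrate how quantitative data might feed into your indicators, here is a minimal sketch of tallying hypothetical survey responses. The records, field names and figures below are invented for illustration only, not drawn from any real program:

```python
# Illustrative only: summarising quantitative indicator data from
# hypothetical survey responses. Field names are assumptions.
from collections import Counter

# Each record represents one survey response from an activity participant.
responses = [
    {"gender": "woman", "attended": True, "attitude_improved": True},
    {"gender": "man", "attended": True, "attitude_improved": False},
    {"gender": "gender diverse", "attended": True, "attitude_improved": True},
    {"gender": "woman", "attended": False, "attitude_improved": False},
]

# Participation count by gender (a process indicator).
participants = Counter(r["gender"] for r in responses if r["attended"])

# Share of attendees reporting improved attitudes (an impact indicator).
attendees = [r for r in responses if r["attended"]]
improved_pct = 100 * sum(r["attitude_improved"] for r in attendees) / len(attendees)

print(participants)
print(f"{improved_pct:.0f}% of attendees reported improved attitudes")
```

    Even a simple tally like this, kept consistently across activities, makes it easier to report against your indicators and compare results over time.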

    Information can be collected using a range of different methods. Before you start, consider the scope of the evaluation (ensure your evaluation is commensurate with the size of your program/funding), how much data you may collect and what you want to collect it for. Collection methods include:  

    • questionnaires or surveys  
    • interviews  
    • focus groups  
    • feedback forms  
    • observation 
    • reflective journals (often kept by a program officer).


    Once your local government collects the results (data) through monitoring, you will need to conduct an analysis to understand what the results mean and draw a conclusion from them. Local governments can use the following questions to analyse the data collected: 

    • Did the activity go according to plan?   
    • Did it achieve the desired outcomes?  
    • Were there any surprising outcomes?  
    • What obstacles came up during implementation?  
    • Was the focus of the activity the right one?  
    • What was learnt about the process of creating and implementing a prevention activity?   
    • What do the stakeholders identify as the most important findings? Are these different depending on who the stakeholders are? 
    • What were the limitations of the activity?   
    • What do these findings mean for future prevention work? 

    Document the analysis in a clear and structured report that includes key findings from your evaluation. This can be used to inform your local government’s next steps or to share the findings with your stakeholders and community members. 

    Ethical considerations

    Ethics must be considered when evaluating prevention of violence against women activities. Ethics approval may be required in certain circumstances. Universities and other research organisations can assist with this.  

    Consider the following guidelines:  

    • put risk management strategies in place, and ensure that everyone carrying out the monitoring and evaluation activities has a sound knowledge of prevention of, and responses to, violence against women  
    • make sure that the evaluation is of benefit to those involved and does not harm them  
    • protect confidentiality of any sources and respondents  
    • ensure the monitoring and evaluation processes are inclusive, culturally safe and respectful, and responsive to diversity  
    • be open and transparent with the monitoring and evaluation so participants can give informed consent to participate  
    • model respectful behaviour  
    • ensure data is stored in a safe and secure way, in line with relevant legislation.

    Other considerations

    • Identify who the evaluation is for and what they need/want to know.
    • Decide how you will share and communicate findings from the evaluation. 
    • Ensure the evaluation has a well-understood purpose.    
    • Include data for all the different groups including different gender identities (not just men/women, male/female).     
    • Involve target groups/populations in the evaluation process to empower and engage them in the change process and to better understand what works and what doesn't.
    • Decide who will evaluate your work. Evaluation can be undertaken by an external consultant or conducted internally with a clear understanding of evaluation principles and the right tools.
    • Any evaluation of prevention strategies should follow the feminist principles of primary prevention. A feminist evaluation emphasises the importance of participatory approaches, empowerment and using evaluation for social justice.     
    • Ensure that your evaluation framework can measure both the intended and unintended impacts of your work so that you can celebrate all positive impact in your community. 
    • Be committed to openness and transparency. An evaluation is a learning process and this learning is important to your organisation, partners and the wider sector. 

    Case study

    City of Charles Sturt: Keeping track and building an evidence-base for future funding

    The City of Charles Sturt recognised that during the 16 Days of Activism against Gender-Based Violence, they would need a simple system to help collect and analyse data so they could measure the effectiveness and impact of their work. They recognised that this data would provide important evidence for future funding applications.

    They identified key objectives for their work within the organisation, in the community, in local schools and with various key stakeholders including domestic and family violence services in their local government. They then identified the metrics (indicators) for measuring their achievements, the actions required to achieve the objectives, who was responsible and accountable for each activity, who should be consulted, the resources required, and the date by which the actions need to be completed.

    They paid specific attention to data collection methods that were culturally appropriate and accessible for all involved. For example, they factored in adequate time to collect verbal feedback through short interviews or informal conversations, rather than expecting all participants to be comfortable with filling in evaluation forms.

    The final reports detailed what they did, how they did it, the impact of the activity, and who they were able to engage. These reports have been shared with neighbouring local governments who wish to undertake similar projects. They will also be used to help build a case for the local government to continue funding the broader ‘Safe and Diverse Communities’ work.

    Next step

    Communicate the results

    Supported by the Australian Government Department of Social Services.