Monitoring and Evaluation Officer

Dedicated Women Social Services Organization

Position Title: Monitoring and Evaluation Officer

Job Location: Kabul
Nationality: National
Category: Monitoring and Evaluation
Employment Type: Full Time
Salary: 300 - 450
Vacancy Number: SBL-DWSSO-006
No. Of Jobs: 5
City: Kabul city with travel to provinces
Organization: Dedicated Women Social Services Organization
Years of Experience: At least 5 years of work experience in monitoring and evaluation of development or humanitarian projects (Annex 1_Terms of Reference .docx), including designing M&E frameworks, developing indicators, and conducting both qualitative and quantitative evaluations.
Contract Duration: Seven Months
Gender: Male/Female
Education: University degree in a relevant field such as Statistics, Economics, Education, Business Administration, or Development Studies. Specialized training or certification in Monitoring & Evaluation is a strong asset (Annex 1_Terms of Reference .docx).
Close date: 2025-03-15

About Dedicated Women Social Services Organization:

Dedicated Women Social Services Organization (DWSSO) is a pioneering, women-led national non-governmental organization (NGO) headquartered in Daikundi Province, Afghanistan, with a mission to empower women, strengthen community resilience, and alleviate poverty. Established in 2016 and registered with the Ministry of Economy (Registration No. 3969), DWSSO operates as a trusted partner in Afghanistan’s development landscape, bringing life-changing resources and opportunities to some of the country’s most vulnerable populations, particularly women, children, and youth. DWSSO’s robust presence spans Daikundi, Ghor, Nooristan, Zabul, Faryab, Paktika, Nangarhar, Balkh, Herat, and Helmand Provinces, with a Liaison Office in Kabul to facilitate national and international collaboration.

 

Our organization leads rights-based, inclusive initiatives that uplift marginalized Afghan communities through sustainable interventions in food security, vocational training, literacy, livelihood support, and social protection. With a focus on fostering self-reliance, economic empowerment, and social cohesion, DWSSO works to advance the Sustainable Development Goals (SDGs) and align with Afghanistan’s National Development Strategy, creating pathways out of poverty and toward dignified livelihoods.

 

Through strong partnerships with international donors, UN agencies, and local civil society networks, DWSSO integrates protection, gender equity, and accountability principles into every aspect of its programming. Our skilled personnel and established community relationships ensure effective, culturally sensitive service delivery across complex regions. As an active participant in key humanitarian clusters, DWSSO’s innovative approach and commitment to humanitarian values position us as a critical driver of sustainable progress and social change in Afghanistan.

Job Descriptions:

Overview:

The M&E Officer is responsible for developing and implementing the monitoring and evaluation strategy for the SBL project. This includes setting up systems to collect, analyze, and report data on project inputs, activities, outputs, and outcomes. The M&E Officer ensures that the project’s progress can be measured against its targets (e.g., 21,000 learners gaining literacy and skills) and that data-driven insights inform decision-making. This role will manage all project data (learner enrollment, attendance, assessment results, etc.), ensure data quality, train staff in data collection, and coordinate any evaluations or studies (such as post-training tracer studies). The M&E Officer also leads on documenting results and preparing the final evaluation report for the project.

 

Key Responsibilities:

  • M&E Plan Development: Develop a comprehensive M&E plan at the project’s inception, clearly defining the performance indicators, targets, data sources, and collection frequency for all key aspects of the project (enrollment, completion, learning outcomes, etc.) (Annex 1_Terms of Reference .docx). Align the plan with the project’s logframe and UNESCO’s reporting requirements. Get the M&E plan approved by the Program Manager and share it with all team members so they understand the metrics being tracked.
  • Tool Design and Data Systems: Design or adapt data collection tools needed for the project. This includes enrollment forms, attendance sheets, class monitoring checklists, training evaluation forms, and pre/post literacy assessment tests. Set up a database or utilize UNESCO’s NFEMIS/KOBO Toolbox for digital data entry (Annex 1_Terms of Reference .docx). Customize Kobo forms for field staff to use on smartphones for submitting data on enrollments, attendance, etc. Ensure all tools and systems are user-friendly and culturally appropriate.
  • Training of Staff on M&E: Conduct training sessions for all relevant project staff (PCMs, DCMs, Facilitators, Master Trainers) on the M&E system. Explain each indicator and the importance of accurate data. Train them on how to use data collection tools and submit data (for example, how facilitators should fill daily attendance, how DCMs use the Kobo app to register learners). Emphasize data quality, honesty, and timeliness in reporting.
  • Data Collection and Compilation: Oversee the routine collection of data. The M&E Officer will collect enrollment data once classes start and ensure that all classes and learners are registered in the database (including disaggregation by gender, age, and location) (Annex 1_Terms of Reference .docx). On a monthly basis, gather attendance summaries from each class (via DCMs) and any ongoing monitoring data, and compile results from any tests or assessments conducted. Maintain a master spreadsheet/database that is updated in near real time as data comes in.
  • Data Verification and Quality Control: Implement mechanisms to verify data accuracy. This could include random spot checks of classes to compare reported attendance with actual, calling a sample of learners or facilitators to verify information, and cross-checking data sources (e.g., ensuring the number of facilitators trained matches the number of facilitators reporting classes). Clean the data by checking for inconsistencies or missing entries and resolving these with field staff promptly.
  • Monitoring Field Activities: Together with the Field Coordination Officer, plan and undertake field monitoring visits. Use structured observation checklists during visits to evaluate the quality of teaching, the level of learner participation, and the usage of materials. Interview some learners and facilitators for qualitative feedback. These visits will help validate the information being reported and provide context. Document findings and share with the team for continuous improvement.
  • Progress Analysis: Analyze collected data to produce insights on project performance. For example, calculate the average attendance rate per class, dropout rates, literacy test pass rates, and compare these with targets or between provinces. Identify trends (e.g., improving attendance over time, or higher dropout in certain areas) and flag any indicators that are off track. Provide this analysis in a simple dashboard or summary each month to the Program Manager and discuss implications.
  • Reporting: Contribute to the drafting of monthly and quarterly progress reports by providing the latest data and interpretation. Ensure that every report includes updated values for each key indicator and highlights notable changes or achievements (e.g., “X% of the target 21,000 learners have been enrolled to date, of whom Y% are female” (Annex 1_Terms of Reference .docx)). Prepare visualizations (charts, tables) as needed to communicate data clearly.
  • Mid-Term Review: If the project performs an internal mid-term review or reflection workshop, provide the necessary M&E input: a mid-term M&E report with results to date, and facilitation of discussions on what the data indicates about needed adjustments.
  • Final Assessment & Tracer Study: Plan and coordinate the final evaluation of the project. This includes overseeing endline literacy and skills assessments to measure learning gains. Compile all endline data on outcomes (how many learners reached a functional literacy level, how many started an income-generating activity, etc.). Collaborate with UNESCO on any tracer study of beneficiaries (Annex 1_Terms of Reference .docx), which might involve follow-up with a sample of graduates after course completion to see whether they utilized their skills. Ensure that tracer study data is collected and analyzed to show the impact on participants’ lives.
  • M&E Reporting: Produce the final M&E report for the project (Annex 1_Terms of Reference .docx). This comprehensive report should cover the entire duration, comparing targets vs achievements for each indicator, explaining the factors behind successes or shortfalls, and including qualitative findings (e.g., testimonials, case studies). It should also highlight lessons learned and recommendations for future programming. Work with the Program Manager to integrate this into the overall final report to UNESCO.
  • Data Handover: At the project’s end, ensure all collected data and M&E documentation (raw data files, databases, filled forms, reports) are properly archived. Provide UNESCO and the implementing organization with a copy of the cleaned dataset and any analysis files, respecting data privacy protocols (no personally identifiable information shared without consent).
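
As an illustration of the routine analysis the responsibilities above describe (attendance rates, dropout rates, and flagging off-track classes), the Python sketch below shows one possible approach. The field names, the 75% attendance target, and the sample records are hypothetical examples, not the project's actual Kobo schema or thresholds.

```python
# Hypothetical sketch of per-class indicator aggregation for M&E updates.
# All field names, targets, and sample values are invented for illustration.
from collections import defaultdict

def summarize_classes(records, attendance_target=0.75):
    """Aggregate per-class attendance and dropout rates and flag
    classes whose attendance falls below the target."""
    stats = defaultdict(lambda: {"sessions": 0, "present": 0,
                                 "enrolled": 0, "completed": 0})
    for r in records:
        s = stats[r["class_id"]]
        s["sessions"] += r["sessions_held"]   # sessions held this period
        s["present"] += r["attendances"]      # learner-attendances recorded
        s["enrolled"] = r["enrolled"]         # latest enrollment count
        s["completed"] = r["completed"]       # learners still active/completed
    summary = {}
    for class_id, s in stats.items():
        possible = s["sessions"] * s["enrolled"]
        attendance_rate = s["present"] / possible if possible else 0.0
        dropout_rate = 1 - (s["completed"] / s["enrolled"]) if s["enrolled"] else 0.0
        summary[class_id] = {
            "attendance_rate": round(attendance_rate, 2),
            "dropout_rate": round(dropout_rate, 2),
            "flagged": attendance_rate < attendance_target,
        }
    return summary

# Hypothetical monthly records, one per class
records = [
    {"class_id": "KBL-01", "sessions_held": 20, "attendances": 400,
     "enrolled": 25, "completed": 23},
    {"class_id": "KBL-02", "sessions_held": 20, "attendances": 280,
     "enrolled": 25, "completed": 18},
]
summary = summarize_classes(records)
```

A summary like this could feed the monthly dashboard described under Progress Analysis, with flagged classes discussed with the Program Manager.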

Expected Deliverables:

  • Approved M&E Plan: An M&E plan document or matrix completed and approved within the first month of the project, detailing each indicator (output, outcome), baseline values (if any), target values, data sources, and collection schedule (Annex 1_Terms of Reference .docx). This serves as the guiding framework for all monitoring activities.
  • M&E Tools and Database: A complete set of data collection tools (forms, questionnaires, electronic surveys) and a functional database system (e.g., Kobo Toolbox deployment and Excel master sheet). These should be in place early in the project and refined as needed. All project staff should have the final versions of forms and be trained in their use by the end of the second month.
  • Training on M&E: Training completion evidence, such as training agenda, materials, and attendance sheets, showing that key field staff (PCMs, DCMs, etc.) have been trained on M&E and data reporting procedures.
  • Monthly M&E Updates: Brief M&E reports or dashboards each month (could be part of the progress report or separate) highlighting key statistics: e.g., number of active classes, learners enrolled vs. target, average attendance, % of female learners, number of facilitators trained, etc. These updates ensure the team is aware of progress and can act on any anomalies quickly.
  • Field Monitoring Reports: Reports from monitoring visits, with at least one visit report per province (or more for high-priority areas) during the project. Each report should cover observations, compliance with standards, data verification results, and any recommendations for that province.
  • Mid-term M&E Summary: A mid-point performance summary (for internal use) that aggregates data and results achieved in the first half of the project, helping guide the second half.
  • Final Evaluation Data and Report: By the end of the project, a final evaluation report prepared, containing an analysis of all key performance indicators (e.g., how many of the 21,000 learners actually enrolled and completed, literacy rate improvements, etc.), outcome assessments, and results of any tracer study (Annex 1_Terms of Reference .docx). This report should be evidence-based, with supporting data annexes.
  • Data Sets and Annexes: Cleaned data sets (in Excel/CSV or Kobo) of all collected data, submitted as annexes to the final report or stored for audit. This includes the final list of learners, test score sheets, and monitoring checklists, demonstrating due diligence in data collection.
  • Lessons Learned Document: A section or separate document compiling lessons learned and recommendations for future projects, derived from the M&E findings. For example, if certain teaching methods proved particularly effective (as evidenced by higher literacy gains in those classes), or if certain challenges (like seasonal dropouts) were seen, note these insights for future programming.
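
The data handover deliverables above call for cleaned datasets with no personally identifiable information. As a hedged illustration, the Python sketch below drops PII columns from a learner CSV before archiving; the column names are invented examples, not DWSSO's real schema.

```python
# Hypothetical sketch: removing personally identifiable information (PII)
# from a learner dataset before archiving or handover.
# Column names are illustrative, not the project's real schema.
import csv
import io

PII_COLUMNS = {"name", "father_name", "phone", "tazkera_no"}

def strip_pii(csv_text):
    """Return a copy of the CSV with PII columns dropped, keeping
    analysis fields (e.g., gender, province, scores)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    kept = [c for c in reader.fieldnames if c not in PII_COLUMNS]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=kept)
    writer.writeheader()
    for row in reader:
        writer.writerow({c: row[c] for c in kept})
    return out.getvalue()

# Example with a hypothetical learner record
raw = "learner_id,name,gender,phone,score\nL001,Anon,F,0700000000,85\n"
clean = strip_pii(raw)
```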

Job Requirements:

  • Education: University degree in a relevant field such as Statistics, Economics, Education, Business Administration, or Development Studies. Specialized training or certification in Monitoring & Evaluation is a strong asset (Annex 1_Terms of Reference .docx).
  • Experience: At least 5 years of work experience in monitoring and evaluation of development or humanitarian projects (Annex 1_Terms of Reference .docx). Experience should include designing M&E frameworks, developing indicators, and conducting both qualitative and quantitative evaluations. Proven experience conducting M&E for projects implemented by international NGOs or UN agencies is required (Annex 1_Terms of Reference .docx). Experience in the education sector (especially literacy or training programs) in Afghanistan is highly desirable.
  • Technical Skills: Proficiency in data analysis and visualization tools. Advanced Excel skills are required; familiarity with statistical software (SPSS, STATA) or data visualization platforms is an advantage. Experience with digital data collection platforms (such as Kobo Toolbox, ODK) and database management is expected (Annex 1_Terms of Reference .docx).
  • Analytical Skills: Strong analytical thinking to interpret data and identify trends or anomalies. Able to triangulate different data sources (quantitative data with qualitative field observations) to get a complete picture. Attention to detail is crucial for maintaining data quality.
  • Training and Communication: Ability to train and mentor staff on M&E concepts and tools. Good communication and presentation skills to convey complex data findings in simple terms to stakeholders. Fluency in English for report writing is required, and proficiency in Dari and/or Pashto is necessary to communicate with field staff and to design local-language tools where needed (Annex 1_Terms of Reference .docx).
  • Evaluation Design: Knowledge of evaluation methodologies (formative, summative, impact evaluations) and experience conducting surveys, focus groups, or interviews for data gathering. If a tracer study is to be done, experience in designing and executing such follow-up studies is needed.
  • Quality Focus: Demonstrated ability to implement data quality assurance processes. References or examples of past work where the candidate improved data reliability or introduced innovations to capture results are appreciated.
  • Report Writing: Excellent report writing skills with the ability to produce well-structured, insightful, and visually appealing M&E reports. A sample of a past M&E report may be requested.
  • Independence: Capability to operate independently with minimal supervision, while also collaborating closely with program staff. M&E sometimes requires taking initiative to point out when things are not on track – the candidate should be comfortable giving that feedback constructively, backed by evidence.
  • Ethical Standards: Commitment to use data ethically, protect the privacy of respondents (learners and communities), and uphold transparency and accuracy in reporting findings.

Submission Guidelines:

Interested candidates should submit their applications (CV, Tazkera, and educational documents) by e-mail or in hard copy (marked confidential, with the vacancy announcement number clearly indicated on the sealed envelope) to:

House# 10, Left Lane 02, Street 13, Wazir Akbar Khan, Kabul, Afghanistan.

Email address: hr@dwsso.org

Phone: 0799 149 513

 

Submission Email:

hr@dwsso.org
