How to define and identify lessons learned?
Dear EvalForward colleagues,
As M&E professionals, we are often requested to identify and consolidate “lessons learned”.
From my experience, there is no shared understanding or definition of what a lesson learned is.
Often I come across lessons that sound like findings, conclusions, or sometimes even like recommendations and therefore belong somewhere else.
I would be interested to learn from this community: what is the definition of a lesson learned that you adopt, or one you find useful?
Are there any resources that can be shared which lay out a step-by-step approach to identify lessons learned that can be useful for our practice? Can you recommend any workshop you have attended or professionals that can facilitate a workshop on this topic?
Thanks in advance!
Emilia Bretan, Evaluation Manager, FAO
Thank you for all the responses and interesting and insightful inputs.
What motivated me to write this post was the impression that most lessons learned identified in evaluation reports are either not lessons or are poorly formulated and rarely used; in other words, evaluators are generating (but are we?) knowledge that does not really serve any purpose. I also had the impression that behind this issue lay the lack of a shared understanding of the concept and of the processes to identify and capture these specific types of evidence.
So, how do we identify real and useful lessons learned?
I will try to summarize the key points raised:
1. The diversity of responses makes it clear that, as evaluators, we still do not have a shared understanding of what lessons learned are. Often, the lessons seem to be there just to tick a box in the reports' requirements.
2. What are the key elements of lessons? Lessons should:
- be formulated based on experience and on evidence, and on change that affects people’s lives;
- be observable (or have been observed);
- reflect the perspective of different stakeholders (therefore, observed from different angles);
- reflect challenges faced by stakeholders in different positions (i.e. also donors may have something to learn!);
- be something new that represents valuable knowledge and/or ways of doing things;
- reflect what went well and also what did not go well;
- be able to improve interventions;
- be specific and actionable (while adaptable to the context) so that they can be put into practice.
I really like the approach of ALNAP, which synthesizes lessons that should be learned. From my perspective, a lesson is only really learned if you do it differently next time; otherwise, it is not yet learned! Which is why I tend to call the ones we identify during evaluations simply "lessons".
3. How do we collect lessons learned? The collection process should:
- be systematic and result from consultations with different stakeholders;
- include what (content), for whom (stakeholders) and how (context!!);
- be clear on who the target audience is (e.g., operational staff, staff involved in strategic operations, policy makers, etc.);
- take into account power dynamics (donors, implementers, beneficiaries etc);
- consider the practical consequences (e.g. is it possible to adapt?);
- include operational systems/feedback mechanisms to ensure that the findings will be discussed and implemented when that is agreed;
- balance rigour and practicality.
4. My final question was: how do we (try to) guarantee that the lessons will actually be "learned", i.e. used and put into practice? (Here I am using the interesting concept from ALNAP that lessons are formulated to be learned.) Some tips shared were:
- associate strategies for putting lessons into practice, including incentives;
- lessons should inform or be aligned with the recommendations;
- recommendations should be reflected in the "management" response; and
- management should be accountable for the implementation.
It’s good to see other organizations and colleagues are interested and that there are some resources available. I hope that this can help us improve our practices. I have compiled below approaches and tools recommended, and examples and resources shared.
Approaches or tools recommended as effective to capture and process lessons learned:
Resources and examples shared:
Kien Nguyen Van, Principal Researcher, Plant Resources Center
Dear all friends and colleagues,
Greetings from PRC, Hanoi, Vietnam. Happy Lunar New Year 2023.
In my opinion, the term "lesson learned" seems familiar to everyone, but it is not easy to define and identify in real life, especially in the agriculture sector. In theory, a lesson learned provides the correct answer, but this holds only in basic/mechanical science and in psychological/behavioural science. In practice, lessons learned can only be defined and identified once they have been applied and disseminated under comparable conditions and environments. Moreover, a lesson is confirmed as learned when it is correctly applied in a variety of situations; the differences between situations must be considered and resolved by the learners. In this sense, learners can be seen as modifiers, innovators, even creators, although they are rarely recognised as such because the differences are unclear and very small.
Sylvain Limbisa, Expert in Research, Monitoring-Evaluation and Accountability, Centre de Recherche pour le Développement Intégral (CREDI)
Dear Emilia, I would like to contribute on this very important point: how to define and identify lessons learned? Indeed, lessons learned can be defined as a reasoned approach that serves to discover mistakes and strengths during the implementation of activities, with the aim of using them as a reference: correcting, adapting and adopting the same activities to achieve the objectives. To this end, here is the identification process I use during evaluations and project management:
1. Quarterly implementation fidelity assessments ("FMO"): this is a unit of measurement to assess the ToC designed on the basis of project effects, outcomes, activities, outputs, etc., whereby the assessment measures whether the planned activities are achieving the outputs each quarter. The monthly MEAL report is then drafted in the form of a dashboard to demonstrate the quarterly progress of activity implementation.
2. Quarterly workshops: this activity is very important; the overall picture of weaknesses and strengths is shared with all departments (technical, logistics, grants, finance, etc.) to identify the mistakes or strengths that led activities to achieve, or fail to achieve, their planned outputs, in order to draw out an effective model.
3. Focus groups during the workshop: this last step brings together the managers of the different departments to reflect on the weaknesses or strengths, and on the real causes by which each department contributed to success or failure, in order to provide a solution, which is taken as a lesson for launching the next quarter of activities. These steps are also taken into account during the impact evaluation, to ensure that all lessons are included in the report for the next project.
[translated from French]
Daniel Ticehurst, Monitoring > Evaluation Specialist, freelance
First, we cannot always assume that those who claim to be learning organisations are necessarily so. I have learned that, very often, the most conceited and intolerant are the ones who congratulate themselves on their capacity to learn and their tolerance of other views.
My crude answer is that putting lessons to work is about strategies associated with incentives to do so: the organisation should be accountable not only for the quality of evaluand objectives and their achievement, but also for their adjustment as operating circumstances change; that is, accountability extends to accountability to learn.
My understanding of current practice, in relation to evaluations, for ensuring lessons learnt are heeded and put into practice is typically that:
a) the lessons learnt inform or are aligned with the recommendations (their consequences), lest they be missed altogether;
b) the recommendations are reflected in the "management" response; and
c) management actually implements them.
That's the theory and it defines much practice, yet a lot of this depends on who holds management to account in following through - to what extent are they accountable to learn?
Thanks again and best of luck moving forward with this.
Emilia Bretan, Evaluation Manager, FAO
Thank you so much for your interesting comments, insights and resources!
While going through all the comments and resources provided, I have another question:
Assuming that (i) we are capturing actual/real lessons, (ii) based on experience/evidence, (iii) actionable, that they (iv) reflect a diversity of perspectives and (v) are applicable in other contexts (some of the characteristics of lessons I take from your contributions), how do we (try to) guarantee that they will be actually “learned”? (here I am using the interesting concept of ALNAP brought up by Jennifer Doherty, that lessons are formulated TO BE learned).
In other words, which are strategies that work to support putting lessons into practice?
Thanks for this engagement!
Daniel Ticehurst, Monitoring > Evaluation Specialist, freelance
Hi and many thanks for such a useful post, and great to see how it has provoked so many varied and interesting responses from other community members.
While I do not have any resources cum text book answers in mind, my experience has taught me three things:
1. Crudely put (I apologise), there are two types of lessons, each with its own question that a well-phrased lesson needs to answer: what went well, for whom and how? And what did not go quite so well, for whom and why? An adequate balance is not always struck between the two, perhaps due to the power dynamics between those who fund, those who do, and those intended to benefit from development aid. Implied from this:
2. Be clear, and search for who has learnt what from whom, why this is important, and what the consequence is. Of course, providing discretion and opportunity to learn from those who matter most (the intended clients) is important; yet it is also the responsibility of senior managers, who often know little about the practical consequences of their decisions on the ground, to do the same for those who deliver the support. Their silence often stifles learning among them, and so, too, the programme's or organisation's capacity to adapt. (And it is an obvious point, yet worth mentioning: evaluation also needs to generate lessons on the performance of those who fund. This is politically tricky and messy, as they commission evaluations and fund what is being evaluated. The main point holds, however: they seldom make themselves available to be held to account by those who matter most; rather to their respective treasury or finance ministries.) Ho hum!
3. It is doing this, listening to those on the ground, with an emphasis on assumptions rather than indicators, that generates the most revealing lessons. In other words, exploring the unknowns. Not doing so hampers success; it also encourages failure.
I've shot my bolt, yet hope some of the above is helpful.
Best wishes and thanks again,
Sansar Shrestha, Monitoring, Evaluation, Accountability and Learning, World Vision International Nepal
Here I share a lessons learned document produced under the Collective Learning Initiative for DEC-funded projects in Nepal.
It was a qualitative study on three different themes following the 2015 earthquake in Nepal, led by World Vision International in Nepal in collaboration with seven international agencies working on the earthquake response.
I anticipate this document can contribute further to lessons learned studies.
Tomás Netzahualcóyotl Rico-Mora, Executive Director, The Planning Lab - MX
💠 DEAR COLLEAGUES,
FROM my involvement in different development interventions, countries, contexts, I consider that proper-relevant-valuable-applicable LESSONS LEARNED:
🔹 CAN RESULT from Micro (project)-, Mezzo (region/sector)-, or Macro (economy-wide, policy)-level interventions;
🔹 ORIGINATE from IMPACT on the ground;
🔹 CONDENSE RESULTS aligned with a ‘Theory of Change’;
🔹 ARE 2nd Order REFLECTIONS out of relevant implementation EXPERIENCE;
🔹 RESPOND to Higher-Level Goals; and
🔹 CAN be EXTRAPOLATED, i.e., APPLIED elsewhere.
I THEREFORE would like to share a compact summary of Lessons Learned, compiled from various community development initiatives in Mexico and presented together to illustrate action of relevance beyond the response to COVID-19.
📕 The 10 ‘Lessons’ (Spanish original text):
[A.] Seek to respond to the primary question: What lessons can be derived from the pandemic for the development of regions and municipalities?
[B.] Are presented under the following thematic flags: 1. Social Governance. 2. Prevention, in General. 3. Operational Readiness. 4. Training and Commitment. 5. Targeting and Budget Flexibility. 6. Urgent and Effective Impact. 7. Local Management. 8. Vision and Focus. 9. Coordination. 10. Results, Impact and Risk Governance.
[C.] Can be found in the open book: ABC de la COVID-19. Prevención, Vigilancia y Atención de la Salud en las Comunidades Indígenas y Afromexicanas (Bertha Dimas Huacuz. INPI. 2020), accessible at the webpage of the National Institute of Indigenous Peoples (INPI-MX): https://lnkd.in/gpv3wgu (book cover) / https://lnkd.in/gG5wpVE (book text).
I hope this will be useful in your work, and for further discussion and learning.
[Morelia & the p’urhépecha community of Santa Fe de la Laguna, Michoacán, Mexico]
Latifa Rahmani, NGO Manager
As a contribution to this interesting topic, I share with you the evaluation document of a project I managed (in French). You can find the lessons learned from page 38 onwards.
I hope this will be useful for you.
[Translated from French]
Simon Makono, MEAL Manager, World Vision International
Thank you Monir, that is insightful. I would just like to know: do the good ones remain lessons, or are they best practices to be replicated? Regards, Simon
Wilm van Bekkum, Head of Monitoring and Evaluation, Self Help Africa
Lessons Learned is a very general statement.
It all starts with observations: you can only "learn lessons" if you observe them in the first place.
In principle, this implies that you need the right people involved to observe possible lessons to learn. Which observers are needed depends on your approaches, and it is essential to acknowledge that in advance.
Some observers are for example:
To name just three of …?
Essential here is not only naming the lessons learnt but also the operational systems/feedback mechanisms to ensure that the findings will be discussed and implemented when that is agreed.
Wilm van Bekkum
Thank you for this very interesting topic, which deserves special attention. In fact, almost all evaluations insist on lessons/lessons learned, but my view is that this step is always left to the discretion of the evaluators, their previous experience, and their competence in the theme and/or sector evaluated in relation to the results of the evaluation. I am attaching one of my examples of lessons learned (in French), from an evaluation on gender equality in access to the resources and services of the evaluated project. I hope it will contribute to the discussion.
Monir A Wahas, Founder, Inspiring Vision for Consulting
From a practical-experience perspective, the key to discovering lessons learned is "outliers": look at your data, find the outliers, discover how and why they are there, then report the good ones as lessons to replicate and the bad ones as pitfalls to avoid.
If that sounds good, let me know; I can explain more if needed.
Hope that helps!
Founder & Partner
Inspiring Vision for Consulting (IVC)
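[Editor's illustration] The outlier idea described above can be sketched in code. This is a minimal sketch of my own, not from the original post; the data, function name, and the interquartile-range (IQR) rule are all assumptions chosen for illustration:

```python
def find_outliers(values):
    """Flag values outside 1.5x the interquartile range (a common rule of thumb)."""
    s = sorted(values)
    n = len(s)
    q1 = s[n // 4]          # rough lower quartile
    q3 = s[(3 * n) // 4]    # rough upper quartile
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical yield data (tonnes/ha) from project sites:
yields = [2.1, 2.3, 2.0, 2.2, 5.8, 2.4, 0.4, 2.2]
print(find_outliers(yields))  # → [5.8, 0.4]
```

The flagged values (one unusually high, one unusually low) are then the starting points for the qualitative "how and why" questions the post suggests: the high outlier may hold a lesson to replicate, the low one a pitfall to avoid.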
It's been useful to read all your comments - thank you.
At ALNAP, we've been producing 'Lessons Papers' for many years as a synthesis of the findings and recommendations from multiple evaluations and studies conducted by different agencies, each focused on addressing different disaster types (or, in some cases, a specific disaster). We don't phrase these so much as lessons 'learned', but as lessons that should be learned by the broader humanitarian system, based on the learning emerging from different agencies, that could help to improve future responses. The lessons are typically focused on actions for operational staff, but also include lessons for policymakers and staff involved in strategy, to provide the supportive structure needed to carry out the necessary changes.
The papers have evolved over time in their substance but also in their methods. In the most recent papers, we draw on a literature review of findings and recommendations from evaluations and written studies, but also pair these with a review by a board of experts whom we intentionally draw from different locations and who represent a diversity of experiences, to comment on how relevant the lessons are and to help make them more useful in practice. In developing the initial lessons from the literature, the authors assess their frequency (in the findings and recommendations of existing documents) and the quality of the original research to determine which are more reliable, and then check their relevance with the reference group to help ensure they are useful for the broader sector.
We try to assess the methodology periodically to improve it, with one challenge being a balance between rigour and practicality. We’re also increasingly thinking about how to provide lessons for crises that are likely to be more frequent and rapidly evolving in the future when the traditional lessons paper method focuses on the past. In case of interest, you can see more about the methodology from our latest paper focused on climate-related disasters here where we had to consider some of those challenges: https://www.alnap.org/help-library/annex-alnap-lessons-paper-adapting-humanitarian-action-to-climate-change
In their framing, we try to make each lesson as specific and actionable as possible (while keeping in mind the need to adapt to contextual factors!) to help people put them into practice. The level of detail, however, depends on the underlying information and specificity of the original evaluations and studies, and on what the expert group offer from their experience. We recently conducted a review of all the Lessons Papers ALNAP has produced over the years and could see a difference between thematic areas where the sector had more specific recommendations and those where the lessons were vaguer, depending on the information available to be synthesised. The reflection paper is here if people are interested: https://www.alnap.org/help-library/lessons-of-lessons-a-window-into-the-evolution-of-the-humanitarian-sector There is also a shorter summary article: https://www.thenewhumanitarian.org/editors-take/2022/10/06/humanitarian-reforms-accountability-localisation
Abubakar Muhammad Moki, Commissioner, Policy Development and Capacity Building, Office of the President-Cabinet Secretariat
Sometimes lessons learned are taken to be the key observations, takeaways, discoveries or striking issues that one comes across in the MEAL process.
Luis Pemán, Consultant specialising in evaluation and training, AID Social Consultores
Ladies and Gentlemen.
In my experience, the good practices and lessons learned that are usually requested in evaluation reports are not part of the evaluation process as such. In my opinion, identifying good practices and lessons learnt is part of the systematisation of a project and/or programme, and it is there that, through a methodological process, lessons learned can be identified.
In the case of evaluation, it has always been difficult for me to determine what a lesson learned is, and reading the comments of my colleagues reiterates this difficulty (each one of us has a different concept).
Currently, I define lessons learned as elements linked to the implementation of an intervention that have served as learning (whether positive or for improvement) and that can and should be taken into account in future interventions, i.e. they are replicable and scalable.
Sincerely Luis Pemán
[translated from Spanish]
Catrina Perch, Evaluation Manager, WFP
I co-authored the publication "Lessons Learned from Evaluation: A Platform for Sharing Knowledge" many years ago (2007) (link here), but it may still be of use. It includes definitions and a discussion of the problems in formulating and applying lessons learned.
Emile Nounagnon Houngbo, Agricultural Economist, Associate Professor, Director of the School of Agribusiness and Agricultural Policy, National University of Agriculture
I am very interested in the subject under discussion: "lessons learned". I would like to clarify that the notion of lessons learned has a much more scientific and didactic connotation. It is not about findings or recommendations. Lessons learned are strong inferences that emerge as lessons that can be retained and applied in other contexts; they are drawn for application beyond the current study context.
Indeed, monitoring and evaluation is carried out in a given context. However, in-depth analysis of the results obtained and the facts observed makes it possible to draw lessons that go beyond this context: lessons that are like formulas applicable in other circumstances. Lessons learned are therefore meant to shape our knowledge, know-how and behaviour in other professional situations. They can be used in a decontextualised way, i.e. without necessarily referring to the circumstances in which they were generated. Recommendations should be formulated taking these lessons learned into account, and the same applies to the method of conducting future similar studies.
The strongest and most stable lessons learned are those that are methodologically rigorous and exist as accepted formulae. Examples are the Pythagorean theorem (a² + b² = c²), the law of diminishing returns (Turgot), and demand as a decreasing function of price (the Neoclassicals). Lessons learned are thus in line with the logic of knowledge accumulation as a continuous and cumulative process in the social sciences; they are miniature contributions to the improvement of scientific and technical knowledge in project monitoring and evaluation.
Dr Ir. Emile N. HOUNGBO, Senior Lecturer, Agroeconomist, Director of the School of Agribusiness and Agricultural Policy, National University of Agriculture of Porto-Novo.
[translated from French]
Miriam Chikwanda, Evaluator, SH
I have just completed a short course on writing evaluation reports, and one of the resources I received was a USAID document with tips on writing evaluation reports.
Below is USAID's description of what lessons learnt entail.
Norbert Tchouaffe Tchiadje, Senior Lecturer / Researcher, Pan-African Institute for Development Cameroon
Thanks for sharing your viewpoints; for me, the lessons learned are the outcomes and impacts of your results.
Warm regards .
Norbert TCHOUAFFE PhD
Author of Tchouaffé's theory of change (TToC)
Malika Bounfour, President, Association Ayur pour le Développement de la Femme Rurale
My tiny contribution: I formulate "a lesson learnt" as something new and valuable added to our knowledge and/or our way of doing things. It is usually different from our hypotheses/expectations. However, not all lessons learnt can become recommendations.
Once identified, a lesson should be analysed to decide whether it would improve the intervention and whether it can be translated into a recommendation for other projects or other places.
Mohammed Lardi, Consultant, UNFPA and UNICEF (retired)
For me, a lesson learned is any statement that captures what contributed to successful implementation and should be recommended, or what led to failure and should be avoided.
Paul L. Mendy, Monitoring and Evaluation Specialist, IFAD/AfDB/IsDB co-funded Gambia National Agricultural Land and Water Management Development Project - Nema Chosso
On one of my former projects we developed a publication called the Compendium, a consolidation of lessons learned from seven years of implementation. This task was done in preparation for the Project Completion Report, which subsequently informed the design of a new/follow-up project.
The process of documenting lessons learned in project management is a systematic one and involves consultations with a wide range of stakeholders and beneficiaries alike, to identify and generate the evidence that substantiates the selected experiences as lessons learned.
Techniques for processing lessons learned could be one or a mix of the following:
1. Outcome Harvesting (OH),
2. After-Action Reviews (AAR) and/or
3. Perspective Analysis (PA)
These are techniques I have experience in applying, and I have found them effective.
Tomás Netzahualcóyotl Rico-Mora, Executive Director, The Planning Lab - MX
INDEED. MOST RELEVANT QUESTION!
IN MANY CASES, the so-called 'Lessons Learned' are simply a listing of statements collected to comply with the relevant section of an evaluation report.
A true 'LESSON' should be drawn from actual, relevant implementation EXPERIENCE and condense real facts about RESULTS, aligned with the outlined 'Theory of Change', whilst pointing to measurable, verifiable IMPACT on the ground, and thus to effecting CHANGE in people's communities and livelihoods.
T. N. Rico-Mora
Purhépecha community of Santa Fe de la Laguna, MX
Baraka Leonard Mfilinge, MEAL Specialist Consultant
By accurately documenting the lessons learned during your project lifecycle, you can learn from your mistakes and share those findings with other project managers.
There are five steps of lessons learned: Identify, Document, Analyze, Store, and Retrieve
What are lessons learned in project management? You learn something new on every project, but a lessons learned session ensures you capture and codify that information to share it with other teams. When you conduct lessons learned and create a lessons learned report, you’re producing a document the entire project team can use to improve future projects.
Documented lessons learned can be passed on to other project managers running similar initiatives or used by team members who are getting started on similar projects. Sharing lessons learned between teams is a great way to prevent the same mistakes from happening. Not only can you learn from your project mistakes—with a lessons learned report, everyone else can learn from them, too.
You can capture lessons learned at any point during the project timeline. In fact, depending on the complexity of the project, you may want to conduct a lessons learned session at the end of each project management phase, in order to capture information when it’s still fresh. That way, you can evaluate what went well, what went wrong, and what you can learn from it.
The important thing is to capture the information and share it with everyone. No matter what you call it, aim to conduct at least one lessons learned session per project.
5 steps to conduct a lessons learned session
If you’re just getting started with lessons learned, use these five steps to ensure you’re accurately capturing, documenting, and sharing the project’s information in a way that everyone can access.
1. Identify: This is where you identify lessons learned from the project to document in step two. The Identify phase is made up of three steps:
Step 1: Send a lessons learned survey. Immediately after the project is completed, or at the end of a significant project phase for larger initiatives, send a lessons learned survey to every project team member. This way, you're capturing feedback while it's still fresh in everyone's mind. Then, aggregate that information to get a general picture of what everyone learned from the project.
The lessons learned survey is one of the most important parts of the lessons learned process. Below, we have a template you can use. This survey is typically general to any project, though you can adapt the questions to suit your project’s needs.
Step 2: Schedule the lessons learned session. Before the session, select a facilitator. Ideally, find a facilitator who isn't the project manager, so team members feel comfortable speaking freely. Ask the team lead or an adjacent team member to run the session.
After scheduling the lessons learned session, the facilitator shares any pre-reading information to make sure project team members are on this same page. This could include re-sharing project planning documentation like the project plan or project objectives. Depending on the complexity of the project, you could also share a timeline of the project and accomplishments.
Step 3: Conduct the lessons learned session. In addition to the lessons learned survey, host a live brainstorming session for all team members. This is a chance for team members to expand upon their lessons learned. In particular, there are three main questions to ask during the brainstorming session:
What went right?
What went wrong?
What could be improved?
2. Document: The main point of running a lessons learned session is to share these lessons with the entire team. Plan to create a detailed lessons learned report with all of the project information and discussion notes, as well as an executive summary of the lessons learned for relevant project stakeholders to review.
Format of a lessons learned report:
- Executive summary
- Summary of findings
- Lessons learned survey(s)
- Recommendations in detail
3. Analyze: Analyze and apply the lessons learned so that other teams and future projects can benefit. This is especially relevant if you're conducting a lessons learned session mid-project: analyze the information from the lessons learned survey in order to improve the project for the upcoming phases. Alternatively, if you're running a lessons learned session at the end of a project, use the Analyze phase to glean insights and opportunities before beginning your next project.
4. Store: Store the lessons learned in a central repository that everyone can access, like a project management tool. With a central source of truth, project leads can access shared information to best prepare for their projects.
5. Retrieve: If you're running a similar project, search for a lessons learned report from a past project to avoid repeating the same mistakes. These reports should be kept in a central source of truth that all project managers can review before beginning the project planning process.
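[Editor's illustration] The Store and Retrieve steps above can be sketched as a tiny repository. This is a hypothetical sketch, not part of the original post; the class names, fields, and tags are all my own assumptions for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Lesson:
    """One documented lesson, with tags for later retrieval."""
    project: str
    summary: str
    tags: list = field(default_factory=list)

class LessonRepository:
    """A minimal central store: add lessons, retrieve them by tag."""
    def __init__(self):
        self._lessons = []

    def store(self, lesson):
        # Step 4 (Store): keep every lesson in one shared place.
        self._lessons.append(lesson)

    def retrieve(self, tag):
        # Step 5 (Retrieve): look up past lessons relevant to a new project.
        return [l for l in self._lessons if tag in l.tags]

repo = LessonRepository()
repo.store(Lesson("Irrigation pilot", "Engage water user groups early", ["stakeholders"]))
repo.store(Lesson("Seed programme", "Budget flexibility sped up delivery", ["budget"]))
print([l.summary for l in repo.retrieve("stakeholders")])  # → ['Engage water user groups early']
```

In practice the "repository" would be a shared document store or project management tool rather than code, but the design point is the same: lessons are only retrievable if they are stored centrally and indexed by something searchable, such as theme tags.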
Pedronel Lobaton Bedon, Agriculture Consultant, Independent
Dear Emilia Bretan,
A cordial greeting. You asked how to identify and consolidate lessons learned from our experience. I believe we have all faced this task, directly or indirectly, carrying a shared understanding or definition of what a lesson learned is. Regardless of whether they are presented to us as lessons that sound like findings, conclusions or recommendations, or belong somewhere else, the interesting thing is to create a direction; in other words, to organise the subject according to the pillars or foundations that show us the result of an experience in any sector (environmental, technical, socioeconomic or productive), bring it into the subject we are working on, and apply all the guidelines and strategies presented to us, so as to finally complete this puzzle that generates knowledge and learning in the community.
Now, as for resources that lay out a step-by-step approach to identifying lessons learned useful for our practice, I can cite an experience of lessons learned, best practices and case study results in coexistence and knowledge. Within indigenous, settler and city communities, a workshop with the target population can be used for study or recommendation based on experience, to be replicated with professionals interested in the event, such as Local Production Systems and Technologies (STLP). This event can be applied in the environmental, technical, socioeconomic or productive sectors, because it addresses how complex the topic is; this whole structure, or navigation chart, ultimately brings everything together and concludes in a lesson learned, which is the objective of the task. It is a workshop or event that takes two to three days, with all its arguments and strategies, for any chosen target population. It only remains for me to thank you for your interest in reading this simple contribution to the topic. Sincerely,
Pedronel Lobaton Bedon