National Policy on Evaluation

Shifting focus to accountability

It is not often that Sri Lanka asks itself: where are we really going? Are we doing the right thing? Is what we are doing working? Do we need to change the system? But these were the very questions recently appointed Deputy Speaker Ananda Kumarasiri faced when he happened to glance at yet another report on his own district, Moneragala.

Having represented the district for over 30 years, he said it has seen many programmes, aid agencies and NGOs come and go, and yet, year after year, Moneragala has remained infamous for sitting at the bottom of the poverty index.

“The UNDP had prepared a five-year programme and when I looked at the draft report, I was surprised by some of the figures given by various departments. I felt embarrassed looking at the report. The targets given by certain government departments were unachievable. They had simply compiled some numbers and given it to the UNDP. Who would monitor them thereafter? The report will sit on some shelf again,” he said.

In the 1980s, former President J.R. Jayewardene invited countries to invest in any district of their choosing. Norway selected Moneragala.

“They had discussions with the grassroots and other groups and Norway asked us what they could do for us. That brought in a lot of development - schools, roads, water services. Thereafter, NGOs and INGOs also spent funds on the district. If you speak to the District Secretariat in Moneragala, you will find out that a huge amount of money has been pumped into this district over the years, but where has it all gone? There has been no proper monitoring. What is the ultimate benefit to the people?”

Today, Kumarasiri heads the Sri Lanka Parliamentarians’ Forum for Development Evaluation, which promotes the adoption of the National Policy on Evaluation. The policy is expected to stress the need for all government projects to go through a cycle of monitoring and evaluation before, during and after implementation. Earlier this month, the policy received Cabinet approval, and he hoped to have it passed in Parliament in time for the International Conference on Evaluation hosted by the Forum on September 17. Over 100 Parliamentarians from around the world are expected to arrive in the country for the event.

The Deputy Speaker has also received the Speaker’s approval to set up an Evaluation and Monitoring Unit in Parliament. “Once an annual report comes in, they go through it and we get an executive summary. So we know what is happening at the beginning and end of the year and we know who the beneficiaries of the programmes are.”

He and 16 other MPs have also started promoting the concept of evaluation among public officials in their respective districts, to increase awareness of evaluation and the need to implement it.

“It is about the will to get value for money,” observed Kumarasiri, as he stressed that once implemented, every cent of public money spent would be accounted for.

Finding the needle in the haystack

When an email was sent out to all 225 MPs in Parliament asking them to support the move to push for a National Policy on Evaluation, UNP MP Mayantha Dissanayake said only 30 showed up for the meeting. Today, only 14 are actively lobbying for it, and 10 of them are from the government. Ninety-five percent of them are young MPs.

“It is the Opposition which should be pushing this really, because it will benefit them the most to know what the government is doing with public money. But they are not interested.”

After much consultation, the National Policy on Evaluation was introduced to Parliament in 2016 by UNP MP Buddhika Pathirana as a private member’s bill and since then, the MPs have been pushing for it to be turned into legislation. This year they had a breakthrough.

Dissanayake believes that this legislation would be as important as the Right to Information Act and would change the way government worked. It would also bring in a system of ‘accountability’ and ‘scientific reasoning’ to decisions and policies adopted by the government.

“When Cabinet ministers change and get new portfolios, they do not look at overall government policies, they just implement whatever they perceive is advantageous to them. That should not be happening. When these things happen, people get affected,” he said.

Dissanayake believed that introducing evaluation and monitoring to all government projects would also keep politicians and officials in check. “People always ask where the money went, but until now we didn’t have a proper mechanism to evaluate what the money has been spent on and whether it was worth it.”

Dissanayake is the fifth politician in his family and represents the Kandy District. If there is one project MPs in Kandy have been advocating for, it is the ‘Central Expressway’. It was an election promise made by his father, Gamini Dissanayake, in his Presidential Election manifesto in 1994, but since then, successive governments have done little to implement the project.

“For over 30 years we spoke of this. But this government is very keen. The difference between the last government and the present regime is that the last spent billions on projects that were politically motivated. For example, Hambantota. It is a poor district, but to develop an airport and port without supporting infrastructure which would only bring benefits in the next 25 years is a waste of money. We must look at and evaluate projects that would bring benefits to the country in the next 5-10 year period. If you are looking at the next 30 years, it is a waste of money.”

Speaking of political decisions, he stressed that evaluation would also limit the tendency of changing regimes to shift development to their own constituencies with no scientific logic for doing so.

“Hypothetically, say a politician from Kegalle becomes the President. Every economic project and development would suddenly shift to Kegalle. By that time, if this policy is entrenched, yes, there will be a lot of development in Kegalle, but everything will be evaluated and that would hopefully prevent someone from opening a port in landlocked Kegalle.”

More importantly, MPs, he said, would have time to actually find the information they need to question bureaucrats and government in Parliament.

MPs, now part of several Standing Committees in Parliament, are overwhelmed by the information sent to them by ministries, he said. More often than not, vital information is buried in a stack of files.

“At Budgeting times, it is very difficult to get information from ministries, especially the Finance Ministry. They flush you with files which reach three to four feet high. When you ask the official, he says it is in such and such a book. You would be lucky to have 15 minutes of reading time every day. But if there is a summary report which is concise, the MP can get all the information he needs. It can make politicians smarter and more able to do their job.”

UNP Kegalle District MP Sandith Samarasinghe, meanwhile, said the Evaluation Policy would also allow the effectiveness of legislation to be questioned. “For example, we can evaluate our policies in the education sector, look at unemployment, etc,” he said.

Further, the policy would play an important role in ensuring that the government is on the right track when it comes to the 2030 Sustainable Development Goals (SDGs). Both Dissanayake and Samarasinghe reiterated that evaluation would help the government with its programmes on the SDGs and alert it when it is not moving towards the desired goals.

Managing for results

This is not the first time the government has taken up ‘Evaluation’. The former government too had promoted it in a bid to ensure that government projects were delivering results to the people.

Velayuthan Sivagnanasothy, Director General of the Department of Foreign Aid and Budget Monitoring in the Plan Implementation Ministry under the previous regime, and later National Integration Secretary under President Sirisena, explained in a paper written in October 2007 that the government at the time was moving towards Managing for Development Results (MfDR), thus shifting focus from ‘inputs, activities and outputs’ to ‘accountability for results’.

“A good Monitoring and Evaluation (M&E) system should go beyond institutional boundaries to cover national, sectoral, programme and project level to ensure results orientation in government,” he stated in the paper.

At the time, the Plan Implementation Ministry took MfDR on board and became the “National Focal Point for Monitoring and Evaluation of all government development projects and programmes to ensure achievement of results and development effectiveness.”

“Today, what counts is not so much how many clinics have been built, but whether citizens’ health has improved. Not how many schools have been constructed, but how many girls and boys are better educated,” declared an ambitious government plan.

In the 1990s, with the technical support of the Asian Development Bank (ADB), the Post Evaluation System was strengthened in the Plan Implementation Ministry. In the late 1990s, the UNDP provided substantial technical support to strengthen the Results-Based Monitoring and Evaluation System (RBME) in Sri Lanka. This enabled government officials at the national and sub-national levels to understand and recognise the importance of results-focused monitoring.

Many positive factors, such as political will, an overarching policy, coordination of information collection, the flow of information from line ministries and projects to the MPI/DFABM, the strengthening of the electronic Information Management System in the National Operations Room (NOR) of the Plan Implementation Ministry and the demand for information for decision making, also contributed to a positive enabling environment. However, concerns such as limited capacity in government agencies, the large number of ministries and the resultant coordination issues are some of the challenges that need to be addressed, Sivagnanasothy wrote further.

In addition, between 2006 and 2007, a comprehensive Performance Measurement System was piloted with four key line ministries (Education, Health, Agriculture and Highways). A range of activities, such as awareness programmes, advocacy and sensitisation for policymakers, and training programmes, were conducted with the technical support of the UNDP and ADB. MfDR was thus made operational in 35 line ministries.

Sivagnanasothy noted that incentives to reward success were necessary to help the government strengthen performance accountability and foster a continuous learning culture.

The implementation phase, however, was not smooth. Sivagnanasothy found that dissemination of M&E findings was inadequate. M&E institutions and planning institutions functioned in isolation and did not have an effective, formalised feedback arrangement to integrate lessons into the planning and design of new projects. Donors and partner countries continued to be disbursement-oriented and used their own donor systems rather than the country’s systems to maintain visibility and attribution. “In addition, there was a shortage of professionals, multiple results frameworks, too many indicators, lack of aid predictability and weak statistical capacity.”

Further, it was also difficult to get consensus to set up Key Performance Indicators (KPIs) for government departments.

“Specifying and agreeing on expected results is not easy. The results chain is not always logical as expected. Indicators are missing for some results areas. Targets and baselines are not given. Setting achievable targets is not possible in the absence of a baseline. The greatest problem associated with performance management is unrealistic expectations.”

Sivagnanasothy, who is also helping the incumbent government bring about the National Evaluation Policy, this time with the support of USAID, concluded in his paper: “It is necessary to look at the balance between learning and accountability. While independent evaluation is important for ensuring objectivity, too much emphasis on an accountability-focused, donor-driven independent evaluation function can be a potential constraint for lessons learning and feedback.”

SLEVA

Regardless of the government or the funding agency, the Sri Lanka Evaluation Association (SLEVA), set up in 1999, has worked for the last 18 years to entrench evaluation and monitoring in the country’s development process.

Former SLEVA President and one of its founding members, Dr. Soma de Silva, was very pleased with their journey thus far and said that no country in the world had gone as far with evaluation associations in such a short period of time as Sri Lanka has. Evaluation, which started to be promoted within UN agencies in the 1990s, was brought into the government process when Dr. de Silva and 35 interested professionals in various fields met to form SLEVA. The country, having gone through many management styles, from command-driven to objective-oriented, was now being asked to look at whether ‘management brought results’.

“Results meant positive changes for people, changes in their standard of living. If there is development there needs to be improvement in access to health, education, quality of services, etc.,” she said.

SLEVA, which also started the first international conference on evaluation in the country, helped set up a postgraduate diploma in Evaluation at the University of Sri Jayewardenepura. The goal is to eventually make it a fully-fledged department. She stressed that this would help address the lack of professionals in the field.

“Government officers are now working on a framework for implementation. I believe it has to be done carefully, giving sufficient flexibility while requiring adherence to certain standards,” said Dr. de Silva.

You cannot be prescriptive, she explained, as no one can see the future.

In contrast to the Parliamentarians, Dr. de Silva said that instead of having every project evaluated, it was more practical to prioritise the most important programmes in every sector and evaluate them.

“I would be happy if we have at least one good evaluation done for every sector every year. The number of evaluations should not be the priority.”

For this, the entire government mechanism would have to have an evaluation plan to continually make development processes efficient and sustainable and to reach the marginalised. She stressed that it should not be a fault-finding mission, but one driven by a ‘concern for the people and the desire to deliver them better results’.

“If you take higher education, rather than continuing with what has happened, can we stop and think? What is the quality of the graduates we are producing? What are we achieving? Their employability? Their ability to create a society to live in? Then look at the programmes for that and prioritise the most important ones. Is this teaching working? Is it giving us quality graduates? Is it giving us the necessary quantities, or are we overproducing? Under-producing in certain sectors? Those questions should be brought out by the experts in those fields and it should be a consultative process with the people who are benefitting from the system and those who are not. Likewise, one must look at the most important questions in the education and healthcare systems and so on.”

The issue, at times, she said was that many departments had well established procedures but few strong heads of departments to look at the system and say, ‘Is what we are doing necessary? What is our contribution? Can we do an evaluation and see if we can do better?’

“At times, we do so many things, everybody is busy, but are they leading to results? Are they changing lives? Can we check that?” More often than not, lack of money was not the issue in government; the question was how it was being utilised.

Further, for evaluation to be most effective, there needs to be a campaign to educate the people and officials to use the information that is found. If the people are to question authority, they must first understand the problem, said Dr. de Silva.

Learning from mistakes

The major complaint about comprehensive evaluations of systems, however, was that they took too long and the reports were often too technical to understand. Dr. de Silva recommended that these reports be disseminated in a format accessible to officials and to the media, so that the people understood the results better. Further, she said officials could learn while conducting the evaluation process through consultative discussions with various stakeholders. “Results can be used from the word go.”

At the end of the day, however, Dr. de Silva noted that for the government to evaluate its policies, it was important to create an enabling environment where it is acceptable for officials and politicians to admit when policies have not worked and thereafter adjust them for the better.

“Why can’t we get up in annual reviews, donor meetings or Parliament and say, ‘Yes we did this, but it was a mistake’? We can do it without having to take personal responsibility. All this work is not personal, it is collective and it is a system approach. We are continuously trying to improve the system. But we must find out what has gone wrong first.”

Sometimes if you put the basics right, the system will work, she said.
