e-Government 2.0 – are we ready for the next chapter?

Arne Ansper

Chief Technology Officer

“We have created a large number of e-government applications. Many of them are mission-critical. And they work. But the environment around the applications changes and systems must change as well in order to remain functional and relevant.”

Arne Ansper


Image credit: Sportsmen in France form a human pyramid, 1919. (Photo: M. Rol/Ullstein Bild/Getty Images)

We have come a long way with e-governance and secure interoperability solutions. Many governmental business processes have been automated, and e-services are offered to citizens. These applications have been in operation for over two decades, and with rapidly advancing technology they need to start adapting to the novelties of the tech world and to the societal expectations that go hand in hand with those developments. But before we rush forward to implement the new things, we need to reimagine how we build governmental systems. The new era, e-government 2.0, has to begin with redefining the very foundation of these solutions: rather than trying to offer many revolutionary new e-services to citizens, we should revolutionize the way we build e-government systems.

We have created a large number of e-government applications. Many of them are mission-critical. And they work. But the environment around the applications changes, and the systems must change as well in order to remain functional and relevant. Software requires maintenance just like hardware does; sometimes it even needs to be replaced. Many of the applications that we were once proud of are in dire need of an overhaul. Technology advances at a rapid pace and users expect governments to take advantage of it: they want better UI, useful AI, proactive services, better privacy, resiliency against cyberattacks, and so on. This means that all applications must change constantly – applications are never “ready and completed”, they are always a “work in progress”.

Wasteful lack of reuse

It is possible to create a state-of-the-art application, but it requires a lot of work by the best architects and programmers, and it will still become outdated within a couple of years. Moreover, we face a growing shortage of manpower to build and maintain applications as they are. For the same reason, we cannot afford to treat every application separately – approaching each as if it were unique and developing it from the ground up, using general-purpose tools and languages, to the same high standards.
It is a question of centralization versus decentralization. With X-Road, for example, Cybernetica hit the sweet spot between the two: a fully decentralized peer-to-peer system with end-to-end encryption at the transaction level, combined with centralized coordination and policing. A similar approach has to be taken with the development of government applications: we must establish an optimal balance between centralization and decentralization. Right now this field is too decentralized, and resources are wasted on creating and maintaining systems that are quite similar to one another.

Let’s have a look at the Estonian ecosystem of e-government services. From a functional viewpoint, the majority of government business applications are very similar to each other: they are small registries with a web-based user interface for officials, citizens and enterprises, and with APIs for integration. The workflows of the registries resemble one another and are defined by law. Integration needs are similar as well: the nationwide e-ID for authentication, access rights management, a common reference data management system, and other registries via their APIs. They also share many similar requirements: security, hosting, user interface, data protection and so on.

Governments have stated common non-functional requirements (NFRs) that apply to all governmental information systems. These requirements act as an input to an architecture design process that is repeated over and over for each system.
Architecture design is a costly process. Given the very detailed nature of the NFRs, the resulting architectures are similar to each other, yet still different. Because of that, such architectures and the resulting code bases rarely see any significant reuse. Even though the NFRs are common to all government agencies, the design process is carried out separately in each of them. This is wasteful and demotivating from the developers’ standpoint. Smart people do not want to solve the same problem over and over – and that is exactly what is happening now. Good solutions must be reused at all levels, so that people can concentrate on solving new problems and restore the pace at which e-government advances. If we want to move on to the next era of e-governance, we have to reimagine the way we develop our systems.

Reuse done right

Governments should invest in creating a platform and tools that allow run-of-the-mill applications to be created quickly, without programming. This is known as the “no-code/low-code” approach: tools are provided to describe the functionality of the application in a declarative manner, without programming, and this formal description is then executed by the platform runtime. Most of the work during the creation of an application is done by the analyst, not by the programmer: designing the data model, data entry forms, reports, services, business rules and so on. Almost all of this work relates strictly to the functionality of the application. The non-functional requirements are fulfilled by the platform runtime, which conforms to the latest NFRs, supports all the integrations and so forth. Given the complexity of the non-functional requirements and the relative simplicity of the functional requirements of the majority of governmental systems, we can claim that 80% of the work needed to create a new application will be reused. The remaining 20% can be done mostly by analysts and testers.
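To make the idea concrete, here is a minimal sketch of what such a declarative application definition and a generic platform runtime could look like. All names here (the registry, its fields, the runtime class) are hypothetical illustrations, not part of any existing government platform:

```python
# Hypothetical declarative definition of a small registry application.
# The analyst writes only this description; no imperative code is needed.
PERMIT_REGISTRY = {
    "name": "building_permits",
    "fields": {
        "permit_id": {"type": str, "required": True},
        "applicant": {"type": str, "required": True},
        "area_m2":   {"type": float, "required": False},
    },
}

class RegistryRuntime:
    """A generic platform runtime that can execute any registry definition.
    Cross-cutting concerns (authentication, audit logging, hosting) would
    live here once, shared by every application built on the platform."""

    def __init__(self, definition):
        self.definition = definition
        self.records = []

    def submit(self, record):
        """Validate a record against the declaration and store it."""
        for name, spec in self.definition["fields"].items():
            if spec["required"] and name not in record:
                raise ValueError(f"missing required field: {name}")
            if name in record and not isinstance(record[name], spec["type"]):
                raise TypeError(f"wrong type for field: {name}")
        self.records.append(record)
        return len(self.records)

runtime = RegistryRuntime(PERMIT_REGISTRY)
runtime.submit({"permit_id": "P-1", "applicant": "Acme OÜ", "area_m2": 120.5})
```

The point of the design is that the runtime is written, audited and upgraded once, while each new registry is just another declaration handed to it.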

Of course, there are already many “low-code/no-code” tools available. The biggest issue is that, to be usable for creating e-government applications, they must fully conform to the government’s NFRs. And of course, no such tool or platform is readily available, because so far there has been no market for one. Attempts to use existing tools to create governmental applications will fail – functionally the applications might be fine, but they will not fit into the overall e-government ecosystem.

There is a way out – the government should create its own platform and toolchain that conforms to all the NFRs. Most importantly, it should maintain, enhance and update it, to make sure that all the applications that depend on it are always secure, offer a great user interface and are well integrated with the rest of the e-government ecosystem.

We at Cybernetica have tried this approach with good results. Cybernetica has created the majority of the Estonian Tax and Customs Board’s (ETCB) customs declaration processing systems. ETCB had very detailed NFRs and a very detailed visual stylebook that we had to follow. From the very beginning we decided to create higher-level languages and tools that give more power to analysts for defining the data models, user interface views, state machines, business rules and so on – all in machine-processable form. Over the course of several projects, we refactored the common functionality into a reusable framework that greatly simplified the development of new applications. Making common changes across many projects became an easy and efficient task. Tens of applications were created on the platform, and the reuse and strong emphasis on domain-specific languages allowed us to reduce the cost of creating and maintaining the applications, shorten the time required to create and change them (which is especially important for security-related changes) and provide a more uniform experience to users.
It is a lot of work, and it does not make sense for a single authority to create such a tool for its own use only. ETCB was an exception because its system is so large. But this was a grassroots movement by engineers to make their life easier and more interesting, not a conscious strategic decision by ETCB. As a result, the platform eventually became obsolete when the inflow of new projects decreased and the platform was no longer updated at the required pace.
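The ETCB tooling itself is not public, but the flavour of an analyst-facing, machine-processable workflow description can be illustrated with a sketch. The workflow states, events and the engine below are invented for illustration only; they do not reproduce any actual ETCB language:

```python
# Hypothetical declarative state machine for a declaration workflow,
# authored by an analyst and executed by a shared platform engine.
DECLARATION_WORKFLOW = {
    "initial": "draft",
    "transitions": {
        ("draft", "submit"):     "submitted",
        ("submitted", "accept"): "accepted",
        ("submitted", "reject"): "rejected",
        ("rejected", "amend"):   "draft",
    },
}

class WorkflowEngine:
    """Generic engine: one implementation serves every workflow definition,
    so fixing or extending it improves all applications at once."""

    def __init__(self, definition):
        self.definition = definition
        self.state = definition["initial"]

    def fire(self, event):
        """Apply an event; refuse transitions the declaration does not allow."""
        key = (self.state, event)
        if key not in self.definition["transitions"]:
            raise ValueError(f"event {event!r} not allowed in state {self.state!r}")
        self.state = self.definition["transitions"][key]
        return self.state
```

Because the workflow is data rather than code, it can be validated, visualized and changed by analysts without a programmer in the loop.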

The tool would be most useful if it were available to all governmental agencies. Of course, for every single application a decision must be made: should it be built using the common tool and framework, or developed from scratch using low-level tools and languages?

It is in the government’s interest to ensure widespread usage of the tool, to maximize the usefulness of the investment. The platform must therefore be capable, and must evolve to allow the creation of more complex and more powerful applications. It would also be natural to open-source the platform and tools, so that governmental authorities do not have to spend on licenses.
This is not a new approach to solving governmental IT issues. We have done it before, very successfully: X-Road solved the problem of inter-organizational information exchange. It is centrally developed and supported, and it is used by all governmental agencies without modifications. On the one hand, the reuse has saved a lot of money. On the other hand, a common, high-quality, highly secure system has protected us from security incidents and ensured the smooth operation of the e-government.
For maintenance, the emphasis must be on compatibility – it should be possible to upgrade the platform without touching the application declarations. Whenever there is a security fix, it can be rolled out immediately without any changes to the business logic of the applications. The same goes for changes to the common visual style or the introduction of new authentication methods – everything can be rolled out without touching the business logic and data of the applications.
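This separation can be sketched in a few lines. The runtime versions and the sanitization fix below are purely illustrative assumptions; the point is only that the application declaration stays byte-for-byte identical while the platform underneath it is patched:

```python
# Hypothetical illustration: one fixed application declaration,
# two versions of the platform runtime underneath it.
APP_DECLARATION = {"fields": {"name": {"type": str, "required": True}}}

class RuntimeV1:
    def __init__(self, decl):
        self.decl = decl

    def accept(self, record):
        """Validate a record against the unchanged declaration."""
        for field, spec in self.decl["fields"].items():
            if spec["required"] and field not in record:
                raise ValueError(f"missing required field: {field}")
        return record

class RuntimeV2(RuntimeV1):
    """A security patch rolled out platform-wide: strip non-printable
    characters from all string inputs. No application declaration changes."""

    def accept(self, record):
        cleaned = {
            k: ("".join(ch for ch in v if ch.isprintable())
                if isinstance(v, str) else v)
            for k, v in record.items()
        }
        return super().accept(cleaned)
```

Upgrading an application from RuntimeV1 to RuntimeV2 requires redeploying only the runtime; every declaration keeps working, which is exactly the compatibility guarantee the platform must maintain.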


To summarize – the architectural and development work currently done in parallel in many governmental agencies should be centralized to create and maintain a common platform. This will be much more interesting and rewarding for developers, helping to retain them. What would be new is the development of rapid application development tools and new languages for analysts – these are the interesting problems to solve. Analysts would be given far more capable tools that help them do more: not writing an informal specification in natural language, but a formal, executable specification. Of course, specialized training, support and quick fixes for their problems must be provided by the organization developing the platform.

We believe that this approach would help raise the overall level of maturity of e-government applications, making them more secure, more convenient to use and more sustainable.