What is it, and why is it important for the entire process of working with welfare technology?
This step is of paramount importance in order to ensure that any technical solution you might consider implementing actually fits within your organization. The evaluation model should ensure that the technology meets the needs you have identified and can deliver on the parameters you set for it – be that, for example, improved service quality, an enhanced work environment, higher cost efficiency or all of the above.
The evaluation of the solution is your chance to gather information about the interaction between the proposed technology and your organization, the service you deliver and, of course, between the technology and your staff and end users. A good evaluation model can help you weed out potentially failing products and projects and is the foundation upon which later implementation is often built. This means validity is of paramount importance.
The evaluation model tests whether the products you have identified in the market screening actually meet your needs in a satisfactory way, and gives you a platform to build on for future implementation and effect monitoring.
- To increase effectiveness and ensure conformity in the evaluation process, it is advisable to use a set evaluation model (for example one of the models mentioned in this document). This ensures a systematic approach and leads to increased clarity in both the decision-making process and the prioritization process.
- It is advisable to do a "potential assessment" (assessing the potential benefit of a given technology) before spending a lot of resources on an entire evaluation process. It is important, however, that this assessment of potential is built on the same framework as your set evaluation model in order to ensure continuity and accuracy in the assessment. If it is not built on the same framework, you increase the risk that the initial "potential assessment" turns out to be way off, and that your decision on whether or not to do the evaluation is based on inaccurate information.
- When assessing the potential of a given technology remember to connect this with your identified needs as well as your strategy and vision for working with welfare technology.
- An evaluation process is demanding in terms of both time and competences. Please ensure that you allocate the required resources and, if possible, coordinate with other existing activities.
- It is wise to start any evaluation process by determining the goals for the process. Always consider including the end user in determining the goals. The goals can be adjusted later as you go along, but setting them from the start helps give the process direction and creates transparency. Similarly, it is often beneficial to include the practical level (front-line staff) and/or end users when determining which data-collection methods to use (questionnaires, interviews, etc.), as involvement helps with ownership and commitment.
Guidelines based on experience
- How your municipality is organized will play an important role in how the technology performs. Be sure to include in the evaluation model an analysis of whether the given technology performs well within your organization. When doing this analysis, consider whether organizational changes would enhance performance.
- In order to ensure local ownership and commitment, it is very important that the evaluation goals are relevant to everyday life. This ensures that the evaluation process becomes part of the normal daily routine, comparable to other work activities – and not something extra on the side.
- Remember that the future success of the technology depends on the validity of your evaluation. So, when designing the evaluation process, make sure the technology is tested for an adequate amount of time and with a sufficient number of test subjects. Cutting corners may result in a cheaper evaluation process, but significantly increases the chance of failure when implementing the final solution.
- Ensure that the test group is representative of the intended target group. Furthermore, ensure the test group is fully informed, including about what will happen after the testing period ends.
- It is essential to establish baseline measurements to serve as the foundation upon which the evaluation is built. A baseline is required for every parameter you wish to evaluate.
- Check with your network or national authorities to see whether others have already evaluated this specific technology, or something similar. Even if an existing evaluation is not directly applicable to you, it can serve as valuable inspiration and save you time and resources.
- It is advisable that your evaluation, amongst other parameters, includes an economic perspective – a cost-benefit analysis, if you will. Furthermore, if you wish to realize any financial savings the technology may bring, remember to include this in your evaluation model. This is important because "harvesting" any economic benefit often requires a closer look at your organization and the way you finance services internally.
- Remember that your evaluation report should also include a model for how the given technology will be supported and distributed after a potential implementation.
- Remember that the long-term success of a technology may depend on the competences of the staff. Any additional training of staff is naturally an important part of any evaluation model, and will affect the cost side of the final business case.
What has having an evaluation model meant to the CONNECT participants?
It has given Odense a tool that is used in all departments, ensuring that everyone is using similar methodology and parameters. It provides us with a systematic approach and also raises awareness about the complexities involved in evaluating new technology. Finally, the evaluation model provides a solid platform for decision-makers.
Methods and tools
Odense uses the Danish evaluation model called VTV, developed by the Danish Technological Institute, together with a separate business case model. Both can be found on our webpage www.nordicwelfare.org/connect.
The VTV offers a good overview and covers the most important issues in a product evaluation. It also provides a visual perspective on the performance of the technology and serves as a good platform for decision-making. In Odense it is combined with a more comprehensive business case model, which takes a more in-depth look at the financial aspects of the technology and its potential impact. You can read more about the VTV on the Danish Technological Institute's webpage, or go to the CONNECT webpage to find Odense's guidance documents on how to do a VTV.
Gothenburg uses the balanced scorecard model with four perspectives:
- Users/relatives, employees, organization and finances.
- Different parameters can be positioned on the second axis, such as functionality, safety, quality and cost.
Gothenburg also recommends the following tools:
- OPIguide.dk contains both business case models and innovation evaluation models that can serve as useful tools or as inspiration.
- Samveis.no contains both advice and tools to guide you through the evaluation process.
- Sitra in Finland has a handbook on social and health care management that South Karelia finds helpful, both with regards to evaluation and other management issues.
Examples of evaluation models from the CONNECT participants
In terms of methodology, the examples for this particular step mirror the methods and tools section. The CONNECT municipalities are pleased to share their evaluation reports on given technologies if you contact them directly.
As an example, at Center for Velfærdsteknologi you can see which technologies have been tested and implemented in Odense Municipality.
Here you can also find Odense's guidance documents on how to use the VTV methodology (mentioned above) and how they do a thorough business case (in Danish).
Identificering af forudsætninger (identification of prerequisites):
- BC EM Dignio
- Mellemregninger (intermediate calculations)
- Dokumentation og datagrundlag (documentation and data basis)
- Løntabel - faggruppe (salary table by professional group)
- Årsløn med ulempetillæg (annual salary including inconvenience supplements)
Manual til gennemførelse af Velfærdsteknologivurdering (manual for conducting a Welfare Technology Assessment)
Odense FitLight Trainer
The CONNECT participants also carry out the evaluation process in different settings, with or without partners. As an example of a setup with partners, see the example below from Oulu:
OuluHealth Labs provides a unique, integrated health test and development environment – including professionals' feedback – for every phase of the R&D process. OuluHealth Labs services are provided by the region's top organizations, such as Oulu University Hospital, Oulu University of Applied Sciences, and the City of Oulu's Health and Social Care Services. Oulu CityLab is part of OuluHealth Labs. It is a test environment located where the end users are – in customers' and patients' homes and in all social and health care services within the City of Oulu. Product developers get direct professional and customer feedback on their product in a real, everyday social and health care environment.
In Västerås they use the following general approach to evaluating new solutions:
We have been evaluating new technology for seven years now. We try to design the evaluation method individually, case by case.
We generally divide the evaluation into two phases:
- In the first phase, we ask test persons to help us with their expert knowledge (of being elderly or living with a disability) by trying the technology for us. We emphasize that it is all about helping us evaluate, and we continue giving them as much help as usual, or at least being present when they use the technology.
- In the second phase, we replace the normal way of giving care with the new way, through technology. Here we still have some extra resources for support and for replacing technology when it fails, so in that sense it is still a pilot – but now we can say that we actually deliver our care with the technology.
From both of these phases we collect experiences from everyone involved – users, relatives, staff and others. The data is usually qualitative and unstructured, but sometimes – especially when comparing similar solutions – we also use questionnaires and other quantitative data.
Västerås Reporting template